Mar 18 12:10:13 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 18 12:10:13 crc restorecon[4688]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 12:10:13 crc restorecon[4688]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc 
restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:10:13 crc 
restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 
12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:10:13 crc 
restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 12:10:13 crc 
restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 
crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 
12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:10:13 crc 
restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc 
restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 18 12:10:13 crc restorecon[4688]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:13 crc restorecon[4688]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 
crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc 
restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:13 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc 
restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc 
restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc 
restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:10:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:10:14 crc 
restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:10:14 crc restorecon[4688]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:10:14 crc restorecon[4688]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 12:10:14 crc restorecon[4688]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 18 12:10:14 crc kubenswrapper[4975]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 12:10:14 crc kubenswrapper[4975]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 18 12:10:14 crc kubenswrapper[4975]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 12:10:14 crc kubenswrapper[4975]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 18 12:10:14 crc kubenswrapper[4975]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 18 12:10:14 crc kubenswrapper[4975]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.729043 4975 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732565 4975 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732590 4975 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732596 4975 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732601 4975 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732606 4975 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732612 4975 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732617 4975 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732622 4975 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732628 4975 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 
12:10:14.732633 4975 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732637 4975 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732642 4975 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732647 4975 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732652 4975 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732656 4975 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732663 4975 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732669 4975 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732678 4975 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732683 4975 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732689 4975 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732693 4975 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732698 4975 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732703 4975 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732707 4975 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732712 4975 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732716 4975 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732720 4975 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732725 4975 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732729 4975 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732733 4975 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732738 4975 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732742 4975 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732747 4975 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732752 4975 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732756 4975 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732761 4975 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732767 4975 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732773 4975 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732778 4975 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732785 4975 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732790 4975 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732796 4975 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732800 4975 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732805 4975 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732810 4975 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732815 4975 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732820 4975 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732824 4975 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732829 4975 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732833 4975 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732838 4975 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732842 4975 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732847 4975 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732852 4975 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732858 4975 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732879 4975 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732885 4975 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732891 4975 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732896 4975 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732900 4975 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732905 4975 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732909 4975 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732914 4975 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732918 4975 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732923 4975 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732927 4975 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732931 4975 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732936 4975 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732940 4975 feature_gate.go:330] unrecognized feature gate: Example
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732945 4975 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.732949 4975 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735385 4975 flags.go:64] FLAG: --address="0.0.0.0"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735418 4975 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735463 4975 flags.go:64] FLAG: --anonymous-auth="true"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735472 4975 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735480 4975 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735486 4975 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735496 4975 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735503 4975 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735509 4975 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735514 4975 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735522 4975 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735528 4975 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735534 4975 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735540 4975 flags.go:64] FLAG: --cgroup-root=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735545 4975 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735550 4975 flags.go:64] FLAG: --client-ca-file=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735556 4975 flags.go:64] FLAG: --cloud-config=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735561 4975 flags.go:64] FLAG: --cloud-provider=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735566 4975 flags.go:64] FLAG: --cluster-dns="[]"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735575 4975 flags.go:64] FLAG: --cluster-domain=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735580 4975 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735585 4975 flags.go:64] FLAG: --config-dir=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735590 4975 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735597 4975 flags.go:64] FLAG: --container-log-max-files="5"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735604 4975 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735610 4975 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735615 4975 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735621 4975 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735627 4975 flags.go:64] FLAG: --contention-profiling="false"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735633 4975 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735638 4975 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735644 4975 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735649 4975 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735663 4975 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735671 4975 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735677 4975 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735682 4975 flags.go:64] FLAG: --enable-load-reader="false"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735688 4975 flags.go:64] FLAG: --enable-server="true"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735694 4975 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735704 4975 flags.go:64] FLAG: --event-burst="100"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735710 4975 flags.go:64] FLAG: --event-qps="50"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735715 4975 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735720 4975 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735725 4975 flags.go:64] FLAG: --eviction-hard=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735732 4975 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735738 4975 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735744 4975 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735749 4975 flags.go:64] FLAG: --eviction-soft=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735754 4975 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735760 4975 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735765 4975 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735771 4975 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735776 4975 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735781 4975 flags.go:64] FLAG: --fail-swap-on="true"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735786 4975 flags.go:64] FLAG: --feature-gates=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735793 4975 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735798 4975 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735805 4975 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735811 4975 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735816 4975 flags.go:64] FLAG: --healthz-port="10248"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735822 4975 flags.go:64] FLAG: --help="false"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735827 4975 flags.go:64] FLAG: --hostname-override=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735832 4975 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735838 4975 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735843 4975 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735851 4975 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735857 4975 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735881 4975 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735886 4975 flags.go:64] FLAG: --image-service-endpoint=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735891 4975 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735897 4975 flags.go:64] FLAG: --kube-api-burst="100"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735902 4975 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735907 4975 flags.go:64] FLAG: --kube-api-qps="50"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735913 4975 flags.go:64] FLAG: --kube-reserved=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735918 4975 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735923 4975 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735929 4975 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735943 4975 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735949 4975 flags.go:64] FLAG: --lock-file=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735955 4975 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735960 4975 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735966 4975 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735983 4975 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735989 4975 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.735994 4975 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736000 4975 flags.go:64] FLAG: --logging-format="text"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736005 4975 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736011 4975 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736017 4975 flags.go:64] FLAG: --manifest-url=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736022 4975 flags.go:64] FLAG: --manifest-url-header=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736030 4975 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736036 4975 flags.go:64] FLAG: --max-open-files="1000000"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736043 4975 flags.go:64] FLAG: --max-pods="110"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736048 4975 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736053 4975 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736058 4975 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736063 4975 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736070 4975 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736077 4975 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736083 4975 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736099 4975 flags.go:64] FLAG: --node-status-max-images="50"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736104 4975 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736110 4975 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736114 4975 flags.go:64] FLAG: --pod-cidr=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736119 4975 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736131 4975 flags.go:64] FLAG: --pod-manifest-path=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736136 4975 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736141 4975 flags.go:64] FLAG: --pods-per-core="0"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736146 4975 flags.go:64] FLAG: --port="10250"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736152 4975 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736157 4975 flags.go:64] FLAG: --provider-id=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736162 4975 flags.go:64] FLAG: --qos-reserved=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736167 4975 flags.go:64] FLAG: --read-only-port="10255"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736173 4975 flags.go:64] FLAG: --register-node="true"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736178 4975 flags.go:64] FLAG: --register-schedulable="true"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736184 4975 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736195 4975 flags.go:64] FLAG: --registry-burst="10"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736200 4975 flags.go:64] FLAG: --registry-qps="5"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736214 4975 flags.go:64] FLAG: --reserved-cpus=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736219 4975 flags.go:64] FLAG: --reserved-memory=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736226 4975 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736231 4975 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736237 4975 flags.go:64] FLAG: --rotate-certificates="false"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736242 4975 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736247 4975 flags.go:64] FLAG: --runonce="false"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736251 4975 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736256 4975 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736262 4975 flags.go:64] FLAG: --seccomp-default="false"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736267 4975 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736272 4975 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736278 4975 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736284 4975 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736290 4975 flags.go:64] FLAG: --storage-driver-password="root"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736295 4975 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736300 4975 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736304 4975 flags.go:64] FLAG: --storage-driver-user="root"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736309 4975 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736314 4975 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736320 4975 flags.go:64] FLAG: --system-cgroups=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736325 4975 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736333 4975 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736338 4975 flags.go:64] FLAG: --tls-cert-file=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736343 4975 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736351 4975 flags.go:64] FLAG: --tls-min-version=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736356 4975 flags.go:64] FLAG: --tls-private-key-file=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736361 4975 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736366 4975 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736372 4975 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736377 4975 flags.go:64] FLAG: --v="2"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736384 4975 flags.go:64] FLAG: --version="false"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736392 4975 flags.go:64] FLAG: --vmodule=""
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736398 4975 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736403 4975 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736541 4975 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736552 4975 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736558 4975 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736563 4975 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736568 4975 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736573 4975 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736579 4975 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736584 4975 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736588 4975 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736593 4975 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736598 4975 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736603 4975 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736608 4975 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736612 4975 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736617 4975 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736621 4975 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736626 4975 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736633 4975 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736639 4975 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736643 4975 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736648 4975 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736653 4975 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736658 4975 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736662 4975 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736668 4975 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736673 4975 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736678 4975 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736683 4975 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736688 4975 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736695 4975 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736700 4975 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736705 4975 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736710 4975 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736716 4975 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736721 4975 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736725 4975 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736730 4975 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736736 4975 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736741 4975 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736745 4975 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736750 4975 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736755 4975 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736759 4975 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736764 4975 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736768 4975 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736772 4975 feature_gate.go:330] unrecognized feature gate: Example
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736777 4975 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736782 4975 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736786 4975 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736790 4975 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736794 4975 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736798 4975 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736803 4975 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736807 4975 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736811 4975 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736815 4975 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736820 4975 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736824 4975 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736829 4975 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736836 4975 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736841 4975 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736848 4975 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736853 4975 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736858 4975 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736882 4975 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736886 4975 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736890 4975 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736895 4975 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736900 4975 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736905 4975 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.736909 4975 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.736917 4975 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.747678 4975 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.747750 4975 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.747878 4975 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.747891 4975 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.747897 4975 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.747902 4975 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.747925 4975 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.747929 4975 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.747935 4975 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.747939 4975 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.747944 4975 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.747949 4975 feature_gate.go:330] unrecognized
feature gate: AdminNetworkPolicy Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.747955 4975 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.747960 4975 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.747967 4975 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.747972 4975 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.747979 4975 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.747986 4975 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.747996 4975 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748002 4975 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748041 4975 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748047 4975 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748053 4975 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748058 4975 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748063 4975 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748068 4975 feature_gate.go:330] unrecognized feature 
gate: VSphereControlPlaneMachineSet Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748073 4975 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748077 4975 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748081 4975 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748086 4975 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748090 4975 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748095 4975 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748100 4975 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748105 4975 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748109 4975 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748114 4975 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748121 4975 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748126 4975 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748131 4975 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748136 4975 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 
12:10:14.748140 4975 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748145 4975 feature_gate.go:330] unrecognized feature gate: Example Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748149 4975 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748154 4975 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748159 4975 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748163 4975 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748168 4975 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748172 4975 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748177 4975 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748182 4975 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748187 4975 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748193 4975 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748198 4975 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748203 4975 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748207 4975 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748211 4975 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748215 4975 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748219 4975 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748224 4975 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748228 4975 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748232 4975 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748236 4975 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748240 4975 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748244 4975 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748248 4975 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748254 4975 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748259 4975 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748263 4975 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748268 4975 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748273 4975 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748279 4975 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748284 4975 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748292 4975 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.748300 4975 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748479 4975 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748493 4975 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748498 4975 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 
18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748503 4975 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748508 4975 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748512 4975 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748516 4975 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748522 4975 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748533 4975 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748539 4975 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748543 4975 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748547 4975 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748552 4975 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748556 4975 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748562 4975 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748568 4975 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748573 4975 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748579 4975 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748583 4975 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748588 4975 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748593 4975 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748597 4975 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748602 4975 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748607 4975 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748611 4975 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748617 4975 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748623 4975 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748628 4975 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748633 4975 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748638 4975 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748644 4975 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748649 4975 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748654 4975 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748659 4975 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748663 4975 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748668 4975 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748672 4975 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748676 4975 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748681 4975 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748686 4975 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 12:10:14 crc kubenswrapper[4975]: 
W0318 12:10:14.748691 4975 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748696 4975 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748701 4975 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748705 4975 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748709 4975 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748713 4975 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748718 4975 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748722 4975 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748726 4975 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748730 4975 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748734 4975 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748739 4975 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748744 4975 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748750 4975 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748755 4975 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748759 4975 feature_gate.go:330] unrecognized feature gate: Example Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748765 4975 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748769 4975 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748773 4975 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748778 4975 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748783 4975 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748787 4975 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748793 4975 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748798 4975 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748803 4975 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748807 4975 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748811 4975 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748815 4975 feature_gate.go:330] 
unrecognized feature gate: AlibabaPlatform Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748819 4975 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748824 4975 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.748830 4975 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.748839 4975 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.749187 4975 server.go:940] "Client rotation is on, will bootstrap in background" Mar 18 12:10:14 crc kubenswrapper[4975]: E0318 12:10:14.754941 4975 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.758734 4975 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.758910 4975 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.760556 4975 server.go:997] "Starting client certificate rotation" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.760590 4975 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.760811 4975 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.798603 4975 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 12:10:14 crc kubenswrapper[4975]: E0318 12:10:14.801534 4975 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.9:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.802027 4975 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.863590 4975 log.go:25] "Validated CRI v1 runtime API" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.903654 4975 log.go:25] "Validated CRI v1 image API" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.905649 4975 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.909562 4975 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-18-12-04-33-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.909596 4975 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.932243 4975 manager.go:217] Machine: {Timestamp:2026-03-18 12:10:14.929594901 +0000 UTC m=+0.643995520 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:885d92fc-8d43-4f95-a548-3a5e1645d68d BootID:7281f64d-d9d9-472a-a299-3ee193dcc38d Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:61:14:74 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:61:14:74 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:ed:39:d9 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:fc:cd:cc Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:5b:8f:dc Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:68:7a:51 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:7e:e7:29:42:3b:25 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ee:6e:cf:e5:7a:d4 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.932545 4975 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.932733 4975 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.933141 4975 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.933379 4975 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.933421 4975 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.933628 4975 topology_manager.go:138] "Creating topology manager with none policy" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.933640 4975 container_manager_linux.go:303] "Creating device plugin manager" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.934264 4975 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.934300 4975 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.934526 4975 state_mem.go:36] "Initialized new in-memory state store" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.934614 4975 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.938331 4975 kubelet.go:418] "Attempting to sync node with API server" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.938379 4975 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.938414 4975 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.938430 4975 kubelet.go:324] "Adding apiserver pod source" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.938445 4975 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 
12:10:14.942613 4975 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.944053 4975 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.944415 4975 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.9:6443: connect: connection refused Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.944497 4975 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.9:6443: connect: connection refused Mar 18 12:10:14 crc kubenswrapper[4975]: E0318 12:10:14.944593 4975 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.9:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:10:14 crc kubenswrapper[4975]: E0318 12:10:14.944637 4975 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.9:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.946366 4975 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 18 12:10:14 crc 
kubenswrapper[4975]: I0318 12:10:14.950240 4975 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.950306 4975 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.950327 4975 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.950345 4975 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.950373 4975 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.950390 4975 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.950408 4975 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.950435 4975 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.950454 4975 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.950472 4975 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.950495 4975 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.950512 4975 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.951432 4975 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.952337 4975 server.go:1280] "Started kubelet" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 
12:10:14.952493 4975 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.953680 4975 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.954407 4975 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.954676 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.9:6443: connect: connection refused Mar 18 12:10:14 crc systemd[1]: Started Kubernetes Kubelet. Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.956199 4975 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.956262 4975 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.956519 4975 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.956533 4975 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 18 12:10:14 crc kubenswrapper[4975]: E0318 12:10:14.956564 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.956792 4975 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 18 12:10:14 crc kubenswrapper[4975]: E0318 12:10:14.957633 4975 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.9:6443: connect: connection refused" interval="200ms" Mar 
18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.957709 4975 server.go:460] "Adding debug handlers to kubelet server" Mar 18 12:10:14 crc kubenswrapper[4975]: W0318 12:10:14.957677 4975 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.9:6443: connect: connection refused Mar 18 12:10:14 crc kubenswrapper[4975]: E0318 12:10:14.957801 4975 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.9:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:10:14 crc kubenswrapper[4975]: E0318 12:10:14.962331 4975 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.9:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189dee4688189909 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:14.952278281 +0000 UTC m=+0.666678900,LastTimestamp:2026-03-18 12:10:14.952278281 +0000 UTC m=+0.666678900,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.966647 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" 
seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.966829 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.966945 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.967011 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.967078 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.967141 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.967218 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: 
I0318 12:10:14.967279 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.967344 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.967400 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.967478 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.967602 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.967682 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.967743 4975 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.967806 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.967960 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.968073 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.968165 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.968255 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.968333 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.968408 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.968484 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.968557 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.968617 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.968672 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.968725 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.968785 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.968848 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.968924 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.968982 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.969039 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.969094 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" 
seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.969149 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.969215 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.969278 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.969337 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.969398 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.969470 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: 
I0318 12:10:14.969551 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.969635 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.969713 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.969776 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.969837 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.969918 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.969980 4975 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.970035 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.970125 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.970221 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.970300 4975 factory.go:153] Registering CRI-O factory Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.970351 4975 factory.go:221] Registration of the crio container factory successfully Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.970301 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.970489 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.970573 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.970650 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.970734 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.970804 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.970885 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.970960 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.971032 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.971118 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.971196 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.971282 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.971358 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.971436 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.971516 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.971627 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.971701 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.971764 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.971834 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.972022 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.972103 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.972162 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.972218 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.972272 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.972325 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.972380 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" 
seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.972455 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.970447 4975 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.972566 4975 factory.go:55] Registering systemd factory Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.972591 4975 factory.go:221] Registration of the systemd container factory successfully Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.972623 4975 factory.go:103] Registering Raw factory Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.972646 4975 manager.go:1196] Started watching for new ooms in manager Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.972514 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.972739 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.972815 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.972849 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.972914 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.972937 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.972956 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.972976 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.973004 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.973032 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.974245 4975 manager.go:319] Starting recovery of all containers Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975617 4975 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975650 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975666 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975678 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 18 12:10:14 crc 
kubenswrapper[4975]: I0318 12:10:14.975690 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975701 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975713 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975726 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975737 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975747 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975759 4975 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975769 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975780 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975791 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975801 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975811 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975821 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975831 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975842 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975853 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975895 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975911 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975923 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975935 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975946 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975967 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975979 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.975991 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976001 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" 
seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976012 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976023 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976033 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976043 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976053 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976063 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976072 4975 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976082 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976092 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976101 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976110 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976120 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976129 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976138 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976148 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976159 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976169 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976179 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976190 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976200 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976209 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976220 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976230 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976239 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976250 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976260 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976272 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976284 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976335 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976346 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976357 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976366 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976377 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976390 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976404 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976417 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976430 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976442 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976456 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976468 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976481 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976492 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976501 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976510 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976519 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976528 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976538 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976547 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976556 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976564 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976574 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976583 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.976591 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977182 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977229 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977248 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977271 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977302 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977331 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977357 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977379 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977402 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977423 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977447 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977470 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977488 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977504 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977520 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977549 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977567 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977585 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977601 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977619 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977636 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977655 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977672 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977689 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977707 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977724 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" 
seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977740 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977755 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977772 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977790 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977808 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.977835 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.986571 4975 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.986627 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.986648 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.986670 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.986688 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.986706 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.986729 4975 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.986750 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.986768 4975 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.986785 4975 reconstruct.go:97] "Volume reconstruction finished" Mar 18 12:10:14 crc kubenswrapper[4975]: I0318 12:10:14.986799 4975 reconciler.go:26] "Reconciler: start to sync state" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.005306 4975 manager.go:324] Recovery completed Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.012007 4975 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.014328 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.015076 4975 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.015153 4975 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.015193 4975 kubelet.go:2335] "Starting kubelet main sync loop" Mar 18 12:10:15 crc kubenswrapper[4975]: E0318 12:10:15.015254 4975 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.016542 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.016591 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.016607 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:15 crc kubenswrapper[4975]: W0318 12:10:15.017349 4975 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.9:6443: connect: connection refused Mar 18 12:10:15 crc kubenswrapper[4975]: E0318 12:10:15.017490 4975 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.9:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.017604 4975 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.017623 4975 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 
18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.017647 4975 state_mem.go:36] "Initialized new in-memory state store" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.035473 4975 policy_none.go:49] "None policy: Start" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.036336 4975 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.036365 4975 state_mem.go:35] "Initializing new in-memory state store" Mar 18 12:10:15 crc kubenswrapper[4975]: E0318 12:10:15.057251 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.089957 4975 manager.go:334] "Starting Device Plugin manager" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.090133 4975 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.090155 4975 server.go:79] "Starting device plugin registration server" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.090633 4975 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.090651 4975 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.091279 4975 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.091384 4975 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.091399 4975 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 18 12:10:15 crc kubenswrapper[4975]: E0318 12:10:15.099081 4975 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node 
info: node \"crc\" not found" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.116267 4975 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.116372 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.117330 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.117368 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.117380 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.117490 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.117755 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.117814 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.118218 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.118244 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.118254 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.118366 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.118499 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.118544 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.118562 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.118581 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.118591 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.118977 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.118997 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.119006 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.119077 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.119248 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.119279 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.119301 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.119313 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.119302 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.120005 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.120043 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.120086 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.120095 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.120111 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.120121 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.120213 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:15 crc kubenswrapper[4975]: 
I0318 12:10:15.120360 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.120402 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.120907 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.120933 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.120940 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.121085 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.121113 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.121337 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.121385 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.121397 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.121638 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.121664 4975 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.121675 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:15 crc kubenswrapper[4975]: E0318 12:10:15.158513 4975 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.9:6443: connect: connection refused" interval="400ms" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.189237 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.189332 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.189354 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.189371 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.189388 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.189426 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.189457 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.189477 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.189516 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.189543 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.189558 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.189591 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.189616 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.189635 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 
12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.189654 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.190917 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.193003 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.193066 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.193080 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.193142 4975 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:10:15 crc kubenswrapper[4975]: E0318 12:10:15.193737 4975 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.9:6443: connect: connection refused" node="crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.291573 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.291631 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.291654 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.291679 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.291699 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.291758 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.291782 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" 
(UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.291802 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.291823 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.291843 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.291848 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.291862 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.291911 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.291930 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.291938 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.291952 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.291964 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.291974 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.291959 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.292074 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.291801 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.292023 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.292049 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.292064 4975 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.292095 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.292140 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.291999 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.292226 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.292237 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.292281 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.394043 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.396050 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.396095 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.396111 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.396143 4975 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:10:15 crc kubenswrapper[4975]: E0318 12:10:15.396529 4975 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.9:6443: connect: connection refused" node="crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.446082 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.450384 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.472815 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.488273 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: W0318 12:10:15.493410 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-20b9b538c0ee7e53955d9e93880f35162eea8ab97cde985d9cdc27b911290a7b WatchSource:0}: Error finding container 20b9b538c0ee7e53955d9e93880f35162eea8ab97cde985d9cdc27b911290a7b: Status 404 returned error can't find the container with id 20b9b538c0ee7e53955d9e93880f35162eea8ab97cde985d9cdc27b911290a7b Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.493932 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 12:10:15 crc kubenswrapper[4975]: W0318 12:10:15.498518 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-3a7b98d367ef8fb6a23bd0387767b9dcae59678c9734e99aa5b8b13b7ea82597 WatchSource:0}: Error finding container 3a7b98d367ef8fb6a23bd0387767b9dcae59678c9734e99aa5b8b13b7ea82597: Status 404 returned error can't find the container with id 3a7b98d367ef8fb6a23bd0387767b9dcae59678c9734e99aa5b8b13b7ea82597 Mar 18 12:10:15 crc kubenswrapper[4975]: W0318 12:10:15.502525 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-e6d60322c62184b0a85f93cfc34f6ac06a975286846e7759a57e390901bbf18e WatchSource:0}: Error finding container e6d60322c62184b0a85f93cfc34f6ac06a975286846e7759a57e390901bbf18e: Status 404 returned error can't find 
the container with id e6d60322c62184b0a85f93cfc34f6ac06a975286846e7759a57e390901bbf18e Mar 18 12:10:15 crc kubenswrapper[4975]: W0318 12:10:15.508856 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-6726499f35f33eaa7701aac264685e2a0b0d641bbf09ea90d40cc109ab4bb6dd WatchSource:0}: Error finding container 6726499f35f33eaa7701aac264685e2a0b0d641bbf09ea90d40cc109ab4bb6dd: Status 404 returned error can't find the container with id 6726499f35f33eaa7701aac264685e2a0b0d641bbf09ea90d40cc109ab4bb6dd Mar 18 12:10:15 crc kubenswrapper[4975]: W0318 12:10:15.510602 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-15320226d952ec0cc4374a9d663ff28dec7f8abbb5f768b0754cc9e9ffac1f3a WatchSource:0}: Error finding container 15320226d952ec0cc4374a9d663ff28dec7f8abbb5f768b0754cc9e9ffac1f3a: Status 404 returned error can't find the container with id 15320226d952ec0cc4374a9d663ff28dec7f8abbb5f768b0754cc9e9ffac1f3a Mar 18 12:10:15 crc kubenswrapper[4975]: E0318 12:10:15.560434 4975 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.9:6443: connect: connection refused" interval="800ms" Mar 18 12:10:15 crc kubenswrapper[4975]: W0318 12:10:15.768133 4975 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.9:6443: connect: connection refused Mar 18 12:10:15 crc kubenswrapper[4975]: E0318 12:10:15.768214 4975 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to 
list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.9:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.797527 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.799068 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.799103 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.799116 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.799139 4975 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:10:15 crc kubenswrapper[4975]: E0318 12:10:15.799432 4975 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.9:6443: connect: connection refused" node="crc" Mar 18 12:10:15 crc kubenswrapper[4975]: W0318 12:10:15.925004 4975 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.9:6443: connect: connection refused Mar 18 12:10:15 crc kubenswrapper[4975]: E0318 12:10:15.925098 4975 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.9:6443: connect: connection refused" 
logger="UnhandledError" Mar 18 12:10:15 crc kubenswrapper[4975]: I0318 12:10:15.955377 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.9:6443: connect: connection refused Mar 18 12:10:16 crc kubenswrapper[4975]: I0318 12:10:16.019916 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"15320226d952ec0cc4374a9d663ff28dec7f8abbb5f768b0754cc9e9ffac1f3a"} Mar 18 12:10:16 crc kubenswrapper[4975]: I0318 12:10:16.021435 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6726499f35f33eaa7701aac264685e2a0b0d641bbf09ea90d40cc109ab4bb6dd"} Mar 18 12:10:16 crc kubenswrapper[4975]: I0318 12:10:16.022552 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e6d60322c62184b0a85f93cfc34f6ac06a975286846e7759a57e390901bbf18e"} Mar 18 12:10:16 crc kubenswrapper[4975]: I0318 12:10:16.023777 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3a7b98d367ef8fb6a23bd0387767b9dcae59678c9734e99aa5b8b13b7ea82597"} Mar 18 12:10:16 crc kubenswrapper[4975]: I0318 12:10:16.024715 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"20b9b538c0ee7e53955d9e93880f35162eea8ab97cde985d9cdc27b911290a7b"} Mar 18 12:10:16 crc kubenswrapper[4975]: W0318 12:10:16.237747 4975 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.9:6443: connect: connection refused Mar 18 12:10:16 crc kubenswrapper[4975]: E0318 12:10:16.237819 4975 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.9:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:10:16 crc kubenswrapper[4975]: E0318 12:10:16.360958 4975 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.9:6443: connect: connection refused" interval="1.6s" Mar 18 12:10:16 crc kubenswrapper[4975]: W0318 12:10:16.445346 4975 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.9:6443: connect: connection refused Mar 18 12:10:16 crc kubenswrapper[4975]: E0318 12:10:16.445482 4975 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.9:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:10:16 crc kubenswrapper[4975]: I0318 12:10:16.600406 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:16 crc kubenswrapper[4975]: I0318 12:10:16.602294 4975 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:16 crc kubenswrapper[4975]: I0318 12:10:16.602336 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:16 crc kubenswrapper[4975]: I0318 12:10:16.602346 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:16 crc kubenswrapper[4975]: I0318 12:10:16.602378 4975 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:10:16 crc kubenswrapper[4975]: E0318 12:10:16.602962 4975 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.9:6443: connect: connection refused" node="crc" Mar 18 12:10:16 crc kubenswrapper[4975]: I0318 12:10:16.904768 4975 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 12:10:16 crc kubenswrapper[4975]: E0318 12:10:16.906143 4975 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.9:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:10:16 crc kubenswrapper[4975]: I0318 12:10:16.955795 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.9:6443: connect: connection refused Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.029302 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fffbb26bf332dce194e9fe134fc0e9eccf21c6c833e26e4c7aaaed7d211ae0cc"} Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.029348 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"00a28602986a572f19d3fe21e7033ff58efe255822d5bfd421c86004edb5693d"} Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.029359 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dd74d0ae78901f57f45ddede99b5b55165d3d2a7103eef8371891521fd6b3c46"} Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.029369 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"798454b532a3067c649acc0c06fb5228660cfcfee74c2a47182434203623128a"} Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.029398 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.030130 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.030157 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.030166 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.030827 4975 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" 
containerID="da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df" exitCode=0 Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.030893 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.030895 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df"} Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.031589 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.031618 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.031629 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.058421 4975 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e" exitCode=0 Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.058536 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.058539 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e"} Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.059384 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.059412 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.059423 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.059990 4975 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c" exitCode=0 Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.060049 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c"} Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.060092 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.060581 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.061304 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.061329 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.061337 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.061498 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.061525 4975 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.061537 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.064106 4975 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163" exitCode=0 Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.064140 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163"} Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.064182 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.065009 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.065029 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.065039 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:17 crc kubenswrapper[4975]: E0318 12:10:17.091807 4975 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.9:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189dee4688189909 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:14.952278281 +0000 UTC m=+0.666678900,LastTimestamp:2026-03-18 12:10:14.952278281 +0000 UTC m=+0.666678900,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.455596 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:17 crc kubenswrapper[4975]: I0318 12:10:17.955536 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.9:6443: connect: connection refused Mar 18 12:10:18 crc kubenswrapper[4975]: E0318 12:10:18.004820 4975 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.9:6443: connect: connection refused" interval="3.2s" Mar 18 12:10:18 crc kubenswrapper[4975]: W0318 12:10:18.035685 4975 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.9:6443: connect: connection refused Mar 18 12:10:18 crc kubenswrapper[4975]: E0318 12:10:18.035768 4975 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.9:6443: connect: 
connection refused" logger="UnhandledError" Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.074083 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d1660cc319374fc5a436bc3744321fc3053f5325f451ac74d27bd35a3f1e2e10"} Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.074589 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6770c00fe0f8e7a8a33f0907f2d5a44805722dae64edc5a03193fd785cb1bb42"} Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.074621 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e60b84ddfc2cd2530a7ff0385ea9e95dfadb45a06ba9d98bf350f123b283d26a"} Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.074686 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.075693 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.075921 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.075934 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.078503 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482"} Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.078564 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4"} Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.078579 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a"} Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.078590 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6"} Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.081027 4975 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214" exitCode=0 Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.081118 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214"} Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.081138 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.081942 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:18 crc 
kubenswrapper[4975]: I0318 12:10:18.081982 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.081994 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.083731 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.084304 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.084375 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba"} Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.084634 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.084659 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.084670 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.084747 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.084763 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.084772 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 18 12:10:18 crc kubenswrapper[4975]: W0318 12:10:18.105737 4975 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.9:6443: connect: connection refused Mar 18 12:10:18 crc kubenswrapper[4975]: E0318 12:10:18.105805 4975 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.9:6443: connect: connection refused" logger="UnhandledError" Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.203840 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.204953 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.204992 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.205003 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:18 crc kubenswrapper[4975]: I0318 12:10:18.205028 4975 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:10:18 crc kubenswrapper[4975]: E0318 12:10:18.205447 4975 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.9:6443: connect: connection refused" node="crc" Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 12:10:19.090242 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"30dda0539eb29f8cab6f1f873631baa420466c4d55dc90132b49178d2134ef3a"} Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 12:10:19.090322 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 12:10:19.091102 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 12:10:19.091133 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 12:10:19.091143 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 12:10:19.092857 4975 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7" exitCode=0 Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 12:10:19.092948 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7"} Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 12:10:19.092960 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 12:10:19.093621 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 12:10:19.093645 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 12:10:19.093655 
4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 12:10:19.093753 4975 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 12:10:19.093784 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 12:10:19.093793 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 12:10:19.093806 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 12:10:19.095107 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 12:10:19.095127 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 12:10:19.095153 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 12:10:19.095167 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 12:10:19.095153 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 12:10:19.095135 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 12:10:19.095213 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 
12:10:19.095228 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 12:10:19.095237 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 12:10:19.107299 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 12:10:19.115606 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:19 crc kubenswrapper[4975]: I0318 12:10:19.814990 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:20 crc kubenswrapper[4975]: I0318 12:10:20.098770 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93"} Mar 18 12:10:20 crc kubenswrapper[4975]: I0318 12:10:20.098853 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487"} Mar 18 12:10:20 crc kubenswrapper[4975]: I0318 12:10:20.098909 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:20 crc kubenswrapper[4975]: I0318 12:10:20.098947 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0"} Mar 18 12:10:20 crc kubenswrapper[4975]: I0318 
12:10:20.098965 4975 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 12:10:20 crc kubenswrapper[4975]: I0318 12:10:20.099024 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:20 crc kubenswrapper[4975]: I0318 12:10:20.098972 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16"} Mar 18 12:10:20 crc kubenswrapper[4975]: I0318 12:10:20.100154 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:20 crc kubenswrapper[4975]: I0318 12:10:20.100208 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:20 crc kubenswrapper[4975]: I0318 12:10:20.100213 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:20 crc kubenswrapper[4975]: I0318 12:10:20.100253 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:20 crc kubenswrapper[4975]: I0318 12:10:20.100276 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:20 crc kubenswrapper[4975]: I0318 12:10:20.100223 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:20 crc kubenswrapper[4975]: I0318 12:10:20.111069 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:21 crc kubenswrapper[4975]: I0318 12:10:21.004385 4975 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 12:10:21 crc kubenswrapper[4975]: I0318 12:10:21.105830 
4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018"} Mar 18 12:10:21 crc kubenswrapper[4975]: I0318 12:10:21.105912 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:21 crc kubenswrapper[4975]: I0318 12:10:21.105975 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:21 crc kubenswrapper[4975]: I0318 12:10:21.106031 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:21 crc kubenswrapper[4975]: I0318 12:10:21.106909 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:21 crc kubenswrapper[4975]: I0318 12:10:21.106937 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:21 crc kubenswrapper[4975]: I0318 12:10:21.106951 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:21 crc kubenswrapper[4975]: I0318 12:10:21.106961 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:21 crc kubenswrapper[4975]: I0318 12:10:21.106965 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:21 crc kubenswrapper[4975]: I0318 12:10:21.106978 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:21 crc kubenswrapper[4975]: I0318 12:10:21.107829 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:21 crc kubenswrapper[4975]: I0318 12:10:21.107887 
4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:21 crc kubenswrapper[4975]: I0318 12:10:21.107902 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:21 crc kubenswrapper[4975]: I0318 12:10:21.110442 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:21 crc kubenswrapper[4975]: I0318 12:10:21.405902 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:21 crc kubenswrapper[4975]: I0318 12:10:21.406930 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:21 crc kubenswrapper[4975]: I0318 12:10:21.406998 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:21 crc kubenswrapper[4975]: I0318 12:10:21.407007 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:21 crc kubenswrapper[4975]: I0318 12:10:21.407036 4975 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:10:22 crc kubenswrapper[4975]: I0318 12:10:22.108014 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:22 crc kubenswrapper[4975]: I0318 12:10:22.108066 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:22 crc kubenswrapper[4975]: I0318 12:10:22.108082 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:22 crc kubenswrapper[4975]: I0318 12:10:22.109181 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:22 crc 
kubenswrapper[4975]: I0318 12:10:22.109210 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:22 crc kubenswrapper[4975]: I0318 12:10:22.109218 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:22 crc kubenswrapper[4975]: I0318 12:10:22.109167 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:22 crc kubenswrapper[4975]: I0318 12:10:22.109301 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:22 crc kubenswrapper[4975]: I0318 12:10:22.109313 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:22 crc kubenswrapper[4975]: I0318 12:10:22.109368 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:22 crc kubenswrapper[4975]: I0318 12:10:22.109391 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:22 crc kubenswrapper[4975]: I0318 12:10:22.109403 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:22 crc kubenswrapper[4975]: I0318 12:10:22.137069 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 18 12:10:23 crc kubenswrapper[4975]: I0318 12:10:23.110182 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:23 crc kubenswrapper[4975]: I0318 12:10:23.111376 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:23 crc kubenswrapper[4975]: I0318 12:10:23.111426 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 12:10:23 crc kubenswrapper[4975]: I0318 12:10:23.111441 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:23 crc kubenswrapper[4975]: I0318 12:10:23.359705 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:23 crc kubenswrapper[4975]: I0318 12:10:23.359787 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 12:10:23 crc kubenswrapper[4975]: I0318 12:10:23.359938 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:23 crc kubenswrapper[4975]: I0318 12:10:23.360115 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:23 crc kubenswrapper[4975]: I0318 12:10:23.361245 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:23 crc kubenswrapper[4975]: I0318 12:10:23.361270 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:23 crc kubenswrapper[4975]: I0318 12:10:23.361279 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:23 crc kubenswrapper[4975]: I0318 12:10:23.361680 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:23 crc kubenswrapper[4975]: I0318 12:10:23.361717 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:23 crc kubenswrapper[4975]: I0318 12:10:23.361728 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:24 crc kubenswrapper[4975]: I0318 12:10:24.911147 
4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:24 crc kubenswrapper[4975]: I0318 12:10:24.911331 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:24 crc kubenswrapper[4975]: I0318 12:10:24.912459 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:24 crc kubenswrapper[4975]: I0318 12:10:24.912516 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:24 crc kubenswrapper[4975]: I0318 12:10:24.912529 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:25 crc kubenswrapper[4975]: E0318 12:10:25.099216 4975 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 12:10:27 crc kubenswrapper[4975]: I0318 12:10:27.459041 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:27 crc kubenswrapper[4975]: I0318 12:10:27.459180 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:27 crc kubenswrapper[4975]: I0318 12:10:27.460302 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:27 crc kubenswrapper[4975]: I0318 12:10:27.460362 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:27 crc kubenswrapper[4975]: I0318 12:10:27.460375 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:27 crc kubenswrapper[4975]: I0318 12:10:27.911717 4975 patch_prober.go:28] 
interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 12:10:27 crc kubenswrapper[4975]: I0318 12:10:27.911849 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 12:10:28 crc kubenswrapper[4975]: W0318 12:10:28.835913 4975 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 18 12:10:28 crc kubenswrapper[4975]: I0318 12:10:28.836062 4975 trace.go:236] Trace[613171415]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Mar-2026 12:10:18.834) (total time: 10001ms): Mar 18 12:10:28 crc kubenswrapper[4975]: Trace[613171415]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (12:10:28.835) Mar 18 12:10:28 crc kubenswrapper[4975]: Trace[613171415]: [10.001855602s] [10.001855602s] END Mar 18 12:10:28 crc kubenswrapper[4975]: E0318 12:10:28.836098 4975 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 18 12:10:28 crc kubenswrapper[4975]: 
I0318 12:10:28.957143 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 18 12:10:29 crc kubenswrapper[4975]: W0318 12:10:29.042142 4975 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 18 12:10:29 crc kubenswrapper[4975]: I0318 12:10:29.042244 4975 trace.go:236] Trace[1449600122]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Mar-2026 12:10:19.040) (total time: 10001ms): Mar 18 12:10:29 crc kubenswrapper[4975]: Trace[1449600122]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (12:10:29.042) Mar 18 12:10:29 crc kubenswrapper[4975]: Trace[1449600122]: [10.001548264s] [10.001548264s] END Mar 18 12:10:29 crc kubenswrapper[4975]: E0318 12:10:29.042270 4975 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 18 12:10:29 crc kubenswrapper[4975]: I0318 12:10:29.127184 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 12:10:29 crc kubenswrapper[4975]: I0318 12:10:29.129051 4975 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="30dda0539eb29f8cab6f1f873631baa420466c4d55dc90132b49178d2134ef3a" exitCode=255 Mar 18 12:10:29 crc 
kubenswrapper[4975]: I0318 12:10:29.129098 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"30dda0539eb29f8cab6f1f873631baa420466c4d55dc90132b49178d2134ef3a"} Mar 18 12:10:29 crc kubenswrapper[4975]: I0318 12:10:29.129270 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:29 crc kubenswrapper[4975]: I0318 12:10:29.130197 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:29 crc kubenswrapper[4975]: I0318 12:10:29.130246 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:29 crc kubenswrapper[4975]: I0318 12:10:29.130263 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:29 crc kubenswrapper[4975]: I0318 12:10:29.130825 4975 scope.go:117] "RemoveContainer" containerID="30dda0539eb29f8cab6f1f873631baa420466c4d55dc90132b49178d2134ef3a" Mar 18 12:10:29 crc kubenswrapper[4975]: E0318 12:10:29.269939 4975 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:10:29 crc kubenswrapper[4975]: E0318 12:10:29.270512 4975 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-18T12:10:29Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 18 12:10:29 crc kubenswrapper[4975]: E0318 12:10:29.273307 4975 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:29Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 12:10:29 crc kubenswrapper[4975]: W0318 12:10:29.273413 4975 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:29Z is after 2026-02-23T05:33:13Z Mar 18 12:10:29 crc kubenswrapper[4975]: E0318 12:10:29.273500 4975 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:10:29 crc kubenswrapper[4975]: W0318 12:10:29.275663 4975 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:29Z is after 2026-02-23T05:33:13Z Mar 18 12:10:29 crc kubenswrapper[4975]: E0318 12:10:29.275700 4975 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:10:29 crc kubenswrapper[4975]: E0318 12:10:29.278490 4975 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:29Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189dee4688189909 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:14.952278281 +0000 UTC m=+0.666678900,LastTimestamp:2026-03-18 12:10:14.952278281 +0000 UTC m=+0.666678900,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:29 crc kubenswrapper[4975]: I0318 12:10:29.282152 4975 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 12:10:29 crc kubenswrapper[4975]: I0318 12:10:29.282222 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 18 12:10:29 crc kubenswrapper[4975]: I0318 12:10:29.289980 4975 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 12:10:29 crc kubenswrapper[4975]: I0318 12:10:29.290043 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 18 12:10:29 crc kubenswrapper[4975]: I0318 12:10:29.820394 4975 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]log ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]etcd ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/generic-apiserver-start-informers ok Mar 18 12:10:29 crc kubenswrapper[4975]: 
[+]poststarthook/priority-and-fairness-config-consumer ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/priority-and-fairness-filter ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/start-apiextensions-informers ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/start-apiextensions-controllers ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/crd-informer-synced ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/start-system-namespaces-controller ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 18 12:10:29 crc kubenswrapper[4975]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 18 12:10:29 crc kubenswrapper[4975]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/bootstrap-controller ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/start-kube-aggregator-informers ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 18 12:10:29 crc kubenswrapper[4975]: 
[+]poststarthook/apiservice-registration-controller ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/apiservice-discovery-controller ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]autoregister-completion ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/apiservice-openapi-controller ok Mar 18 12:10:29 crc kubenswrapper[4975]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 18 12:10:29 crc kubenswrapper[4975]: livez check failed Mar 18 12:10:29 crc kubenswrapper[4975]: I0318 12:10:29.820462 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:10:29 crc kubenswrapper[4975]: I0318 12:10:29.866037 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 18 12:10:29 crc kubenswrapper[4975]: I0318 12:10:29.866277 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:29 crc kubenswrapper[4975]: I0318 12:10:29.867475 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:29 crc kubenswrapper[4975]: I0318 12:10:29.867534 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:29 crc kubenswrapper[4975]: I0318 12:10:29.867549 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:29 crc kubenswrapper[4975]: I0318 12:10:29.901260 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 18 12:10:29 crc 
kubenswrapper[4975]: I0318 12:10:29.959387 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:29Z is after 2026-02-23T05:33:13Z Mar 18 12:10:30 crc kubenswrapper[4975]: I0318 12:10:30.134489 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 12:10:30 crc kubenswrapper[4975]: I0318 12:10:30.136818 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"36cb5567f62f2481b217d6cc40b709bb6eba1576abee9eef87e1db784c79c14c"} Mar 18 12:10:30 crc kubenswrapper[4975]: I0318 12:10:30.137153 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:30 crc kubenswrapper[4975]: I0318 12:10:30.137295 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:30 crc kubenswrapper[4975]: I0318 12:10:30.138737 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:30 crc kubenswrapper[4975]: I0318 12:10:30.139052 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:30 crc kubenswrapper[4975]: I0318 12:10:30.139099 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:30 crc kubenswrapper[4975]: I0318 12:10:30.138744 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:30 crc kubenswrapper[4975]: 
I0318 12:10:30.139682 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:30 crc kubenswrapper[4975]: I0318 12:10:30.139718 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:30 crc kubenswrapper[4975]: I0318 12:10:30.162505 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 18 12:10:30 crc kubenswrapper[4975]: I0318 12:10:30.961949 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:30Z is after 2026-02-23T05:33:13Z Mar 18 12:10:31 crc kubenswrapper[4975]: I0318 12:10:31.141857 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 12:10:31 crc kubenswrapper[4975]: I0318 12:10:31.142475 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 12:10:31 crc kubenswrapper[4975]: I0318 12:10:31.145074 4975 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="36cb5567f62f2481b217d6cc40b709bb6eba1576abee9eef87e1db784c79c14c" exitCode=255 Mar 18 12:10:31 crc kubenswrapper[4975]: I0318 12:10:31.145162 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"36cb5567f62f2481b217d6cc40b709bb6eba1576abee9eef87e1db784c79c14c"} Mar 18 12:10:31 crc kubenswrapper[4975]: I0318 12:10:31.145275 4975 scope.go:117] 
"RemoveContainer" containerID="30dda0539eb29f8cab6f1f873631baa420466c4d55dc90132b49178d2134ef3a" Mar 18 12:10:31 crc kubenswrapper[4975]: I0318 12:10:31.145330 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:31 crc kubenswrapper[4975]: I0318 12:10:31.145673 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:31 crc kubenswrapper[4975]: I0318 12:10:31.146524 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:31 crc kubenswrapper[4975]: I0318 12:10:31.146565 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:31 crc kubenswrapper[4975]: I0318 12:10:31.146582 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:31 crc kubenswrapper[4975]: I0318 12:10:31.147222 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:31 crc kubenswrapper[4975]: I0318 12:10:31.147279 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:31 crc kubenswrapper[4975]: I0318 12:10:31.147305 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:31 crc kubenswrapper[4975]: I0318 12:10:31.147985 4975 scope.go:117] "RemoveContainer" containerID="36cb5567f62f2481b217d6cc40b709bb6eba1576abee9eef87e1db784c79c14c" Mar 18 12:10:31 crc kubenswrapper[4975]: E0318 12:10:31.148293 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:10:31 crc kubenswrapper[4975]: I0318 12:10:31.958405 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:31Z is after 2026-02-23T05:33:13Z Mar 18 12:10:32 crc kubenswrapper[4975]: I0318 12:10:32.149346 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 12:10:32 crc kubenswrapper[4975]: I0318 12:10:32.961319 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:32Z is after 2026-02-23T05:33:13Z Mar 18 12:10:33 crc kubenswrapper[4975]: W0318 12:10:33.153565 4975 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:33Z is after 2026-02-23T05:33:13Z Mar 18 12:10:33 crc kubenswrapper[4975]: E0318 12:10:33.153716 4975 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:10:33Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:10:33 crc kubenswrapper[4975]: W0318 12:10:33.192675 4975 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:33Z is after 2026-02-23T05:33:13Z Mar 18 12:10:33 crc kubenswrapper[4975]: E0318 12:10:33.192830 4975 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:33Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:10:33 crc kubenswrapper[4975]: I0318 12:10:33.628017 4975 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:33 crc kubenswrapper[4975]: I0318 12:10:33.628248 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:33 crc kubenswrapper[4975]: I0318 12:10:33.629837 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:33 crc kubenswrapper[4975]: I0318 12:10:33.630016 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:33 crc kubenswrapper[4975]: I0318 12:10:33.630034 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:33 crc kubenswrapper[4975]: I0318 12:10:33.630852 4975 scope.go:117] 
"RemoveContainer" containerID="36cb5567f62f2481b217d6cc40b709bb6eba1576abee9eef87e1db784c79c14c" Mar 18 12:10:33 crc kubenswrapper[4975]: E0318 12:10:33.631200 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:10:33 crc kubenswrapper[4975]: I0318 12:10:33.960795 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:33Z is after 2026-02-23T05:33:13Z Mar 18 12:10:34 crc kubenswrapper[4975]: I0318 12:10:34.822655 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:34 crc kubenswrapper[4975]: I0318 12:10:34.822847 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:34 crc kubenswrapper[4975]: I0318 12:10:34.826505 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:34 crc kubenswrapper[4975]: I0318 12:10:34.827199 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:34 crc kubenswrapper[4975]: I0318 12:10:34.827251 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:34 crc kubenswrapper[4975]: I0318 12:10:34.828092 4975 scope.go:117] "RemoveContainer" 
containerID="36cb5567f62f2481b217d6cc40b709bb6eba1576abee9eef87e1db784c79c14c" Mar 18 12:10:34 crc kubenswrapper[4975]: E0318 12:10:34.828395 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:10:34 crc kubenswrapper[4975]: I0318 12:10:34.831106 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:34 crc kubenswrapper[4975]: I0318 12:10:34.960899 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:34Z is after 2026-02-23T05:33:13Z Mar 18 12:10:35 crc kubenswrapper[4975]: E0318 12:10:35.099356 4975 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 12:10:35 crc kubenswrapper[4975]: I0318 12:10:35.159856 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:35 crc kubenswrapper[4975]: I0318 12:10:35.161686 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:35 crc kubenswrapper[4975]: I0318 12:10:35.161731 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:35 crc kubenswrapper[4975]: I0318 12:10:35.161744 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 18 12:10:35 crc kubenswrapper[4975]: I0318 12:10:35.162436 4975 scope.go:117] "RemoveContainer" containerID="36cb5567f62f2481b217d6cc40b709bb6eba1576abee9eef87e1db784c79c14c" Mar 18 12:10:35 crc kubenswrapper[4975]: E0318 12:10:35.162633 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:10:35 crc kubenswrapper[4975]: I0318 12:10:35.674193 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:35 crc kubenswrapper[4975]: I0318 12:10:35.675366 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:35 crc kubenswrapper[4975]: I0318 12:10:35.675402 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:35 crc kubenswrapper[4975]: I0318 12:10:35.675414 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:35 crc kubenswrapper[4975]: I0318 12:10:35.675439 4975 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:10:35 crc kubenswrapper[4975]: E0318 12:10:35.677272 4975 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:35Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 12:10:35 crc kubenswrapper[4975]: E0318 12:10:35.678719 4975 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:35Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 12:10:35 crc kubenswrapper[4975]: I0318 12:10:35.958791 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:35Z is after 2026-02-23T05:33:13Z Mar 18 12:10:36 crc kubenswrapper[4975]: I0318 12:10:36.960722 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:36Z is after 2026-02-23T05:33:13Z Mar 18 12:10:37 crc kubenswrapper[4975]: W0318 12:10:37.726293 4975 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:37Z is after 2026-02-23T05:33:13Z Mar 18 12:10:37 crc kubenswrapper[4975]: E0318 12:10:37.726373 4975 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:37Z is after 
2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:10:37 crc kubenswrapper[4975]: I0318 12:10:37.911849 4975 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 12:10:37 crc kubenswrapper[4975]: I0318 12:10:37.911929 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 12:10:37 crc kubenswrapper[4975]: I0318 12:10:37.912032 4975 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 12:10:37 crc kubenswrapper[4975]: E0318 12:10:37.915766 4975 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:10:37 crc kubenswrapper[4975]: I0318 12:10:37.959931 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:37Z is 
after 2026-02-23T05:33:13Z Mar 18 12:10:38 crc kubenswrapper[4975]: I0318 12:10:38.958511 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:38Z is after 2026-02-23T05:33:13Z Mar 18 12:10:39 crc kubenswrapper[4975]: E0318 12:10:39.282709 4975 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:39Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189dee4688189909 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:14.952278281 +0000 UTC m=+0.666678900,LastTimestamp:2026-03-18 12:10:14.952278281 +0000 UTC m=+0.666678900,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:39 crc kubenswrapper[4975]: I0318 12:10:39.960009 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:39Z is after 2026-02-23T05:33:13Z Mar 18 12:10:40 crc kubenswrapper[4975]: I0318 12:10:40.111815 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:40 crc 
kubenswrapper[4975]: I0318 12:10:40.112067 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:40 crc kubenswrapper[4975]: I0318 12:10:40.113409 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:40 crc kubenswrapper[4975]: I0318 12:10:40.113450 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:40 crc kubenswrapper[4975]: I0318 12:10:40.113462 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:40 crc kubenswrapper[4975]: I0318 12:10:40.114008 4975 scope.go:117] "RemoveContainer" containerID="36cb5567f62f2481b217d6cc40b709bb6eba1576abee9eef87e1db784c79c14c" Mar 18 12:10:40 crc kubenswrapper[4975]: E0318 12:10:40.114171 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:10:40 crc kubenswrapper[4975]: W0318 12:10:40.566740 4975 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:40Z is after 2026-02-23T05:33:13Z Mar 18 12:10:40 crc kubenswrapper[4975]: E0318 12:10:40.566805 4975 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:10:40 crc kubenswrapper[4975]: W0318 12:10:40.910193 4975 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:40Z is after 2026-02-23T05:33:13Z Mar 18 12:10:40 crc kubenswrapper[4975]: E0318 12:10:40.910687 4975 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:10:40 crc kubenswrapper[4975]: I0318 12:10:40.960722 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:40Z is after 2026-02-23T05:33:13Z Mar 18 12:10:41 crc kubenswrapper[4975]: I0318 12:10:41.959630 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:41Z is after 2026-02-23T05:33:13Z Mar 18 
12:10:42 crc kubenswrapper[4975]: I0318 12:10:42.678792 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:42 crc kubenswrapper[4975]: I0318 12:10:42.680336 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:42 crc kubenswrapper[4975]: I0318 12:10:42.680380 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:42 crc kubenswrapper[4975]: I0318 12:10:42.680392 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:42 crc kubenswrapper[4975]: I0318 12:10:42.680415 4975 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:10:42 crc kubenswrapper[4975]: E0318 12:10:42.681024 4975 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:42Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 12:10:42 crc kubenswrapper[4975]: E0318 12:10:42.683506 4975 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:42Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 12:10:42 crc kubenswrapper[4975]: I0318 12:10:42.958220 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:42Z is after 2026-02-23T05:33:13Z Mar 18 
12:10:43 crc kubenswrapper[4975]: I0318 12:10:43.959434 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:43Z is after 2026-02-23T05:33:13Z Mar 18 12:10:44 crc kubenswrapper[4975]: I0318 12:10:44.958369 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:44Z is after 2026-02-23T05:33:13Z Mar 18 12:10:45 crc kubenswrapper[4975]: E0318 12:10:45.099583 4975 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 12:10:45 crc kubenswrapper[4975]: W0318 12:10:45.283572 4975 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:45Z is after 2026-02-23T05:33:13Z Mar 18 12:10:45 crc kubenswrapper[4975]: E0318 12:10:45.283641 4975 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:10:45Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 12:10:45 crc kubenswrapper[4975]: I0318 12:10:45.965071 4975 csi_plugin.go:884] Failed to 
contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:46 crc kubenswrapper[4975]: I0318 12:10:46.960665 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:47 crc kubenswrapper[4975]: I0318 12:10:47.616905 4975 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:34780->192.168.126.11:10357: read: connection reset by peer" start-of-body= Mar 18 12:10:47 crc kubenswrapper[4975]: I0318 12:10:47.616967 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:34780->192.168.126.11:10357: read: connection reset by peer" Mar 18 12:10:47 crc kubenswrapper[4975]: I0318 12:10:47.617016 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:47 crc kubenswrapper[4975]: I0318 12:10:47.617145 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:47 crc kubenswrapper[4975]: I0318 12:10:47.618489 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:47 crc kubenswrapper[4975]: I0318 12:10:47.618563 4975 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:47 crc kubenswrapper[4975]: I0318 12:10:47.618581 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:47 crc kubenswrapper[4975]: I0318 12:10:47.619447 4975 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"dd74d0ae78901f57f45ddede99b5b55165d3d2a7103eef8371891521fd6b3c46"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 18 12:10:47 crc kubenswrapper[4975]: I0318 12:10:47.619817 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://dd74d0ae78901f57f45ddede99b5b55165d3d2a7103eef8371891521fd6b3c46" gracePeriod=30 Mar 18 12:10:47 crc kubenswrapper[4975]: I0318 12:10:47.958840 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:48 crc kubenswrapper[4975]: I0318 12:10:48.198678 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 12:10:48 crc kubenswrapper[4975]: I0318 12:10:48.199333 4975 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="dd74d0ae78901f57f45ddede99b5b55165d3d2a7103eef8371891521fd6b3c46" exitCode=255 Mar 18 12:10:48 crc kubenswrapper[4975]: I0318 12:10:48.199412 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"dd74d0ae78901f57f45ddede99b5b55165d3d2a7103eef8371891521fd6b3c46"} Mar 18 12:10:48 crc kubenswrapper[4975]: I0318 12:10:48.958993 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:49 crc kubenswrapper[4975]: I0318 12:10:49.203800 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 12:10:49 crc kubenswrapper[4975]: I0318 12:10:49.204247 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c6baed74837a87ab6340d3036ba532bab8aabfe3c751e3af2a483308255eb1e0"} Mar 18 12:10:49 crc kubenswrapper[4975]: I0318 12:10:49.204307 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:49 crc kubenswrapper[4975]: I0318 12:10:49.205386 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:49 crc kubenswrapper[4975]: I0318 12:10:49.205424 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:49 crc kubenswrapper[4975]: I0318 12:10:49.205437 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.288190 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee4688189909 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:14.952278281 +0000 UTC m=+0.666678900,LastTimestamp:2026-03-18 12:10:14.952278281 +0000 UTC m=+0.666678900,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.293357 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee468bedbf48 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.016578888 +0000 UTC m=+0.730979467,LastTimestamp:2026-03-18 12:10:15.016578888 +0000 UTC m=+0.730979467,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.297656 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee468bee1786 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.016601478 +0000 UTC m=+0.731002067,LastTimestamp:2026-03-18 12:10:15.016601478 +0000 UTC m=+0.731002067,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.302159 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee468bee48bf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.016614079 +0000 UTC m=+0.731014658,LastTimestamp:2026-03-18 12:10:15.016614079 +0000 UTC m=+0.731014658,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.306358 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee469076838b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across 
pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.092650891 +0000 UTC m=+0.807051470,LastTimestamp:2026-03-18 12:10:15.092650891 +0000 UTC m=+0.807051470,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.312177 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee468bedbf48\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee468bedbf48 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.016578888 +0000 UTC m=+0.730979467,LastTimestamp:2026-03-18 12:10:15.117355115 +0000 UTC m=+0.831755694,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.316292 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee468bee1786\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee468bee1786 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.016601478 +0000 UTC m=+0.731002067,LastTimestamp:2026-03-18 
12:10:15.117376205 +0000 UTC m=+0.831776784,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.320536 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee468bee48bf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee468bee48bf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.016614079 +0000 UTC m=+0.731014658,LastTimestamp:2026-03-18 12:10:15.117386136 +0000 UTC m=+0.831786715,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.324952 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee468bedbf48\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee468bedbf48 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.016578888 +0000 UTC m=+0.730979467,LastTimestamp:2026-03-18 12:10:15.118236929 +0000 UTC m=+0.832637508,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.328834 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee468bee1786\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee468bee1786 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.016601478 +0000 UTC m=+0.731002067,LastTimestamp:2026-03-18 12:10:15.118250559 +0000 UTC m=+0.832651138,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.332122 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee468bee48bf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee468bee48bf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.016614079 +0000 UTC m=+0.731014658,LastTimestamp:2026-03-18 12:10:15.118260499 +0000 UTC m=+0.832661078,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.335682 4975 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.189dee468bedbf48\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee468bedbf48 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.016578888 +0000 UTC m=+0.730979467,LastTimestamp:2026-03-18 12:10:15.118575538 +0000 UTC m=+0.832976117,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.340053 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee468bee1786\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee468bee1786 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.016601478 +0000 UTC m=+0.731002067,LastTimestamp:2026-03-18 12:10:15.118587688 +0000 UTC m=+0.832988267,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.344853 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee468bee48bf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API 
group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee468bee48bf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.016614079 +0000 UTC m=+0.731014658,LastTimestamp:2026-03-18 12:10:15.118596738 +0000 UTC m=+0.832997317,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.349983 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee468bedbf48\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee468bedbf48 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.016578888 +0000 UTC m=+0.730979467,LastTimestamp:2026-03-18 12:10:15.118987849 +0000 UTC m=+0.833388428,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.354592 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee468bee1786\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee468bee1786 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.016601478 +0000 UTC m=+0.731002067,LastTimestamp:2026-03-18 12:10:15.119002549 +0000 UTC m=+0.833403138,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.359055 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee468bee48bf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee468bee48bf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.016614079 +0000 UTC m=+0.731014658,LastTimestamp:2026-03-18 12:10:15.119010399 +0000 UTC m=+0.833410978,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.363521 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee468bedbf48\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee468bedbf48 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.016578888 +0000 UTC m=+0.730979467,LastTimestamp:2026-03-18 12:10:15.119291567 +0000 UTC m=+0.833692146,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.368006 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee468bee1786\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee468bee1786 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.016601478 +0000 UTC m=+0.731002067,LastTimestamp:2026-03-18 12:10:15.119309187 +0000 UTC m=+0.833709766,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.372751 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee468bee48bf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee468bee48bf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.016614079 +0000 UTC 
m=+0.731014658,LastTimestamp:2026-03-18 12:10:15.119319088 +0000 UTC m=+0.833719667,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.377306 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee468bedbf48\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee468bedbf48 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.016578888 +0000 UTC m=+0.730979467,LastTimestamp:2026-03-18 12:10:15.120027737 +0000 UTC m=+0.834428336,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.381473 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee468bee1786\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee468bee1786 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.016601478 +0000 UTC m=+0.731002067,LastTimestamp:2026-03-18 12:10:15.120052867 +0000 UTC m=+0.834453456,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.385117 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee468bee48bf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee468bee48bf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.016614079 +0000 UTC m=+0.731014658,LastTimestamp:2026-03-18 12:10:15.120095939 +0000 UTC m=+0.834496538,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.389097 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee468bedbf48\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee468bedbf48 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.016578888 +0000 UTC m=+0.730979467,LastTimestamp:2026-03-18 12:10:15.120106409 +0000 UTC m=+0.834506988,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.392507 4975 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189dee468bee1786\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189dee468bee1786 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.016601478 +0000 UTC m=+0.731002067,LastTimestamp:2026-03-18 12:10:15.120117609 +0000 UTC m=+0.834518188,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.397107 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189dee46a8a924a5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.498622117 +0000 UTC m=+1.213022696,LastTimestamp:2026-03-18 12:10:15.498622117 +0000 UTC m=+1.213022696,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.400585 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee46a8cd63ae openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.50099755 +0000 UTC m=+1.215398139,LastTimestamp:2026-03-18 12:10:15.50099755 +0000 UTC m=+1.215398139,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.404558 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee46a8f54eb5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.503613621 +0000 UTC m=+1.218014200,LastTimestamp:2026-03-18 12:10:15.503613621 +0000 UTC m=+1.218014200,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.409647 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee46a99541fa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.514096122 +0000 UTC m=+1.228496701,LastTimestamp:2026-03-18 12:10:15.514096122 +0000 UTC m=+1.228496701,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.414088 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee46a99912c3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:15.514346179 +0000 UTC m=+1.228746748,LastTimestamp:2026-03-18 12:10:15.514346179 +0000 UTC m=+1.228746748,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.418792 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee46cce9d3ea openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:16.106841066 +0000 UTC m=+1.821241645,LastTimestamp:2026-03-18 12:10:16.106841066 +0000 UTC m=+1.821241645,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.422477 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189dee46ccef3b81 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:16.107195265 +0000 UTC m=+1.821595844,LastTimestamp:2026-03-18 12:10:16.107195265 +0000 UTC m=+1.821595844,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.425725 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee46ccfc8f9f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:16.108068767 +0000 UTC m=+1.822469346,LastTimestamp:2026-03-18 12:10:16.108068767 +0000 UTC m=+1.822469346,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.429329 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee46ccfe9cdf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:16.108203231 +0000 UTC m=+1.822603820,LastTimestamp:2026-03-18 12:10:16.108203231 +0000 UTC m=+1.822603820,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.432631 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee46cd06ba6c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:16.108735084 +0000 UTC m=+1.823135663,LastTimestamp:2026-03-18 12:10:16.108735084 +0000 UTC m=+1.823135663,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.435905 4975 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189dee46cd8c40bb openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:16.117485755 +0000 UTC m=+1.831886334,LastTimestamp:2026-03-18 12:10:16.117485755 +0000 UTC m=+1.831886334,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.439390 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee46cd9c157d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:16.118523261 +0000 UTC m=+1.832923840,LastTimestamp:2026-03-18 12:10:16.118523261 +0000 UTC m=+1.832923840,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.442788 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee46cdb4dcda openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:16.120147162 +0000 UTC m=+1.834547751,LastTimestamp:2026-03-18 12:10:16.120147162 +0000 UTC m=+1.834547751,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.447269 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee46cddb1b91 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:16.122653585 +0000 UTC m=+1.837054164,LastTimestamp:2026-03-18 12:10:16.122653585 +0000 UTC m=+1.837054164,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.450511 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee46ce1314e0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:16.126321888 +0000 UTC m=+1.840722467,LastTimestamp:2026-03-18 12:10:16.126321888 +0000 UTC m=+1.840722467,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.454555 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee46ce1c3d4f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container 
wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:16.126922063 +0000 UTC m=+1.841322642,LastTimestamp:2026-03-18 12:10:16.126922063 +0000 UTC m=+1.841322642,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.458030 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee46e2673f3c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:16.467382076 +0000 UTC m=+2.181782655,LastTimestamp:2026-03-18 12:10:16.467382076 +0000 UTC m=+2.181782655,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.461965 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee46e3249837 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:16.479791159 +0000 UTC m=+2.194191748,LastTimestamp:2026-03-18 12:10:16.479791159 +0000 UTC m=+2.194191748,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.465337 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee46e3397d6a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:16.481160554 +0000 UTC m=+2.195561143,LastTimestamp:2026-03-18 12:10:16.481160554 +0000 UTC m=+2.195561143,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.470036 4975 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee46edb274da openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:16.656860378 +0000 UTC m=+2.371260957,LastTimestamp:2026-03-18 12:10:16.656860378 +0000 UTC m=+2.371260957,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.474643 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee46ee62270f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:16.668374799 +0000 UTC m=+2.382775378,LastTimestamp:2026-03-18 12:10:16.668374799 +0000 UTC m=+2.382775378,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.479669 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee46ee718e48 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:16.669384264 +0000 UTC m=+2.383784843,LastTimestamp:2026-03-18 12:10:16.669384264 +0000 UTC m=+2.383784843,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.483447 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee46f79e2b7e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:16.823303038 +0000 UTC m=+2.537703617,LastTimestamp:2026-03-18 12:10:16.823303038 +0000 UTC m=+2.537703617,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.487963 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee46f8ae7eba openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:16.841150138 +0000 UTC m=+2.555550717,LastTimestamp:2026-03-18 12:10:16.841150138 +0000 UTC m=+2.555550717,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.492655 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee47041bae6b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.032855147 +0000 UTC m=+2.747255726,LastTimestamp:2026-03-18 12:10:17.032855147 +0000 UTC m=+2.747255726,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.497271 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee4705c09557 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.060439383 +0000 UTC m=+2.774839962,LastTimestamp:2026-03-18 12:10:17.060439383 +0000 UTC 
m=+2.774839962,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.501416 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee4705fd2c33 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.064410163 +0000 UTC m=+2.778810742,LastTimestamp:2026-03-18 12:10:17.064410163 +0000 UTC m=+2.778810742,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.505589 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189dee470617a0d5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.066143957 +0000 UTC m=+2.780544536,LastTimestamp:2026-03-18 12:10:17.066143957 +0000 UTC m=+2.780544536,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.509470 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee470dd4a14e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.195970894 +0000 UTC m=+2.910371483,LastTimestamp:2026-03-18 12:10:17.195970894 +0000 UTC m=+2.910371483,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.515507 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee470e7792ef openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.206649583 +0000 UTC m=+2.921050172,LastTimestamp:2026-03-18 12:10:17.206649583 +0000 UTC m=+2.921050172,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.519913 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee470ec9b16b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.212031339 +0000 UTC m=+2.926431918,LastTimestamp:2026-03-18 12:10:17.212031339 +0000 UTC m=+2.926431918,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.524296 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" 
in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee471173fe3e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.256746558 +0000 UTC m=+2.971147137,LastTimestamp:2026-03-18 12:10:17.256746558 +0000 UTC m=+2.971147137,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.529443 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee471187b6ce openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.25803899 +0000 UTC m=+2.972439569,LastTimestamp:2026-03-18 12:10:17.25803899 +0000 UTC m=+2.972439569,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.533638 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189dee4711d6a721 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.263212321 +0000 UTC m=+2.977612900,LastTimestamp:2026-03-18 12:10:17.263212321 +0000 UTC m=+2.977612900,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.538325 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee471261fc9f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.272343711 +0000 UTC m=+2.986744290,LastTimestamp:2026-03-18 12:10:17.272343711 +0000 UTC m=+2.986744290,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.542824 4975 event.go:359] "Server rejected event (will not retry!)" err="events 
is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee47128358cc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.274529996 +0000 UTC m=+2.988930565,LastTimestamp:2026-03-18 12:10:17.274529996 +0000 UTC m=+2.988930565,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.548146 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee4712c7dd98 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.27902044 +0000 UTC m=+2.993421019,LastTimestamp:2026-03-18 12:10:17.27902044 +0000 UTC m=+2.993421019,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.552059 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189dee4712efa4e6 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.281627366 +0000 UTC m=+2.996027945,LastTimestamp:2026-03-18 12:10:17.281627366 +0000 UTC m=+2.996027945,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.557116 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee471a4478a6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.40462711 +0000 UTC 
m=+3.119027689,LastTimestamp:2026-03-18 12:10:17.40462711 +0000 UTC m=+3.119027689,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.561707 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee471ad5f683 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.414162051 +0000 UTC m=+3.128562630,LastTimestamp:2026-03-18 12:10:17.414162051 +0000 UTC m=+3.128562630,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.566075 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee471ae9ff1e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.415474974 +0000 UTC m=+3.129875553,LastTimestamp:2026-03-18 12:10:17.415474974 +0000 UTC m=+3.129875553,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.571192 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee4721293c5f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.520282719 +0000 UTC m=+3.234683318,LastTimestamp:2026-03-18 12:10:17.520282719 +0000 UTC m=+3.234683318,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.575372 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee47221526dd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.535743709 +0000 UTC m=+3.250144308,LastTimestamp:2026-03-18 12:10:17.535743709 +0000 UTC m=+3.250144308,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.579869 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee472224f9e7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.536780775 +0000 UTC m=+3.251181354,LastTimestamp:2026-03-18 12:10:17.536780775 +0000 UTC m=+3.251181354,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.583554 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee472651c45d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.606825053 +0000 UTC m=+3.321225642,LastTimestamp:2026-03-18 12:10:17.606825053 +0000 UTC m=+3.321225642,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.585742 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189dee4727127848 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.619454024 +0000 UTC m=+3.333854603,LastTimestamp:2026-03-18 12:10:17.619454024 +0000 UTC m=+3.333854603,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc 
kubenswrapper[4975]: E0318 12:10:49.587254 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee472d178fc6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.720451014 +0000 UTC m=+3.434851593,LastTimestamp:2026-03-18 12:10:17.720451014 +0000 UTC m=+3.434851593,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.589432 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee472e162a43 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.737136707 +0000 UTC m=+3.451537286,LastTimestamp:2026-03-18 12:10:17.737136707 +0000 UTC 
m=+3.451537286,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.592625 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee472e252da5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.738120613 +0000 UTC m=+3.452521192,LastTimestamp:2026-03-18 12:10:17.738120613 +0000 UTC m=+3.452521192,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.596590 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee47379fe78f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.897158543 +0000 UTC m=+3.611559162,LastTimestamp:2026-03-18 12:10:17.897158543 +0000 UTC m=+3.611559162,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.600854 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee47385394b7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.908933815 +0000 UTC m=+3.623334434,LastTimestamp:2026-03-18 12:10:17.908933815 +0000 UTC m=+3.623334434,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.605420 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189dee4738632df2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.909956082 +0000 UTC m=+3.624356651,LastTimestamp:2026-03-18 12:10:17.909956082 +0000 UTC m=+3.624356651,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.609443 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee47428cf047 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:18.080464967 +0000 UTC m=+3.794865546,LastTimestamp:2026-03-18 12:10:18.080464967 +0000 UTC m=+3.794865546,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 
12:10:49.614155 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee4742c7e307 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:18.084328199 +0000 UTC m=+3.798728778,LastTimestamp:2026-03-18 12:10:18.084328199 +0000 UTC m=+3.798728778,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.619478 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee47440deaec openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:18.105694956 +0000 UTC m=+3.820095545,LastTimestamp:2026-03-18 12:10:18.105694956 +0000 UTC 
m=+3.820095545,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.624425 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee474faddad3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:18.300725971 +0000 UTC m=+4.015126550,LastTimestamp:2026-03-18 12:10:18.300725971 +0000 UTC m=+4.015126550,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.628061 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee475055a1ed openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:18.311721453 +0000 UTC m=+4.026122032,LastTimestamp:2026-03-18 12:10:18.311721453 +0000 UTC 
m=+4.026122032,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.633155 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee477f19fddf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:19.096341983 +0000 UTC m=+4.810742562,LastTimestamp:2026-03-18 12:10:19.096341983 +0000 UTC m=+4.810742562,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.636780 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee47883f0ec4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:19.249766084 +0000 UTC 
m=+4.964166673,LastTimestamp:2026-03-18 12:10:19.249766084 +0000 UTC m=+4.964166673,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.640135 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee4788d8fd29 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:19.259854121 +0000 UTC m=+4.974254720,LastTimestamp:2026-03-18 12:10:19.259854121 +0000 UTC m=+4.974254720,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.645609 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee4788ea8ef0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:19.261005552 +0000 UTC m=+4.975406131,LastTimestamp:2026-03-18 12:10:19.261005552 +0000 UTC m=+4.975406131,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.649988 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee47934b1d10 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:19.435105552 +0000 UTC m=+5.149506131,LastTimestamp:2026-03-18 12:10:19.435105552 +0000 UTC m=+5.149506131,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.654631 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee4793f5958e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:19.446277518 +0000 UTC 
m=+5.160678097,LastTimestamp:2026-03-18 12:10:19.446277518 +0000 UTC m=+5.160678097,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.658741 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee4794068240 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:19.447386688 +0000 UTC m=+5.161787267,LastTimestamp:2026-03-18 12:10:19.447386688 +0000 UTC m=+5.161787267,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.662475 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee479f247127 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container 
etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:19.633897767 +0000 UTC m=+5.348298346,LastTimestamp:2026-03-18 12:10:19.633897767 +0000 UTC m=+5.348298346,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.667041 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee479fac31f3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:19.642794483 +0000 UTC m=+5.357195052,LastTimestamp:2026-03-18 12:10:19.642794483 +0000 UTC m=+5.357195052,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.671056 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee479fbd2e76 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:19.643907702 +0000 UTC m=+5.358308281,LastTimestamp:2026-03-18 12:10:19.643907702 +0000 UTC m=+5.358308281,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.675120 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee47ab25e423 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:19.835319331 +0000 UTC m=+5.549719910,LastTimestamp:2026-03-18 12:10:19.835319331 +0000 UTC m=+5.549719910,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.679583 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee47abd87008 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:19.847020552 +0000 UTC m=+5.561421121,LastTimestamp:2026-03-18 12:10:19.847020552 +0000 UTC m=+5.561421121,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.683326 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee47abf54980 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:19.848911232 +0000 UTC m=+5.563311811,LastTimestamp:2026-03-18 12:10:19.848911232 +0000 UTC m=+5.563311811,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.683640 4975 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" 
Mar 18 12:10:49 crc kubenswrapper[4975]: I0318 12:10:49.683780 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:49 crc kubenswrapper[4975]: I0318 12:10:49.685486 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:49 crc kubenswrapper[4975]: I0318 12:10:49.685526 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:49 crc kubenswrapper[4975]: I0318 12:10:49.685538 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:49 crc kubenswrapper[4975]: I0318 12:10:49.685566 4975 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.688317 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee47d0c83d96 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:20.466716054 +0000 UTC m=+6.181116633,LastTimestamp:2026-03-18 12:10:20.466716054 +0000 UTC m=+6.181116633,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.688546 4975 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API 
group \"\" at the cluster scope" node="crc" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.689426 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189dee47d1650719 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:20.476991257 +0000 UTC m=+6.191391836,LastTimestamp:2026-03-18 12:10:20.476991257 +0000 UTC m=+6.191391836,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.700283 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 12:10:49 crc kubenswrapper[4975]: &Event{ObjectMeta:{kube-controller-manager-crc.189dee498c8b4978 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 18 12:10:49 crc kubenswrapper[4975]: body: Mar 18 12:10:49 crc kubenswrapper[4975]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:27.911805304 +0000 UTC m=+13.626205933,LastTimestamp:2026-03-18 12:10:27.911805304 +0000 UTC m=+13.626205933,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 12:10:49 crc kubenswrapper[4975]: > Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.704640 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee498c8d71fc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:27.911946748 +0000 UTC m=+13.626347377,LastTimestamp:2026-03-18 12:10:27.911946748 +0000 UTC m=+13.626347377,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.710198 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189dee4738632df2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee4738632df2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:17.909956082 +0000 UTC m=+3.624356651,LastTimestamp:2026-03-18 12:10:29.132002661 +0000 UTC m=+14.846403240,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.715363 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 12:10:49 crc kubenswrapper[4975]: &Event{ObjectMeta:{kube-apiserver-crc.189dee49de39ed95 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 18 12:10:49 crc kubenswrapper[4975]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 12:10:49 crc kubenswrapper[4975]: Mar 18 12:10:49 crc kubenswrapper[4975]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:29.282205077 +0000 UTC 
m=+14.996605656,LastTimestamp:2026-03-18 12:10:29.282205077 +0000 UTC m=+14.996605656,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 12:10:49 crc kubenswrapper[4975]: > Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.719460 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee49de3a8ada openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:29.282245338 +0000 UTC m=+14.996645917,LastTimestamp:2026-03-18 12:10:29.282245338 +0000 UTC m=+14.996645917,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.723620 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189dee49de39ed95\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 12:10:49 crc kubenswrapper[4975]: &Event{ObjectMeta:{kube-apiserver-crc.189dee49de39ed95 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 18 12:10:49 crc kubenswrapper[4975]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 12:10:49 crc kubenswrapper[4975]: Mar 18 12:10:49 crc kubenswrapper[4975]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:29.282205077 +0000 UTC m=+14.996605656,LastTimestamp:2026-03-18 12:10:29.290027614 +0000 UTC m=+15.004428193,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 12:10:49 crc kubenswrapper[4975]: > Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.727108 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189dee49de3a8ada\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee49de3a8ada openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:29.282245338 +0000 UTC m=+14.996645917,LastTimestamp:2026-03-18 12:10:29.290071486 +0000 UTC m=+15.004472065,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.730451 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189dee47428cf047\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee47428cf047 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:18.080464967 +0000 UTC m=+3.794865546,LastTimestamp:2026-03-18 12:10:29.333754795 +0000 UTC m=+15.048155374,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.734952 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189dee47440deaec\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189dee47440deaec openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container 
kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:18.105694956 +0000 UTC m=+3.820095545,LastTimestamp:2026-03-18 12:10:29.360212117 +0000 UTC m=+15.074612696,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.740641 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 12:10:49 crc kubenswrapper[4975]: &Event{ObjectMeta:{kube-controller-manager-crc.189dee4be098cc93 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 12:10:49 crc kubenswrapper[4975]: body: Mar 18 12:10:49 crc kubenswrapper[4975]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:37.911911571 +0000 UTC m=+23.626312150,LastTimestamp:2026-03-18 12:10:37.911911571 +0000 UTC m=+23.626312150,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 12:10:49 crc kubenswrapper[4975]: > Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.745081 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee4be0999106 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:37.911961862 +0000 UTC m=+23.626362441,LastTimestamp:2026-03-18 12:10:37.911961862 +0000 UTC m=+23.626362441,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.749491 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 12:10:49 crc kubenswrapper[4975]: &Event{ObjectMeta:{kube-controller-manager-crc.189dee4e230fe822 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:34780->192.168.126.11:10357: read: connection reset by peer Mar 18 12:10:49 crc kubenswrapper[4975]: body: Mar 18 12:10:49 crc kubenswrapper[4975]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:47.616948258 +0000 UTC m=+33.331348837,LastTimestamp:2026-03-18 12:10:47.616948258 +0000 UTC m=+33.331348837,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 12:10:49 crc kubenswrapper[4975]: > Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.752790 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee4e23108689 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:34780->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:47.616988809 +0000 UTC m=+33.331389388,LastTimestamp:2026-03-18 12:10:47.616988809 +0000 UTC m=+33.331389388,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.757396 4975 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee4e233b2393 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:47.619781523 +0000 UTC m=+33.334182162,LastTimestamp:2026-03-18 12:10:47.619781523 +0000 UTC m=+33.334182162,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.762696 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189dee46cdb4dcda\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee46cdb4dcda openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:16.120147162 +0000 UTC m=+1.834547751,LastTimestamp:2026-03-18 12:10:48.138474617 +0000 UTC m=+33.852875186,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 
12:10:49.767295 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189dee46e2673f3c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee46e2673f3c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:16.467382076 +0000 UTC m=+2.181782655,LastTimestamp:2026-03-18 12:10:48.301955775 +0000 UTC m=+34.016356364,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: E0318 12:10:49.771003 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189dee46e3249837\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee46e3249837 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:16.479791159 +0000 UTC 
m=+2.194191748,LastTimestamp:2026-03-18 12:10:48.309826054 +0000 UTC m=+34.024226643,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:49 crc kubenswrapper[4975]: I0318 12:10:49.960015 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:50 crc kubenswrapper[4975]: I0318 12:10:50.207182 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:50 crc kubenswrapper[4975]: I0318 12:10:50.208507 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:50 crc kubenswrapper[4975]: I0318 12:10:50.208583 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:50 crc kubenswrapper[4975]: I0318 12:10:50.208611 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:50 crc kubenswrapper[4975]: I0318 12:10:50.957741 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:51 crc kubenswrapper[4975]: I0318 12:10:51.016315 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:51 crc kubenswrapper[4975]: I0318 12:10:51.017463 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:51 crc kubenswrapper[4975]: I0318 12:10:51.017498 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 12:10:51 crc kubenswrapper[4975]: I0318 12:10:51.017508 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:51 crc kubenswrapper[4975]: I0318 12:10:51.018087 4975 scope.go:117] "RemoveContainer" containerID="36cb5567f62f2481b217d6cc40b709bb6eba1576abee9eef87e1db784c79c14c" Mar 18 12:10:51 crc kubenswrapper[4975]: I0318 12:10:51.110948 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:51 crc kubenswrapper[4975]: I0318 12:10:51.211829 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 12:10:51 crc kubenswrapper[4975]: I0318 12:10:51.214423 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7bc6ab8473be61309fd51ea12a0de5ccaf1056cd20065633623efd6428e66e4c"} Mar 18 12:10:51 crc kubenswrapper[4975]: I0318 12:10:51.214442 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:51 crc kubenswrapper[4975]: I0318 12:10:51.214617 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:51 crc kubenswrapper[4975]: I0318 12:10:51.215649 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:51 crc kubenswrapper[4975]: I0318 12:10:51.215696 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:51 crc kubenswrapper[4975]: I0318 12:10:51.215712 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:51 
crc kubenswrapper[4975]: I0318 12:10:51.215825 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:51 crc kubenswrapper[4975]: I0318 12:10:51.215883 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:51 crc kubenswrapper[4975]: I0318 12:10:51.215899 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:51 crc kubenswrapper[4975]: I0318 12:10:51.963447 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:52 crc kubenswrapper[4975]: I0318 12:10:52.219916 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 12:10:52 crc kubenswrapper[4975]: I0318 12:10:52.220586 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 12:10:52 crc kubenswrapper[4975]: I0318 12:10:52.223041 4975 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7bc6ab8473be61309fd51ea12a0de5ccaf1056cd20065633623efd6428e66e4c" exitCode=255 Mar 18 12:10:52 crc kubenswrapper[4975]: I0318 12:10:52.223088 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7bc6ab8473be61309fd51ea12a0de5ccaf1056cd20065633623efd6428e66e4c"} Mar 18 12:10:52 crc kubenswrapper[4975]: I0318 12:10:52.223125 4975 scope.go:117] "RemoveContainer" 
containerID="36cb5567f62f2481b217d6cc40b709bb6eba1576abee9eef87e1db784c79c14c" Mar 18 12:10:52 crc kubenswrapper[4975]: I0318 12:10:52.223321 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:52 crc kubenswrapper[4975]: I0318 12:10:52.224473 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:52 crc kubenswrapper[4975]: I0318 12:10:52.224511 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:52 crc kubenswrapper[4975]: I0318 12:10:52.224523 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:52 crc kubenswrapper[4975]: I0318 12:10:52.225127 4975 scope.go:117] "RemoveContainer" containerID="7bc6ab8473be61309fd51ea12a0de5ccaf1056cd20065633623efd6428e66e4c" Mar 18 12:10:52 crc kubenswrapper[4975]: E0318 12:10:52.225333 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:10:52 crc kubenswrapper[4975]: I0318 12:10:52.959346 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:53 crc kubenswrapper[4975]: I0318 12:10:53.231792 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 12:10:53 crc kubenswrapper[4975]: I0318 
12:10:53.628128 4975 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:10:53 crc kubenswrapper[4975]: I0318 12:10:53.628351 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:53 crc kubenswrapper[4975]: I0318 12:10:53.630159 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:53 crc kubenswrapper[4975]: I0318 12:10:53.630222 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:53 crc kubenswrapper[4975]: I0318 12:10:53.630245 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:53 crc kubenswrapper[4975]: I0318 12:10:53.631353 4975 scope.go:117] "RemoveContainer" containerID="7bc6ab8473be61309fd51ea12a0de5ccaf1056cd20065633623efd6428e66e4c" Mar 18 12:10:53 crc kubenswrapper[4975]: E0318 12:10:53.631679 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:10:53 crc kubenswrapper[4975]: I0318 12:10:53.960303 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:54 crc kubenswrapper[4975]: I0318 12:10:54.219628 4975 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 12:10:54 crc kubenswrapper[4975]: I0318 12:10:54.237974 
4975 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 18 12:10:54 crc kubenswrapper[4975]: I0318 12:10:54.911391 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:10:54 crc kubenswrapper[4975]: I0318 12:10:54.911715 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:54 crc kubenswrapper[4975]: I0318 12:10:54.913190 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:54 crc kubenswrapper[4975]: I0318 12:10:54.913243 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:54 crc kubenswrapper[4975]: I0318 12:10:54.913258 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:54 crc kubenswrapper[4975]: I0318 12:10:54.960376 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:55 crc kubenswrapper[4975]: E0318 12:10:55.099752 4975 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 12:10:55 crc kubenswrapper[4975]: I0318 12:10:55.963432 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:56 crc kubenswrapper[4975]: I0318 12:10:56.688741 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:10:56 
crc kubenswrapper[4975]: E0318 12:10:56.688958 4975 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 12:10:56 crc kubenswrapper[4975]: I0318 12:10:56.689802 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:10:56 crc kubenswrapper[4975]: I0318 12:10:56.689850 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:10:56 crc kubenswrapper[4975]: I0318 12:10:56.689886 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:10:56 crc kubenswrapper[4975]: I0318 12:10:56.689916 4975 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:10:56 crc kubenswrapper[4975]: E0318 12:10:56.696524 4975 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 12:10:56 crc kubenswrapper[4975]: I0318 12:10:56.959591 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:57 crc kubenswrapper[4975]: I0318 12:10:57.912457 4975 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 12:10:57 crc 
kubenswrapper[4975]: I0318 12:10:57.912662 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 12:10:57 crc kubenswrapper[4975]: E0318 12:10:57.919079 4975 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189dee4be098cc93\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 12:10:57 crc kubenswrapper[4975]: &Event{ObjectMeta:{kube-controller-manager-crc.189dee4be098cc93 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 12:10:57 crc kubenswrapper[4975]: body: Mar 18 12:10:57 crc kubenswrapper[4975]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:37.911911571 +0000 UTC m=+23.626312150,LastTimestamp:2026-03-18 12:10:57.912618727 +0000 UTC m=+43.627019396,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 12:10:57 crc kubenswrapper[4975]: > Mar 18 12:10:57 crc kubenswrapper[4975]: E0318 12:10:57.926493 4975 event.go:359] "Server rejected event (will not retry!)" 
err="events \"kube-controller-manager-crc.189dee4be0999106\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189dee4be0999106 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:10:37.911961862 +0000 UTC m=+23.626362441,LastTimestamp:2026-03-18 12:10:57.912706219 +0000 UTC m=+43.627106798,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:10:57 crc kubenswrapper[4975]: I0318 12:10:57.960416 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:58 crc kubenswrapper[4975]: I0318 12:10:58.963104 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:10:59 crc kubenswrapper[4975]: I0318 12:10:59.961459 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in 
API group "storage.k8s.io" at the cluster scope Mar 18 12:11:00 crc kubenswrapper[4975]: I0318 12:11:00.111532 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:11:00 crc kubenswrapper[4975]: I0318 12:11:00.111703 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:11:00 crc kubenswrapper[4975]: I0318 12:11:00.112669 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:00 crc kubenswrapper[4975]: I0318 12:11:00.112802 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:00 crc kubenswrapper[4975]: I0318 12:11:00.112913 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:00 crc kubenswrapper[4975]: I0318 12:11:00.113699 4975 scope.go:117] "RemoveContainer" containerID="7bc6ab8473be61309fd51ea12a0de5ccaf1056cd20065633623efd6428e66e4c" Mar 18 12:11:00 crc kubenswrapper[4975]: E0318 12:11:00.114015 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:11:00 crc kubenswrapper[4975]: I0318 12:11:00.960035 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:11:01 crc kubenswrapper[4975]: W0318 12:11:01.532625 4975 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to 
list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 18 12:11:01 crc kubenswrapper[4975]: E0318 12:11:01.532708 4975 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 18 12:11:01 crc kubenswrapper[4975]: I0318 12:11:01.958452 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:11:02 crc kubenswrapper[4975]: W0318 12:11:02.350738 4975 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 18 12:11:02 crc kubenswrapper[4975]: E0318 12:11:02.350836 4975 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 18 12:11:02 crc kubenswrapper[4975]: W0318 12:11:02.522844 4975 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 18 12:11:02 crc kubenswrapper[4975]: E0318 12:11:02.522913 4975 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is 
forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 18 12:11:02 crc kubenswrapper[4975]: I0318 12:11:02.964278 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:11:03 crc kubenswrapper[4975]: I0318 12:11:03.368076 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 12:11:03 crc kubenswrapper[4975]: I0318 12:11:03.368228 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:11:03 crc kubenswrapper[4975]: I0318 12:11:03.369501 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:03 crc kubenswrapper[4975]: I0318 12:11:03.369568 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:03 crc kubenswrapper[4975]: I0318 12:11:03.369913 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:03 crc kubenswrapper[4975]: E0318 12:11:03.693830 4975 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 12:11:03 crc kubenswrapper[4975]: I0318 12:11:03.696898 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:11:03 crc kubenswrapper[4975]: I0318 12:11:03.698117 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:03 
crc kubenswrapper[4975]: I0318 12:11:03.698158 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:03 crc kubenswrapper[4975]: I0318 12:11:03.698171 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:03 crc kubenswrapper[4975]: I0318 12:11:03.698199 4975 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:11:03 crc kubenswrapper[4975]: E0318 12:11:03.702376 4975 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 12:11:03 crc kubenswrapper[4975]: W0318 12:11:03.729128 4975 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 18 12:11:03 crc kubenswrapper[4975]: E0318 12:11:03.729180 4975 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 18 12:11:03 crc kubenswrapper[4975]: I0318 12:11:03.960410 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:11:04 crc kubenswrapper[4975]: I0318 12:11:04.915513 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:11:04 crc kubenswrapper[4975]: I0318 
12:11:04.915648 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:11:04 crc kubenswrapper[4975]: I0318 12:11:04.916584 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:04 crc kubenswrapper[4975]: I0318 12:11:04.916675 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:04 crc kubenswrapper[4975]: I0318 12:11:04.916691 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:04 crc kubenswrapper[4975]: I0318 12:11:04.919246 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:11:04 crc kubenswrapper[4975]: I0318 12:11:04.960626 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:11:05 crc kubenswrapper[4975]: E0318 12:11:05.099836 4975 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 12:11:05 crc kubenswrapper[4975]: I0318 12:11:05.265729 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:11:05 crc kubenswrapper[4975]: I0318 12:11:05.267412 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:05 crc kubenswrapper[4975]: I0318 12:11:05.267459 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:05 crc kubenswrapper[4975]: I0318 12:11:05.267475 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 18 12:11:06 crc kubenswrapper[4975]: I0318 12:11:06.062610 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:11:06 crc kubenswrapper[4975]: I0318 12:11:06.958917 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:11:07 crc kubenswrapper[4975]: I0318 12:11:07.960334 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:11:08 crc kubenswrapper[4975]: I0318 12:11:08.962351 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:11:09 crc kubenswrapper[4975]: I0318 12:11:09.959174 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:11:10 crc kubenswrapper[4975]: E0318 12:11:10.700474 4975 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 12:11:10 crc kubenswrapper[4975]: I0318 12:11:10.703498 4975 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 18 12:11:10 crc kubenswrapper[4975]: I0318 12:11:10.705121 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:10 crc kubenswrapper[4975]: I0318 12:11:10.705169 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:10 crc kubenswrapper[4975]: I0318 12:11:10.705180 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:10 crc kubenswrapper[4975]: I0318 12:11:10.705208 4975 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:11:10 crc kubenswrapper[4975]: E0318 12:11:10.709834 4975 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 12:11:10 crc kubenswrapper[4975]: I0318 12:11:10.959449 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:11:11 crc kubenswrapper[4975]: I0318 12:11:11.015511 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:11:11 crc kubenswrapper[4975]: I0318 12:11:11.017157 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:11 crc kubenswrapper[4975]: I0318 12:11:11.017195 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:11 crc kubenswrapper[4975]: I0318 12:11:11.017206 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:11 crc kubenswrapper[4975]: I0318 
12:11:11.017749 4975 scope.go:117] "RemoveContainer" containerID="7bc6ab8473be61309fd51ea12a0de5ccaf1056cd20065633623efd6428e66e4c" Mar 18 12:11:11 crc kubenswrapper[4975]: E0318 12:11:11.017960 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:11:11 crc kubenswrapper[4975]: I0318 12:11:11.960245 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:11:12 crc kubenswrapper[4975]: I0318 12:11:12.959149 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:11:13 crc kubenswrapper[4975]: I0318 12:11:13.958343 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:11:14 crc kubenswrapper[4975]: I0318 12:11:14.959774 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:11:15 crc kubenswrapper[4975]: E0318 12:11:15.100023 4975 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not 
found" Mar 18 12:11:15 crc kubenswrapper[4975]: I0318 12:11:15.960349 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:11:16 crc kubenswrapper[4975]: I0318 12:11:16.960538 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:11:17 crc kubenswrapper[4975]: E0318 12:11:17.706246 4975 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 12:11:17 crc kubenswrapper[4975]: I0318 12:11:17.710426 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:11:17 crc kubenswrapper[4975]: I0318 12:11:17.711501 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:17 crc kubenswrapper[4975]: I0318 12:11:17.711534 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:17 crc kubenswrapper[4975]: I0318 12:11:17.711546 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:17 crc kubenswrapper[4975]: I0318 12:11:17.711571 4975 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:11:17 crc kubenswrapper[4975]: E0318 12:11:17.715025 4975 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" 
at the cluster scope" node="crc" Mar 18 12:11:17 crc kubenswrapper[4975]: I0318 12:11:17.963952 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:11:18 crc kubenswrapper[4975]: I0318 12:11:18.959536 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:11:19 crc kubenswrapper[4975]: I0318 12:11:19.958761 4975 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 12:11:20 crc kubenswrapper[4975]: I0318 12:11:20.112914 4975 csr.go:261] certificate signing request csr-sn9h4 is approved, waiting to be issued Mar 18 12:11:20 crc kubenswrapper[4975]: I0318 12:11:20.120216 4975 csr.go:257] certificate signing request csr-sn9h4 is issued Mar 18 12:11:20 crc kubenswrapper[4975]: I0318 12:11:20.153261 4975 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 18 12:11:20 crc kubenswrapper[4975]: I0318 12:11:20.759573 4975 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 18 12:11:21 crc kubenswrapper[4975]: I0318 12:11:21.121332 4975 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-07 16:41:13.033407046 +0000 UTC Mar 18 12:11:21 crc kubenswrapper[4975]: I0318 12:11:21.121391 4975 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7084h29m51.91201957s for next certificate rotation 
Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.015701 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.016930 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.016971 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.016983 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.017700 4975 scope.go:117] "RemoveContainer" containerID="7bc6ab8473be61309fd51ea12a0de5ccaf1056cd20065633623efd6428e66e4c" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.319762 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.322470 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc"} Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.322625 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.323498 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.323529 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.323540 4975 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.715333 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.716492 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.716624 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.716713 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.716908 4975 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.723838 4975 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.724150 4975 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 18 12:11:24 crc kubenswrapper[4975]: E0318 12:11:24.724224 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.726895 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.726931 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.726942 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.726964 4975 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.726975 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:24Z","lastTransitionTime":"2026-03-18T12:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:24 crc kubenswrapper[4975]: E0318 12:11:24.738864 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.745221 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.745250 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.745260 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.745275 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.745287 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:24Z","lastTransitionTime":"2026-03-18T12:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:24 crc kubenswrapper[4975]: E0318 12:11:24.755229 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.760563 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.760584 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.760591 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.760606 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.760615 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:24Z","lastTransitionTime":"2026-03-18T12:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:24 crc kubenswrapper[4975]: E0318 12:11:24.769823 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.776544 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.776583 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.776595 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.776610 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:24 crc kubenswrapper[4975]: I0318 12:11:24.776623 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:24Z","lastTransitionTime":"2026-03-18T12:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:24 crc kubenswrapper[4975]: E0318 12:11:24.786346 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:24 crc kubenswrapper[4975]: E0318 12:11:24.786496 4975 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:11:24 crc kubenswrapper[4975]: E0318 12:11:24.786519 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:24 crc kubenswrapper[4975]: E0318 12:11:24.887429 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:24 crc kubenswrapper[4975]: E0318 12:11:24.988199 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:25 crc kubenswrapper[4975]: E0318 12:11:25.088493 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:25 crc kubenswrapper[4975]: E0318 12:11:25.100753 4975 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 12:11:25 crc kubenswrapper[4975]: E0318 12:11:25.189067 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:25 crc kubenswrapper[4975]: E0318 12:11:25.289412 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:25 crc kubenswrapper[4975]: E0318 12:11:25.389760 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:25 crc kubenswrapper[4975]: E0318 12:11:25.490287 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:25 crc kubenswrapper[4975]: 
E0318 12:11:25.591146 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:25 crc kubenswrapper[4975]: E0318 12:11:25.691326 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:25 crc kubenswrapper[4975]: E0318 12:11:25.792223 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:25 crc kubenswrapper[4975]: E0318 12:11:25.892511 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:25 crc kubenswrapper[4975]: E0318 12:11:25.993172 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:26 crc kubenswrapper[4975]: E0318 12:11:26.093693 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:26 crc kubenswrapper[4975]: E0318 12:11:26.194780 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:26 crc kubenswrapper[4975]: E0318 12:11:26.295300 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:26 crc kubenswrapper[4975]: I0318 12:11:26.328783 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 12:11:26 crc kubenswrapper[4975]: I0318 12:11:26.329403 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 12:11:26 crc kubenswrapper[4975]: I0318 12:11:26.331469 4975 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc" exitCode=255 Mar 18 12:11:26 crc kubenswrapper[4975]: I0318 12:11:26.331507 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc"} Mar 18 12:11:26 crc kubenswrapper[4975]: I0318 12:11:26.331545 4975 scope.go:117] "RemoveContainer" containerID="7bc6ab8473be61309fd51ea12a0de5ccaf1056cd20065633623efd6428e66e4c" Mar 18 12:11:26 crc kubenswrapper[4975]: I0318 12:11:26.331721 4975 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 12:11:26 crc kubenswrapper[4975]: I0318 12:11:26.333031 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:26 crc kubenswrapper[4975]: I0318 12:11:26.333073 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:26 crc kubenswrapper[4975]: I0318 12:11:26.333084 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:26 crc kubenswrapper[4975]: I0318 12:11:26.333760 4975 scope.go:117] "RemoveContainer" containerID="44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc" Mar 18 12:11:26 crc kubenswrapper[4975]: E0318 12:11:26.333927 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:11:26 crc kubenswrapper[4975]: E0318 12:11:26.395644 4975 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:26 crc kubenswrapper[4975]: E0318 12:11:26.496750 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:26 crc kubenswrapper[4975]: E0318 12:11:26.597314 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:26 crc kubenswrapper[4975]: E0318 12:11:26.697630 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:26 crc kubenswrapper[4975]: E0318 12:11:26.798033 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:26 crc kubenswrapper[4975]: E0318 12:11:26.899128 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:26 crc kubenswrapper[4975]: E0318 12:11:26.999689 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:27 crc kubenswrapper[4975]: E0318 12:11:27.100425 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:27 crc kubenswrapper[4975]: E0318 12:11:27.201462 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:27 crc kubenswrapper[4975]: E0318 12:11:27.301929 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:27 crc kubenswrapper[4975]: I0318 12:11:27.336481 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 12:11:27 crc kubenswrapper[4975]: E0318 12:11:27.403019 4975 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 18 12:11:27 crc kubenswrapper[4975]: E0318 12:11:27.503351 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:27 crc kubenswrapper[4975]: E0318 12:11:27.604063 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:27 crc kubenswrapper[4975]: E0318 12:11:27.705149 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:27 crc kubenswrapper[4975]: E0318 12:11:27.805313 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:27 crc kubenswrapper[4975]: E0318 12:11:27.906383 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:28 crc kubenswrapper[4975]: E0318 12:11:28.007608 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:28 crc kubenswrapper[4975]: E0318 12:11:28.107924 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:28 crc kubenswrapper[4975]: E0318 12:11:28.208962 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:28 crc kubenswrapper[4975]: E0318 12:11:28.310164 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:28 crc kubenswrapper[4975]: E0318 12:11:28.410525 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:28 crc kubenswrapper[4975]: E0318 12:11:28.511032 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:28 crc kubenswrapper[4975]: E0318 12:11:28.611924 4975 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:28 crc kubenswrapper[4975]: E0318 12:11:28.712725 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:28 crc kubenswrapper[4975]: E0318 12:11:28.813268 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:28 crc kubenswrapper[4975]: E0318 12:11:28.913522 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:29 crc kubenswrapper[4975]: E0318 12:11:29.013607 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:29 crc kubenswrapper[4975]: E0318 12:11:29.114171 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:29 crc kubenswrapper[4975]: E0318 12:11:29.214951 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:29 crc kubenswrapper[4975]: E0318 12:11:29.315836 4975 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.324314 4975 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.418800 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.418829 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.418837 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:29 crc 
kubenswrapper[4975]: I0318 12:11:29.418849 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.418858 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:29Z","lastTransitionTime":"2026-03-18T12:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.521298 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.521341 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.521351 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.521366 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.521378 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:29Z","lastTransitionTime":"2026-03-18T12:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.624933 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.624999 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.625021 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.625046 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.625064 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:29Z","lastTransitionTime":"2026-03-18T12:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.726792 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.726826 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.726839 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.726856 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.726926 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:29Z","lastTransitionTime":"2026-03-18T12:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.829342 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.829386 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.829401 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.829421 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.829435 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:29Z","lastTransitionTime":"2026-03-18T12:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.932218 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.932275 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.932292 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.932317 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:29 crc kubenswrapper[4975]: I0318 12:11:29.932335 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:29Z","lastTransitionTime":"2026-03-18T12:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.007332 4975 apiserver.go:52] "Watching apiserver" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.015427 4975 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.016028 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.016715 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.016820 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.016905 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.016955 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.016965 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.017551 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.017663 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.017987 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.018267 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.020147 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.020313 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.021004 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.021222 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.021508 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.021615 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.021759 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.023245 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.023514 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.035509 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:30 crc 
kubenswrapper[4975]: I0318 12:11:30.035569 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.035587 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.035613 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.035630 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:30Z","lastTransitionTime":"2026-03-18T12:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.056355 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.058713 4975 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.073114 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.087812 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.098172 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.110937 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.111207 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.113507 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" 
(UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.113548 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.113571 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.113595 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.113617 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.113636 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.113653 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.113681 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.113701 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.113722 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.113744 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.113768 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") 
" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.113858 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.113893 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.113923 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.113946 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.113968 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.113990 4975 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114015 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114041 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114065 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114088 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114113 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114136 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114157 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114180 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114204 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114227 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 
12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114247 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114267 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114288 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114308 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114012 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114036 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114050 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114246 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114253 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.114329 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:11:30.614309858 +0000 UTC m=+76.328710457 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114589 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114617 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114634 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 
12:11:30.114650 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114653 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114672 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114689 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114697 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114704 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114736 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114755 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114777 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114799 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114855 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114892 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114899 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114918 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114942 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114967 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114989 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115012 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115036 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115056 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115078 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115101 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115121 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115142 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115161 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115180 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115200 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 
18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115220 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115243 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115263 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115284 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115306 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115327 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115349 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115408 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115436 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115461 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115489 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 
12:11:30.115516 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115545 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115568 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115595 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115618 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115639 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115663 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115685 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115704 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115726 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115748 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115767 4975 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115790 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115812 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115833 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115856 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115904 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115928 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115953 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115979 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116004 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116031 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116055 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116083 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116109 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116135 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116158 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116182 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 12:11:30 crc 
kubenswrapper[4975]: I0318 12:11:30.116205 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116230 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116252 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116277 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116298 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116318 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116339 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116360 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116382 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116404 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116429 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:11:30 
crc kubenswrapper[4975]: I0318 12:11:30.116456 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116480 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116504 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116524 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116548 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116572 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116594 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116619 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116644 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116668 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116694 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 12:11:30 crc kubenswrapper[4975]: 
I0318 12:11:30.116718 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116743 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116766 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116788 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114911 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114978 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114481 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114513 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114502 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114981 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115028 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115066 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115254 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115248 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115330 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115413 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115409 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.114369 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115510 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115530 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115569 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115572 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115669 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115668 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115682 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115788 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115853 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.115924 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116026 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116349 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116403 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116515 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116589 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116746 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116803 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116944 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.117511 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.118381 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.118648 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.118679 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.118718 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.118783 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.118794 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.118807 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.120376 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.120334 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.120411 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.120496 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.120547 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.121074 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.121341 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.121431 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.121514 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.121663 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.121693 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.121770 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.121811 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.122038 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.122071 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.122199 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.122261 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.122365 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.122519 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.122606 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.122762 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.123217 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.116812 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.126062 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.126098 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.126128 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.126155 4975 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.126179 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.126202 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.126229 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.126253 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.126279 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.126303 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.126385 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.126449 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.126476 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.126509 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.126563 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.126590 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.126613 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.126639 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.126662 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.124845 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.126686 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.124846 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.124855 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.124954 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.124965 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.125010 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.124941 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.125323 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.126776 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.126707 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.127755 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128136 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128174 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128207 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128247 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128280 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128314 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128348 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128381 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128413 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128447 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128519 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128556 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: 
\"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128589 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128626 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128659 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128690 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128724 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128758 4975 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128789 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128819 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128850 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128927 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128961 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128994 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.129031 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.129066 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.129099 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.129133 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.129166 4975 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.129199 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.129231 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.129263 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.129542 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.129638 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" 
(UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.129670 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.129707 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.129739 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.129769 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.129802 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.129833 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.129887 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.129919 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.129944 4975 scope.go:117] "RemoveContainer" containerID="44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc" Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.130131 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.129952 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.130338 4975 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.130365 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.130386 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.130423 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.130451 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.130477 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.130529 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.130558 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.130583 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.130609 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.130635 4975 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.130664 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.130689 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.130712 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.130731 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.130757 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.130788 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.130821 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.130841 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.130976 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131012 4975 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131022 4975 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131032 4975 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131043 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131053 4975 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131063 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131073 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131082 4975 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131092 4975 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131101 4975 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131111 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131121 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131131 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131141 4975 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131150 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131159 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131168 4975 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131177 4975 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131186 4975 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131194 4975 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131203 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 
12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131213 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131221 4975 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131230 4975 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131239 4975 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131247 4975 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131256 4975 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131265 4975 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131275 4975 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131285 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131305 4975 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131319 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131331 4975 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131341 4975 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131352 4975 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131363 4975 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131373 4975 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131384 4975 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131394 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131405 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131418 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131429 4975 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131440 4975 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131452 4975 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131467 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131480 4975 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131492 4975 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131504 4975 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131513 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131522 4975 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131531 4975 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131540 4975 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131549 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131558 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131567 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131576 4975 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131585 4975 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131596 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131604 4975 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131613 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131622 4975 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131630 4975 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131640 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131648 4975 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131657 4975 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" 
Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131665 4975 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131674 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131682 4975 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131691 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131700 4975 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131709 4975 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131717 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131729 4975 reconciler_common.go:293] 
"Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131749 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131763 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131774 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131786 4975 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131800 4975 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131809 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.125338 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.132197 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.125375 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.125498 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.125573 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.125608 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.125641 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.126093 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.126248 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.126654 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.127173 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.127200 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.127279 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.127390 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.127576 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.127576 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.127588 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.127649 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.127705 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.127785 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128065 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128061 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128154 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128196 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.128787 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.129263 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.130332 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.130601 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.130910 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.130955 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131178 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131282 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131479 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131546 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131584 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131747 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131821 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.131863 4975 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.132582 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:30.632561046 +0000 UTC m=+76.346961625 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.132599 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.132829 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.131880 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.132873 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.132202 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.132224 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.132436 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.132433 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.133170 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.133342 4975 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.134762 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.134793 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:30.634776223 +0000 UTC m=+76.349176802 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.133891 4975 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.134694 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.135045 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.135273 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.135758 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.135836 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.135899 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.136196 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.135732 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.136492 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.136600 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.133630 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.133642 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.133566 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.133831 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.133908 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.134006 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.133955 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.134013 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.134242 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.134623 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.136912 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.137108 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.137567 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.137764 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.137970 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.138315 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.138411 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.138732 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.139044 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.139046 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.140728 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.140751 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.140761 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.140776 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.140786 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:30Z","lastTransitionTime":"2026-03-18T12:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.140923 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.141345 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.146269 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.146994 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.147411 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.147485 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.147642 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.147659 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.147672 4975 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.147729 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-18 12:11:30.647708624 +0000 UTC m=+76.362109283 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.148099 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.148453 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.148656 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.148671 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.148682 4975 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.148725 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:30.648713849 +0000 UTC m=+76.363114518 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.156359 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.156381 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.156795 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.156863 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.156387 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.157198 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.157388 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.157325 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.156269 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.157631 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.157701 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.157743 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.157969 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.158022 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.158463 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.161116 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.161930 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.162185 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.162436 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.162519 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.162572 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.162951 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.163005 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.163132 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.163198 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.163478 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.164155 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.164345 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.164449 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.166231 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.166749 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.167341 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.167384 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.167460 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: 
"kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.167631 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.169819 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.169900 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.173251 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.177402 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.179231 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.181899 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.194032 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.198957 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.200745 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.204036 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.212605 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.223424 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.231246 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.232708 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.232737 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.232804 4975 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.232816 4975 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.232826 4975 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.232835 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.232843 4975 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.232852 4975 
reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.232861 4975 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.232883 4975 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.232891 4975 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.232899 4975 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.232906 4975 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.232914 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.232922 4975 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.232930 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.232939 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.232947 4975 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.232954 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.232964 4975 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.232972 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.232980 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node 
\"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.232987 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.232995 4975 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233004 4975 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233011 4975 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233021 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233029 4975 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233038 4975 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 
12:11:30.233046 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233055 4975 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233078 4975 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233087 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233095 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233104 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233113 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233121 4975 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" 
(UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233130 4975 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233138 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233147 4975 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233155 4975 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233165 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233173 4975 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233182 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233190 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233198 4975 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233207 4975 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233215 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233224 4975 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233232 4975 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233239 4975 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on 
node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233249 4975 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233257 4975 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233264 4975 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233272 4975 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233280 4975 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233288 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233296 4975 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233305 4975 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233313 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233321 4975 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233330 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233338 4975 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233347 4975 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233355 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233363 4975 reconciler_common.go:293] "Volume detached for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233371 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233379 4975 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233388 4975 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233401 4975 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233413 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233423 4975 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233434 4975 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233446 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233457 4975 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233468 4975 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233479 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233492 4975 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233502 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233511 4975 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233519 4975 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233528 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233536 4975 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233544 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233552 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233559 4975 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233567 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node 
\"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233575 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233583 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233591 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233599 4975 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233607 4975 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233614 4975 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233622 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233630 4975 
reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233638 4975 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233647 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233656 4975 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233664 4975 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233672 4975 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233681 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233689 4975 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233698 4975 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233706 4975 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233714 4975 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233734 4975 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233744 4975 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233752 4975 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233760 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node 
\"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233769 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233777 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233785 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233793 4975 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233800 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233808 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.233816 4975 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 
12:11:30.233824 4975 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.234116 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.234140 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.243016 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.243040 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.243048 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.243061 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.243069 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:30Z","lastTransitionTime":"2026-03-18T12:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.245666 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.338298 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.345486 4975 scope.go:117] "RemoveContainer" containerID="44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc" Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.345612 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.345901 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.345920 4975 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.345929 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.345940 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.345948 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:30Z","lastTransitionTime":"2026-03-18T12:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.351004 4975 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:30 crc kubenswrapper[4975]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 12:11:30 crc kubenswrapper[4975]: set -o allexport Mar 18 12:11:30 crc kubenswrapper[4975]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 12:11:30 crc kubenswrapper[4975]: source /etc/kubernetes/apiserver-url.env Mar 18 12:11:30 crc kubenswrapper[4975]: else Mar 18 12:11:30 crc kubenswrapper[4975]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 12:11:30 crc kubenswrapper[4975]: exit 1 Mar 18 12:11:30 crc kubenswrapper[4975]: fi Mar 18 12:11:30 crc kubenswrapper[4975]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 12:11:30 crc kubenswrapper[4975]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:30 crc kubenswrapper[4975]: > logger="UnhandledError" Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.352591 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.353921 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 12:11:30 crc kubenswrapper[4975]: W0318 12:11:30.364326 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-fb70e113c078cc8cee21fc94308284b7c61d6f291ec731dad885bfb51e806694 WatchSource:0}: Error finding container fb70e113c078cc8cee21fc94308284b7c61d6f291ec731dad885bfb51e806694: Status 404 returned error can't find the container with id fb70e113c078cc8cee21fc94308284b7c61d6f291ec731dad885bfb51e806694 Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.365767 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.367321 4975 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:30 crc kubenswrapper[4975]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 12:11:30 crc kubenswrapper[4975]: if [[ -f "/env/_master" ]]; then Mar 18 12:11:30 crc kubenswrapper[4975]: set -o allexport Mar 18 12:11:30 crc kubenswrapper[4975]: source "/env/_master" Mar 18 12:11:30 crc kubenswrapper[4975]: set +o allexport Mar 18 12:11:30 crc kubenswrapper[4975]: fi Mar 18 12:11:30 crc kubenswrapper[4975]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 18 12:11:30 crc kubenswrapper[4975]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 12:11:30 crc kubenswrapper[4975]: ho_enable="--enable-hybrid-overlay" Mar 18 12:11:30 crc kubenswrapper[4975]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 12:11:30 crc kubenswrapper[4975]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 12:11:30 crc kubenswrapper[4975]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 12:11:30 crc kubenswrapper[4975]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 12:11:30 crc kubenswrapper[4975]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 12:11:30 crc kubenswrapper[4975]: --webhook-host=127.0.0.1 \ Mar 18 12:11:30 crc kubenswrapper[4975]: --webhook-port=9743 \ Mar 18 12:11:30 crc kubenswrapper[4975]: ${ho_enable} \ Mar 18 12:11:30 crc kubenswrapper[4975]: --enable-interconnect \ Mar 18 12:11:30 crc kubenswrapper[4975]: --disable-approver \ Mar 18 12:11:30 crc kubenswrapper[4975]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 12:11:30 crc kubenswrapper[4975]: --wait-for-kubernetes-api=200s \ Mar 18 12:11:30 crc kubenswrapper[4975]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 12:11:30 crc kubenswrapper[4975]: --loglevel="${LOGLEVEL}" Mar 18 12:11:30 crc kubenswrapper[4975]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:30 crc kubenswrapper[4975]: > logger="UnhandledError" Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.371482 4975 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:30 crc kubenswrapper[4975]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 12:11:30 crc 
kubenswrapper[4975]: if [[ -f "/env/_master" ]]; then Mar 18 12:11:30 crc kubenswrapper[4975]: set -o allexport Mar 18 12:11:30 crc kubenswrapper[4975]: source "/env/_master" Mar 18 12:11:30 crc kubenswrapper[4975]: set +o allexport Mar 18 12:11:30 crc kubenswrapper[4975]: fi Mar 18 12:11:30 crc kubenswrapper[4975]: Mar 18 12:11:30 crc kubenswrapper[4975]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 12:11:30 crc kubenswrapper[4975]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 12:11:30 crc kubenswrapper[4975]: --disable-webhook \ Mar 18 12:11:30 crc kubenswrapper[4975]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 12:11:30 crc kubenswrapper[4975]: --loglevel="${LOGLEVEL}" Mar 18 12:11:30 crc kubenswrapper[4975]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:30 crc kubenswrapper[4975]: > logger="UnhandledError" Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.373233 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 12:11:30 crc kubenswrapper[4975]: W0318 12:11:30.376448 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-181b99ecde497a0e242e67ea058f10879e489fc1aa6b13e59a5980231e05bcdd WatchSource:0}: Error finding container 181b99ecde497a0e242e67ea058f10879e489fc1aa6b13e59a5980231e05bcdd: Status 404 returned error can't find the container with id 181b99ecde497a0e242e67ea058f10879e489fc1aa6b13e59a5980231e05bcdd Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.378401 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.379643 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.448228 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.448624 
4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.448820 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.449053 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.449349 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:30Z","lastTransitionTime":"2026-03-18T12:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.552532 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.552597 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.552619 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.552651 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.552674 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:30Z","lastTransitionTime":"2026-03-18T12:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.637822 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.638007 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:11:31.637976388 +0000 UTC m=+77.352377057 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.638393 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.638538 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.638562 4975 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.638650 4975 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.638770 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:31.638750188 +0000 UTC m=+77.353150767 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.638853 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:31.63883293 +0000 UTC m=+77.353233539 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.654787 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.655096 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.655214 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.655337 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.655428 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:30Z","lastTransitionTime":"2026-03-18T12:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.738973 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.739247 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.739162 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.739421 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.739530 4975 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.739330 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:30 crc 
kubenswrapper[4975]: E0318 12:11:30.739722 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.739750 4975 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.739641 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:31.739623541 +0000 UTC m=+77.454024120 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:30 crc kubenswrapper[4975]: E0318 12:11:30.739836 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:31.739817216 +0000 UTC m=+77.454217805 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.757538 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.757594 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.757613 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.757638 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.757656 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:30Z","lastTransitionTime":"2026-03-18T12:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.860230 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.860314 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.860334 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.860358 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.860376 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:30Z","lastTransitionTime":"2026-03-18T12:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.963694 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.963765 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.963792 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.963822 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:30 crc kubenswrapper[4975]: I0318 12:11:30.963845 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:30Z","lastTransitionTime":"2026-03-18T12:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.023529 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.025196 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.027788 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.029106 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.030932 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.032016 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.033259 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.034466 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.035752 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.036596 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.037294 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.038657 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.039223 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.040152 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.040681 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.041724 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.042976 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.043445 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.045190 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.046114 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.046984 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.048506 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.049179 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.050752 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.051590 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.053134 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.054089 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.055160 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.055819 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.056405 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.057288 4975 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.057398 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.059137 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.060101 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.060519 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.062427 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.064019 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.064781 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.066363 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.067531 4975 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.067687 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.067777 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.067931 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.067693 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.068076 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:31Z","lastTransitionTime":"2026-03-18T12:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.069015 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.069852 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.071422 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.072918 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.073581 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.075037 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.076069 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.077771 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" 
path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.078473 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.079754 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.080488 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.081383 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.083175 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.083823 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.084913 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.171153 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.171223 4975 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.171243 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.171266 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.171286 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:31Z","lastTransitionTime":"2026-03-18T12:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.274169 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.274526 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.274552 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.274582 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.274604 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:31Z","lastTransitionTime":"2026-03-18T12:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.347829 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"1f6302738f41d39942bc80eae518653e688b64d4d19619be7828b29390b30d45"} Mar 18 12:11:31 crc kubenswrapper[4975]: E0318 12:11:31.349016 4975 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:31 crc kubenswrapper[4975]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 12:11:31 crc kubenswrapper[4975]: set -o allexport Mar 18 12:11:31 crc kubenswrapper[4975]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 12:11:31 crc kubenswrapper[4975]: source /etc/kubernetes/apiserver-url.env Mar 18 12:11:31 crc kubenswrapper[4975]: else Mar 18 12:11:31 crc kubenswrapper[4975]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 12:11:31 crc kubenswrapper[4975]: exit 1 Mar 18 12:11:31 crc kubenswrapper[4975]: fi Mar 18 12:11:31 crc kubenswrapper[4975]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 12:11:31 crc kubenswrapper[4975]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:31 crc kubenswrapper[4975]: > logger="UnhandledError" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.349274 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"181b99ecde497a0e242e67ea058f10879e489fc1aa6b13e59a5980231e05bcdd"} Mar 18 12:11:31 crc kubenswrapper[4975]: E0318 12:11:31.350160 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.351149 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"fb70e113c078cc8cee21fc94308284b7c61d6f291ec731dad885bfb51e806694"} Mar 18 12:11:31 crc kubenswrapper[4975]: E0318 12:11:31.352701 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 12:11:31 crc kubenswrapper[4975]: E0318 12:11:31.353151 4975 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:31 crc kubenswrapper[4975]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 12:11:31 crc kubenswrapper[4975]: if [[ -f "/env/_master" ]]; then Mar 18 12:11:31 crc kubenswrapper[4975]: set -o allexport Mar 18 12:11:31 crc kubenswrapper[4975]: source "/env/_master" Mar 18 12:11:31 crc kubenswrapper[4975]: set +o allexport Mar 18 12:11:31 crc 
kubenswrapper[4975]: fi Mar 18 12:11:31 crc kubenswrapper[4975]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 18 12:11:31 crc kubenswrapper[4975]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 12:11:31 crc kubenswrapper[4975]: ho_enable="--enable-hybrid-overlay" Mar 18 12:11:31 crc kubenswrapper[4975]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 12:11:31 crc kubenswrapper[4975]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 12:11:31 crc kubenswrapper[4975]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 12:11:31 crc kubenswrapper[4975]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 12:11:31 crc kubenswrapper[4975]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 12:11:31 crc kubenswrapper[4975]: --webhook-host=127.0.0.1 \ Mar 18 12:11:31 crc kubenswrapper[4975]: --webhook-port=9743 \ Mar 18 12:11:31 crc kubenswrapper[4975]: ${ho_enable} \ Mar 18 12:11:31 crc kubenswrapper[4975]: --enable-interconnect \ Mar 18 12:11:31 crc kubenswrapper[4975]: --disable-approver \ Mar 18 12:11:31 crc kubenswrapper[4975]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 12:11:31 crc kubenswrapper[4975]: --wait-for-kubernetes-api=200s \ Mar 18 12:11:31 crc kubenswrapper[4975]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 12:11:31 crc kubenswrapper[4975]: --loglevel="${LOGLEVEL}" Mar 18 12:11:31 crc kubenswrapper[4975]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:31 crc 
kubenswrapper[4975]: > logger="UnhandledError" Mar 18 12:11:31 crc kubenswrapper[4975]: E0318 12:11:31.354387 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 12:11:31 crc kubenswrapper[4975]: E0318 12:11:31.356087 4975 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:31 crc kubenswrapper[4975]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 12:11:31 crc kubenswrapper[4975]: if [[ -f "/env/_master" ]]; then Mar 18 12:11:31 crc kubenswrapper[4975]: set -o allexport Mar 18 12:11:31 crc kubenswrapper[4975]: source "/env/_master" Mar 18 12:11:31 crc kubenswrapper[4975]: set +o allexport Mar 18 12:11:31 crc kubenswrapper[4975]: fi Mar 18 12:11:31 crc kubenswrapper[4975]: Mar 18 12:11:31 crc kubenswrapper[4975]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 12:11:31 crc kubenswrapper[4975]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 12:11:31 crc kubenswrapper[4975]: --disable-webhook \ Mar 18 12:11:31 crc kubenswrapper[4975]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 12:11:31 crc kubenswrapper[4975]: --loglevel="${LOGLEVEL}" Mar 18 12:11:31 crc kubenswrapper[4975]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:31 crc kubenswrapper[4975]: > logger="UnhandledError" Mar 18 12:11:31 crc kubenswrapper[4975]: E0318 12:11:31.357297 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.362143 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused"
Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.377371 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.377437 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.377454 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.377477 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.377494 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:31Z","lastTransitionTime":"2026-03-18T12:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.382033 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.391365 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.401295 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.412253 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.421780 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.433606 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.443892 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.454499 4975 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c726
41163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.464051 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.472970 4975 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.480235 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.480276 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.480312 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.480330 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.480357 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:31Z","lastTransitionTime":"2026-03-18T12:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.482008 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.490533 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.497899 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.509091 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.520457 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.582883 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.582946 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.582956 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.582974 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.582986 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:31Z","lastTransitionTime":"2026-03-18T12:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.647419 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.647501 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.647530 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:31 crc kubenswrapper[4975]: E0318 12:11:31.647670 4975 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:31 crc kubenswrapper[4975]: E0318 12:11:31.647676 4975 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:31 crc kubenswrapper[4975]: E0318 12:11:31.647726 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:33.647710385 +0000 UTC m=+79.362110974 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:31 crc kubenswrapper[4975]: E0318 12:11:31.647745 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:11:33.647736136 +0000 UTC m=+79.362136715 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:11:31 crc kubenswrapper[4975]: E0318 12:11:31.647794 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:33.647759826 +0000 UTC m=+79.362160495 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.685979 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.686016 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.686024 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.686039 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.686049 4975 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:31Z","lastTransitionTime":"2026-03-18T12:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.748942 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.749016 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:31 crc kubenswrapper[4975]: E0318 12:11:31.749162 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:31 crc kubenswrapper[4975]: E0318 12:11:31.749171 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:31 crc kubenswrapper[4975]: E0318 12:11:31.749185 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:31 crc kubenswrapper[4975]: E0318 
12:11:31.749194 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:31 crc kubenswrapper[4975]: E0318 12:11:31.749202 4975 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:31 crc kubenswrapper[4975]: E0318 12:11:31.749205 4975 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:31 crc kubenswrapper[4975]: E0318 12:11:31.749266 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:33.749248775 +0000 UTC m=+79.463649344 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:31 crc kubenswrapper[4975]: E0318 12:11:31.749287 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:33.749279766 +0000 UTC m=+79.463680345 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.788326 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.788386 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.788397 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.788414 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.788444 4975 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:31Z","lastTransitionTime":"2026-03-18T12:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.891633 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.891690 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.891704 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.891726 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.891748 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:31Z","lastTransitionTime":"2026-03-18T12:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.994306 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.994345 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.994353 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.994367 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:31 crc kubenswrapper[4975]: I0318 12:11:31.994378 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:31Z","lastTransitionTime":"2026-03-18T12:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.016152 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.016245 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:32 crc kubenswrapper[4975]: E0318 12:11:32.016273 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.016362 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:32 crc kubenswrapper[4975]: E0318 12:11:32.016478 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:32 crc kubenswrapper[4975]: E0318 12:11:32.016657 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.096606 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.096684 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.096708 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.096740 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.096762 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:32Z","lastTransitionTime":"2026-03-18T12:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.199797 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.199855 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.199890 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.199908 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.199921 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:32Z","lastTransitionTime":"2026-03-18T12:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.302930 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.302986 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.303003 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.303026 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.303043 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:32Z","lastTransitionTime":"2026-03-18T12:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.405533 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.405575 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.405585 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.405601 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.405610 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:32Z","lastTransitionTime":"2026-03-18T12:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.507903 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.508220 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.508323 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.508405 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.508496 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:32Z","lastTransitionTime":"2026-03-18T12:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.611438 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.611482 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.611491 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.611504 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.611512 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:32Z","lastTransitionTime":"2026-03-18T12:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.713340 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.713377 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.713386 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.713398 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.713406 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:32Z","lastTransitionTime":"2026-03-18T12:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.815401 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.815448 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.815460 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.815477 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.815488 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:32Z","lastTransitionTime":"2026-03-18T12:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.918338 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.918370 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.918380 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.918395 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:32 crc kubenswrapper[4975]: I0318 12:11:32.918405 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:32Z","lastTransitionTime":"2026-03-18T12:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.019992 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.020022 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.020030 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.020040 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.020050 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:33Z","lastTransitionTime":"2026-03-18T12:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.122904 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.122945 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.122954 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.122968 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.122977 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:33Z","lastTransitionTime":"2026-03-18T12:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.225533 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.225597 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.225609 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.225625 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.225635 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:33Z","lastTransitionTime":"2026-03-18T12:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.328132 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.328188 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.328205 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.328227 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.328244 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:33Z","lastTransitionTime":"2026-03-18T12:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.429849 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.429904 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.429915 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.429932 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.429942 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:33Z","lastTransitionTime":"2026-03-18T12:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.532456 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.532509 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.532521 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.532539 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.532552 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:33Z","lastTransitionTime":"2026-03-18T12:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.628401 4975 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.629178 4975 scope.go:117] "RemoveContainer" containerID="44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc" Mar 18 12:11:33 crc kubenswrapper[4975]: E0318 12:11:33.629388 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.634903 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.634935 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.634947 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.634960 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.634970 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:33Z","lastTransitionTime":"2026-03-18T12:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.668687 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.668827 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.668898 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:33 crc kubenswrapper[4975]: E0318 12:11:33.668993 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:11:37.668961242 +0000 UTC m=+83.383361821 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:11:33 crc kubenswrapper[4975]: E0318 12:11:33.669013 4975 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:33 crc kubenswrapper[4975]: E0318 12:11:33.669079 4975 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:33 crc kubenswrapper[4975]: E0318 12:11:33.669113 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:37.669091735 +0000 UTC m=+83.383492354 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:33 crc kubenswrapper[4975]: E0318 12:11:33.669132 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-18 12:11:37.669123826 +0000 UTC m=+83.383524505 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.737233 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.737296 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.737307 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.737330 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.737341 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:33Z","lastTransitionTime":"2026-03-18T12:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.769912 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.769957 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:33 crc kubenswrapper[4975]: E0318 12:11:33.770061 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:33 crc kubenswrapper[4975]: E0318 12:11:33.770071 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:33 crc kubenswrapper[4975]: E0318 12:11:33.770110 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:33 crc kubenswrapper[4975]: E0318 12:11:33.770080 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:33 crc kubenswrapper[4975]: E0318 12:11:33.770132 4975 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr 
for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:33 crc kubenswrapper[4975]: E0318 12:11:33.770132 4975 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:33 crc kubenswrapper[4975]: E0318 12:11:33.770200 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:37.770178173 +0000 UTC m=+83.484578782 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:33 crc kubenswrapper[4975]: E0318 12:11:33.770225 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:37.770213874 +0000 UTC m=+83.484614493 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.840476 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.840525 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.840534 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.840549 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.840558 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:33Z","lastTransitionTime":"2026-03-18T12:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.943680 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.943729 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.943741 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.943761 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:33 crc kubenswrapper[4975]: I0318 12:11:33.943774 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:33Z","lastTransitionTime":"2026-03-18T12:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.015942 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.015942 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:34 crc kubenswrapper[4975]: E0318 12:11:34.016113 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:34 crc kubenswrapper[4975]: E0318 12:11:34.016160 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.015953 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:34 crc kubenswrapper[4975]: E0318 12:11:34.016228 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.045816 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.045903 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.045913 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.045928 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.045937 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:34Z","lastTransitionTime":"2026-03-18T12:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.148385 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.148436 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.148448 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.148469 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.148481 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:34Z","lastTransitionTime":"2026-03-18T12:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.251021 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.251062 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.251071 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.251086 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.251097 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:34Z","lastTransitionTime":"2026-03-18T12:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.354100 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.354171 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.354187 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.354211 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.354226 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:34Z","lastTransitionTime":"2026-03-18T12:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.457440 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.457489 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.457505 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.457523 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.457535 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:34Z","lastTransitionTime":"2026-03-18T12:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.559662 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.559699 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.559707 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.559722 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.559731 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:34Z","lastTransitionTime":"2026-03-18T12:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.662276 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.662343 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.662356 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.662380 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.662397 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:34Z","lastTransitionTime":"2026-03-18T12:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.764908 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.764964 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.764980 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.764998 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.765010 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:34Z","lastTransitionTime":"2026-03-18T12:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.867272 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.867313 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.867325 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.867340 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.867354 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:34Z","lastTransitionTime":"2026-03-18T12:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.970123 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.970437 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.970534 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.970629 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:34 crc kubenswrapper[4975]: I0318 12:11:34.970707 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:34Z","lastTransitionTime":"2026-03-18T12:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.024362 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.037211 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d6
83a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.047030 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.055285 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.061992 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.064533 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.064569 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.064577 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.064592 4975 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.064600 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:35Z","lastTransitionTime":"2026-03-18T12:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.071058 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:35 crc kubenswrapper[4975]: E0318 12:11:35.072828 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.076035 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.076188 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.076463 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.076729 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.076825 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:35Z","lastTransitionTime":"2026-03-18T12:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.083374 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:35 crc kubenswrapper[4975]: E0318 12:11:35.087671 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.091177 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.091308 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.091466 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.091616 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.091757 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:35Z","lastTransitionTime":"2026-03-18T12:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.093163 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:35 crc kubenswrapper[4975]: E0318 12:11:35.100358 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.103144 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.103190 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.103202 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.103219 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.103231 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:35Z","lastTransitionTime":"2026-03-18T12:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:35 crc kubenswrapper[4975]: E0318 12:11:35.114256 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.117037 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.117085 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.117097 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.117114 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.117381 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:35Z","lastTransitionTime":"2026-03-18T12:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:35 crc kubenswrapper[4975]: E0318 12:11:35.126898 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:35 crc kubenswrapper[4975]: E0318 12:11:35.127152 4975 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.128856 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.128898 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.128910 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.128925 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.128938 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:35Z","lastTransitionTime":"2026-03-18T12:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.231278 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.231338 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.231360 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.231493 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.231523 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:35Z","lastTransitionTime":"2026-03-18T12:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.336292 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.336332 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.336341 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.336356 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.336366 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:35Z","lastTransitionTime":"2026-03-18T12:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.438650 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.438712 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.438734 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.438760 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.438778 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:35Z","lastTransitionTime":"2026-03-18T12:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.541237 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.541276 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.541288 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.541304 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.541315 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:35Z","lastTransitionTime":"2026-03-18T12:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.643297 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.643342 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.643354 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.643372 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.643386 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:35Z","lastTransitionTime":"2026-03-18T12:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.746379 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.746427 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.746441 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.746459 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.746472 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:35Z","lastTransitionTime":"2026-03-18T12:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.848888 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.848932 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.848944 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.848963 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.848975 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:35Z","lastTransitionTime":"2026-03-18T12:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.952118 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.952168 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.952180 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.952197 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:35 crc kubenswrapper[4975]: I0318 12:11:35.952209 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:35Z","lastTransitionTime":"2026-03-18T12:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.016385 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.016442 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:36 crc kubenswrapper[4975]: E0318 12:11:36.016559 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.016413 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:36 crc kubenswrapper[4975]: E0318 12:11:36.016777 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:36 crc kubenswrapper[4975]: E0318 12:11:36.017047 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.055525 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.055595 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.055620 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.055650 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.055673 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:36Z","lastTransitionTime":"2026-03-18T12:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.158568 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.158642 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.158665 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.158694 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.158715 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:36Z","lastTransitionTime":"2026-03-18T12:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.260683 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.260725 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.260738 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.260756 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.260766 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:36Z","lastTransitionTime":"2026-03-18T12:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.362460 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.362501 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.362515 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.362529 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.362541 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:36Z","lastTransitionTime":"2026-03-18T12:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.464589 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.464636 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.464651 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.464669 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.464684 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:36Z","lastTransitionTime":"2026-03-18T12:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.567319 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.567374 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.567388 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.567407 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.567423 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:36Z","lastTransitionTime":"2026-03-18T12:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.669518 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.669594 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.669608 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.669625 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.669636 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:36Z","lastTransitionTime":"2026-03-18T12:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.773634 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.773687 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.773701 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.773721 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.773736 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:36Z","lastTransitionTime":"2026-03-18T12:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.877151 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.877220 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.877414 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.877446 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.877470 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:36Z","lastTransitionTime":"2026-03-18T12:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.979600 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.979634 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.979646 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.979661 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:36 crc kubenswrapper[4975]: I0318 12:11:36.979672 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:36Z","lastTransitionTime":"2026-03-18T12:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.082134 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.082213 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.082240 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.082269 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.082290 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:37Z","lastTransitionTime":"2026-03-18T12:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.184691 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.184749 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.184779 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.184805 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.184819 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:37Z","lastTransitionTime":"2026-03-18T12:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.287973 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.288034 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.288050 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.288075 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.288093 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:37Z","lastTransitionTime":"2026-03-18T12:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.391115 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.391161 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.391173 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.391191 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.391200 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:37Z","lastTransitionTime":"2026-03-18T12:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.494185 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.494224 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.494234 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.494249 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.494260 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:37Z","lastTransitionTime":"2026-03-18T12:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.596429 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.596490 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.596514 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.596534 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.596550 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:37Z","lastTransitionTime":"2026-03-18T12:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.699264 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.699306 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.699317 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.699335 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.699346 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:37Z","lastTransitionTime":"2026-03-18T12:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.719185 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:37 crc kubenswrapper[4975]: E0318 12:11:37.719297 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 12:11:45.71927544 +0000 UTC m=+91.433676019 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.719336 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.719394 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:37 crc kubenswrapper[4975]: E0318 12:11:37.719467 4975 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:37 crc kubenswrapper[4975]: E0318 12:11:37.719508 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-18 12:11:45.719498665 +0000 UTC m=+91.433899244 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:37 crc kubenswrapper[4975]: E0318 12:11:37.719536 4975 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:37 crc kubenswrapper[4975]: E0318 12:11:37.719603 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:45.719582127 +0000 UTC m=+91.433982726 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.802081 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.802110 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.802118 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.802130 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.802139 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:37Z","lastTransitionTime":"2026-03-18T12:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.819908 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.819959 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:37 crc kubenswrapper[4975]: E0318 12:11:37.820068 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:37 crc kubenswrapper[4975]: E0318 12:11:37.820083 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:37 crc kubenswrapper[4975]: E0318 12:11:37.820095 4975 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:37 crc kubenswrapper[4975]: E0318 12:11:37.820137 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:45.820124822 +0000 UTC m=+91.534525401 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:37 crc kubenswrapper[4975]: E0318 12:11:37.820182 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:37 crc kubenswrapper[4975]: E0318 12:11:37.820190 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:37 crc kubenswrapper[4975]: E0318 12:11:37.820196 4975 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:37 crc kubenswrapper[4975]: E0318 12:11:37.820214 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 12:11:45.820208164 +0000 UTC m=+91.534608743 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.905138 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.905180 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.905190 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.905212 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:37 crc kubenswrapper[4975]: I0318 12:11:37.905234 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:37Z","lastTransitionTime":"2026-03-18T12:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.007429 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.007462 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.007470 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.007483 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.007491 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:38Z","lastTransitionTime":"2026-03-18T12:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.015990 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.016039 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.016064 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:38 crc kubenswrapper[4975]: E0318 12:11:38.016297 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:38 crc kubenswrapper[4975]: E0318 12:11:38.016386 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:38 crc kubenswrapper[4975]: E0318 12:11:38.016478 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.029563 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.109324 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.109371 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.109381 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.109398 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.109409 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:38Z","lastTransitionTime":"2026-03-18T12:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.211228 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.211263 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.211275 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.211291 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.211304 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:38Z","lastTransitionTime":"2026-03-18T12:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.313654 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.313712 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.313726 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.313742 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.313751 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:38Z","lastTransitionTime":"2026-03-18T12:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.416261 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.416313 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.416323 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.416336 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.416363 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:38Z","lastTransitionTime":"2026-03-18T12:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.518772 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.518848 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.518883 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.518901 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.518913 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:38Z","lastTransitionTime":"2026-03-18T12:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.622498 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.622575 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.622600 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.622633 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.622657 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:38Z","lastTransitionTime":"2026-03-18T12:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.725324 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.725386 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.725399 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.725419 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.725430 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:38Z","lastTransitionTime":"2026-03-18T12:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.827173 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.827217 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.827232 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.827274 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.827291 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:38Z","lastTransitionTime":"2026-03-18T12:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.929555 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.929592 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.929605 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.929620 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:38 crc kubenswrapper[4975]: I0318 12:11:38.929630 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:38Z","lastTransitionTime":"2026-03-18T12:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.031334 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.031384 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.031399 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.031421 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.031436 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:39Z","lastTransitionTime":"2026-03-18T12:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.133623 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.133659 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.133669 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.133685 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.133695 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:39Z","lastTransitionTime":"2026-03-18T12:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.236087 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.236146 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.236158 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.236175 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.236188 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:39Z","lastTransitionTime":"2026-03-18T12:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.338612 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.338670 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.338682 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.338703 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.338718 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:39Z","lastTransitionTime":"2026-03-18T12:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.441560 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.441645 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.441658 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.441676 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.441687 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:39Z","lastTransitionTime":"2026-03-18T12:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.545202 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.545283 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.545295 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.545314 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.545326 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:39Z","lastTransitionTime":"2026-03-18T12:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.647967 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.648013 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.648026 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.648046 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.648064 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:39Z","lastTransitionTime":"2026-03-18T12:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.750519 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.750556 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.750569 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.750585 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.750596 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:39Z","lastTransitionTime":"2026-03-18T12:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.852701 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.852737 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.852748 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.852765 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.852776 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:39Z","lastTransitionTime":"2026-03-18T12:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.955427 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.955478 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.955487 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.955501 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:39 crc kubenswrapper[4975]: I0318 12:11:39.955509 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:39Z","lastTransitionTime":"2026-03-18T12:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.015715 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.015817 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:40 crc kubenswrapper[4975]: E0318 12:11:40.015843 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.015719 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:40 crc kubenswrapper[4975]: E0318 12:11:40.016071 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:40 crc kubenswrapper[4975]: E0318 12:11:40.016207 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.058722 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.058764 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.058776 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.058794 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.058807 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:40Z","lastTransitionTime":"2026-03-18T12:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.161333 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.161363 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.161374 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.161390 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.161400 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:40Z","lastTransitionTime":"2026-03-18T12:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.264730 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.264814 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.264838 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.264907 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.264931 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:40Z","lastTransitionTime":"2026-03-18T12:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.366986 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.367078 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.367096 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.367138 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.367149 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:40Z","lastTransitionTime":"2026-03-18T12:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.470189 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.470859 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.471007 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.471149 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.471250 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:40Z","lastTransitionTime":"2026-03-18T12:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.574724 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.574777 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.574789 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.574805 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.575189 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:40Z","lastTransitionTime":"2026-03-18T12:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.678118 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.678153 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.678162 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.678175 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.678185 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:40Z","lastTransitionTime":"2026-03-18T12:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.780521 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.780822 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.780927 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.781048 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.781134 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:40Z","lastTransitionTime":"2026-03-18T12:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.883904 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.884195 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.884267 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.884331 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.884395 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:40Z","lastTransitionTime":"2026-03-18T12:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.987359 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.987395 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.987403 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.987419 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:40 crc kubenswrapper[4975]: I0318 12:11:40.987430 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:40Z","lastTransitionTime":"2026-03-18T12:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.090112 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.090152 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.090161 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.090198 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.090209 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:41Z","lastTransitionTime":"2026-03-18T12:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.193114 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.193387 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.193685 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.194304 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.194581 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:41Z","lastTransitionTime":"2026-03-18T12:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.297402 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.297733 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.297853 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.297979 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.298075 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:41Z","lastTransitionTime":"2026-03-18T12:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.400426 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.400476 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.400486 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.400501 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.400511 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:41Z","lastTransitionTime":"2026-03-18T12:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.503455 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.503485 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.503493 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.503506 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.503514 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:41Z","lastTransitionTime":"2026-03-18T12:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.606395 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.606672 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.606771 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.606903 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.607017 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:41Z","lastTransitionTime":"2026-03-18T12:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.710088 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.710156 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.710180 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.710211 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.710231 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:41Z","lastTransitionTime":"2026-03-18T12:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.812623 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.812669 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.812680 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.812696 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.812707 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:41Z","lastTransitionTime":"2026-03-18T12:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.915403 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.915437 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.915445 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.915459 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:41 crc kubenswrapper[4975]: I0318 12:11:41.915467 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:41Z","lastTransitionTime":"2026-03-18T12:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.016434 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.016532 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:42 crc kubenswrapper[4975]: E0318 12:11:42.016671 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.016980 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:42 crc kubenswrapper[4975]: E0318 12:11:42.017059 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:42 crc kubenswrapper[4975]: E0318 12:11:42.017124 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.018782 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.018924 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.018939 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.018954 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.018988 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:42Z","lastTransitionTime":"2026-03-18T12:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:42 crc kubenswrapper[4975]: E0318 12:11:42.019054 4975 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:42 crc kubenswrapper[4975]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 12:11:42 crc kubenswrapper[4975]: if [[ -f "/env/_master" ]]; then Mar 18 12:11:42 crc kubenswrapper[4975]: set -o allexport Mar 18 12:11:42 crc kubenswrapper[4975]: source "/env/_master" Mar 18 12:11:42 crc kubenswrapper[4975]: set +o allexport Mar 18 12:11:42 crc kubenswrapper[4975]: fi Mar 18 12:11:42 crc kubenswrapper[4975]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 18 12:11:42 crc kubenswrapper[4975]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 18 12:11:42 crc kubenswrapper[4975]: ho_enable="--enable-hybrid-overlay" Mar 18 12:11:42 crc kubenswrapper[4975]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 18 12:11:42 crc kubenswrapper[4975]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 18 12:11:42 crc kubenswrapper[4975]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 18 12:11:42 crc kubenswrapper[4975]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 12:11:42 crc kubenswrapper[4975]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 18 12:11:42 crc kubenswrapper[4975]: --webhook-host=127.0.0.1 \ Mar 18 12:11:42 crc kubenswrapper[4975]: --webhook-port=9743 \ Mar 18 12:11:42 crc kubenswrapper[4975]: ${ho_enable} \ Mar 18 12:11:42 crc kubenswrapper[4975]: --enable-interconnect \ Mar 18 12:11:42 crc kubenswrapper[4975]: --disable-approver \ Mar 18 12:11:42 crc kubenswrapper[4975]: 
--extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 18 12:11:42 crc kubenswrapper[4975]: --wait-for-kubernetes-api=200s \ Mar 18 12:11:42 crc kubenswrapper[4975]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 18 12:11:42 crc kubenswrapper[4975]: --loglevel="${LOGLEVEL}" Mar 18 12:11:42 crc kubenswrapper[4975]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false
,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:42 crc kubenswrapper[4975]: > logger="UnhandledError" Mar 18 12:11:42 crc kubenswrapper[4975]: E0318 12:11:42.021327 4975 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:42 crc kubenswrapper[4975]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 18 12:11:42 crc kubenswrapper[4975]: if [[ -f "/env/_master" ]]; then Mar 18 12:11:42 crc kubenswrapper[4975]: set -o allexport Mar 18 12:11:42 crc kubenswrapper[4975]: source "/env/_master" Mar 18 12:11:42 crc kubenswrapper[4975]: set +o allexport Mar 18 12:11:42 crc kubenswrapper[4975]: fi Mar 18 12:11:42 crc kubenswrapper[4975]: Mar 18 12:11:42 crc kubenswrapper[4975]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 18 12:11:42 crc kubenswrapper[4975]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 18 12:11:42 crc kubenswrapper[4975]: --disable-webhook \ Mar 18 12:11:42 crc kubenswrapper[4975]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 18 12:11:42 crc kubenswrapper[4975]: --loglevel="${LOGLEVEL}" Mar 18 12:11:42 crc kubenswrapper[4975]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:42 crc kubenswrapper[4975]: > logger="UnhandledError" Mar 18 12:11:42 crc kubenswrapper[4975]: E0318 12:11:42.022615 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.121733 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.122032 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.122104 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.122169 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.122257 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:42Z","lastTransitionTime":"2026-03-18T12:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.224272 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.224305 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.224314 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.224326 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.224336 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:42Z","lastTransitionTime":"2026-03-18T12:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.326668 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.326725 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.326737 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.326764 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.326779 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:42Z","lastTransitionTime":"2026-03-18T12:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.430450 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.430511 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.430529 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.430553 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.430569 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:42Z","lastTransitionTime":"2026-03-18T12:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.532518 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.532549 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.532560 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.532574 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.532582 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:42Z","lastTransitionTime":"2026-03-18T12:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.634659 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.634701 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.634714 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.634731 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.634745 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:42Z","lastTransitionTime":"2026-03-18T12:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.737457 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.737540 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.737565 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.737594 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.737616 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:42Z","lastTransitionTime":"2026-03-18T12:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.840113 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.840173 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.840182 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.840198 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.840213 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:42Z","lastTransitionTime":"2026-03-18T12:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.942283 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.942344 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.942359 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.942376 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:42 crc kubenswrapper[4975]: I0318 12:11:42.942387 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:42Z","lastTransitionTime":"2026-03-18T12:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.045013 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.045054 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.045065 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.045080 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.045092 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:43Z","lastTransitionTime":"2026-03-18T12:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.147721 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.147755 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.147766 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.147785 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.147796 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:43Z","lastTransitionTime":"2026-03-18T12:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.250403 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.250442 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.250454 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.250469 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.250480 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:43Z","lastTransitionTime":"2026-03-18T12:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.353322 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.353364 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.353379 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.353398 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.353412 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:43Z","lastTransitionTime":"2026-03-18T12:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.455950 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.455988 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.455997 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.456011 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.456020 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:43Z","lastTransitionTime":"2026-03-18T12:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.558351 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.558393 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.558401 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.558416 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.558426 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:43Z","lastTransitionTime":"2026-03-18T12:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.661115 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.661175 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.661187 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.661202 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.661213 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:43Z","lastTransitionTime":"2026-03-18T12:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.764455 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.764540 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.764555 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.764578 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.764595 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:43Z","lastTransitionTime":"2026-03-18T12:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.868200 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.868284 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.868309 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.868342 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.868364 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:43Z","lastTransitionTime":"2026-03-18T12:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.970850 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.970919 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.970931 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.970950 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:43 crc kubenswrapper[4975]: I0318 12:11:43.970960 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:43Z","lastTransitionTime":"2026-03-18T12:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.015646 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.015859 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.015930 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:44 crc kubenswrapper[4975]: E0318 12:11:44.016023 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:44 crc kubenswrapper[4975]: E0318 12:11:44.016188 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:44 crc kubenswrapper[4975]: E0318 12:11:44.016341 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:44 crc kubenswrapper[4975]: E0318 12:11:44.017643 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]V
olumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 18 12:11:44 crc kubenswrapper[4975]: E0318 12:11:44.017807 4975 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:11:44 crc kubenswrapper[4975]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 18 12:11:44 crc kubenswrapper[4975]: set -o allexport Mar 18 12:11:44 crc kubenswrapper[4975]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 18 12:11:44 crc kubenswrapper[4975]: source /etc/kubernetes/apiserver-url.env Mar 18 12:11:44 crc kubenswrapper[4975]: else Mar 18 12:11:44 crc kubenswrapper[4975]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 18 12:11:44 crc kubenswrapper[4975]: exit 1 Mar 18 12:11:44 crc kubenswrapper[4975]: fi Mar 18 12:11:44 crc kubenswrapper[4975]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 18 12:11:44 crc kubenswrapper[4975]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 18 12:11:44 crc kubenswrapper[4975]: > logger="UnhandledError" Mar 18 12:11:44 crc kubenswrapper[4975]: E0318 12:11:44.018891 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 18 12:11:44 crc kubenswrapper[4975]: E0318 12:11:44.018947 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet 
been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.073670 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.073728 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.073740 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.073755 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.073765 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:44Z","lastTransitionTime":"2026-03-18T12:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.176481 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.176524 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.176532 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.176548 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.176561 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:44Z","lastTransitionTime":"2026-03-18T12:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.281909 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.282263 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.282296 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.282317 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.282335 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:44Z","lastTransitionTime":"2026-03-18T12:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.384574 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.384634 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.384654 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.384681 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.384702 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:44Z","lastTransitionTime":"2026-03-18T12:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.487499 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.487540 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.487549 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.487564 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.487575 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:44Z","lastTransitionTime":"2026-03-18T12:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.590372 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.590435 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.590456 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.590483 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.590505 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:44Z","lastTransitionTime":"2026-03-18T12:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.692211 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.692241 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.692250 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.692263 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.692273 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:44Z","lastTransitionTime":"2026-03-18T12:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.794694 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.794768 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.794782 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.794824 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.794834 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:44Z","lastTransitionTime":"2026-03-18T12:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.897571 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.897607 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.897617 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.897907 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:44 crc kubenswrapper[4975]: I0318 12:11:44.897926 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:44Z","lastTransitionTime":"2026-03-18T12:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.000970 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.001005 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.001016 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.001031 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.001041 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:45Z","lastTransitionTime":"2026-03-18T12:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.024968 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.041787 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.051793 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.060786 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.071938 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.082090 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.091376 4975 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.100657 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.102956 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.102982 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.102992 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.103006 4975 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.103018 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:45Z","lastTransitionTime":"2026-03-18T12:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.110844 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.196492 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.196547 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.196557 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.196573 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.196585 4975 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:45Z","lastTransitionTime":"2026-03-18T12:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:45 crc kubenswrapper[4975]: E0318 12:11:45.210217 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.214080 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.214119 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.214131 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.214148 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.214160 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:45Z","lastTransitionTime":"2026-03-18T12:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:45 crc kubenswrapper[4975]: E0318 12:11:45.225000 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.228553 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.228592 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.228603 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.228618 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.228630 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:45Z","lastTransitionTime":"2026-03-18T12:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:45 crc kubenswrapper[4975]: E0318 12:11:45.240074 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.243227 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.243261 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.243271 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.243286 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.243297 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:45Z","lastTransitionTime":"2026-03-18T12:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:45 crc kubenswrapper[4975]: E0318 12:11:45.251537 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.255061 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.255125 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.255137 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.255153 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.255163 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:45Z","lastTransitionTime":"2026-03-18T12:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:45 crc kubenswrapper[4975]: E0318 12:11:45.264332 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:45 crc kubenswrapper[4975]: E0318 12:11:45.264446 4975 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.265672 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.265701 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.265711 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.265752 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.265760 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:45Z","lastTransitionTime":"2026-03-18T12:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.368623 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.368666 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.368678 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.368695 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.368707 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:45Z","lastTransitionTime":"2026-03-18T12:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.471063 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.471358 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.471467 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.471565 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.471668 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:45Z","lastTransitionTime":"2026-03-18T12:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.573798 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.573842 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.573853 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.573884 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.573896 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:45Z","lastTransitionTime":"2026-03-18T12:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.675737 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.675774 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.675785 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.675799 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.675807 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:45Z","lastTransitionTime":"2026-03-18T12:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.777889 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.777931 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.777942 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.777957 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.777974 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:45Z","lastTransitionTime":"2026-03-18T12:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.792145 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.792197 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.792239 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:45 crc kubenswrapper[4975]: E0318 12:11:45.792313 4975 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:45 crc kubenswrapper[4975]: E0318 12:11:45.792354 4975 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:45 crc kubenswrapper[4975]: E0318 12:11:45.792365 4975 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:01.792329428 +0000 UTC m=+107.506730027 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:11:45 crc kubenswrapper[4975]: E0318 12:11:45.792414 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:12:01.792396589 +0000 UTC m=+107.506797178 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:11:45 crc kubenswrapper[4975]: E0318 12:11:45.792436 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:12:01.79242801 +0000 UTC m=+107.506828689 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.879858 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.879923 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.879932 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.879952 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.879982 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:45Z","lastTransitionTime":"2026-03-18T12:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.893242 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.893283 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:45 crc kubenswrapper[4975]: E0318 12:11:45.893400 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:45 crc kubenswrapper[4975]: E0318 12:11:45.893416 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:45 crc kubenswrapper[4975]: E0318 12:11:45.893427 4975 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:45 crc kubenswrapper[4975]: E0318 12:11:45.893423 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:11:45 crc 
kubenswrapper[4975]: E0318 12:11:45.893469 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 12:12:01.893454287 +0000 UTC m=+107.607854866 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:45 crc kubenswrapper[4975]: E0318 12:11:45.893470 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:11:45 crc kubenswrapper[4975]: E0318 12:11:45.893488 4975 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:45 crc kubenswrapper[4975]: E0318 12:11:45.893555 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 12:12:01.893532849 +0000 UTC m=+107.607933528 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.981827 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.981860 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.981892 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.981911 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:45 crc kubenswrapper[4975]: I0318 12:11:45.981920 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:45Z","lastTransitionTime":"2026-03-18T12:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.016344 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:46 crc kubenswrapper[4975]: E0318 12:11:46.016461 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.016803 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:46 crc kubenswrapper[4975]: E0318 12:11:46.016890 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.016926 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:46 crc kubenswrapper[4975]: E0318 12:11:46.016966 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.084597 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.084638 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.084649 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.084665 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.084675 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:46Z","lastTransitionTime":"2026-03-18T12:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.186341 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.186370 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.186378 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.186390 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.186399 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:46Z","lastTransitionTime":"2026-03-18T12:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.289183 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.289240 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.289252 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.289267 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.289277 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:46Z","lastTransitionTime":"2026-03-18T12:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.391113 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.391148 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.391159 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.391173 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.391183 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:46Z","lastTransitionTime":"2026-03-18T12:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.493415 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.493460 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.493471 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.493488 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.493501 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:46Z","lastTransitionTime":"2026-03-18T12:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.595152 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.595201 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.595216 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.595240 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.595254 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:46Z","lastTransitionTime":"2026-03-18T12:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.697929 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.697973 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.697982 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.697996 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.698005 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:46Z","lastTransitionTime":"2026-03-18T12:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.799902 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.799937 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.799950 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.799966 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.799977 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:46Z","lastTransitionTime":"2026-03-18T12:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.905196 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.905237 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.905249 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.905266 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:46 crc kubenswrapper[4975]: I0318 12:11:46.905277 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:46Z","lastTransitionTime":"2026-03-18T12:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.007951 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.007984 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.007993 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.008007 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.008017 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:47Z","lastTransitionTime":"2026-03-18T12:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.109621 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.109665 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.109677 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.109694 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.109704 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:47Z","lastTransitionTime":"2026-03-18T12:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.211794 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.211857 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.211911 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.211931 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.211945 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:47Z","lastTransitionTime":"2026-03-18T12:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.313692 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.313740 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.313752 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.313767 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.313778 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:47Z","lastTransitionTime":"2026-03-18T12:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.416420 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.416464 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.416474 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.416489 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.416499 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:47Z","lastTransitionTime":"2026-03-18T12:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.519063 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.519119 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.519133 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.519153 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.519172 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:47Z","lastTransitionTime":"2026-03-18T12:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.622553 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.622593 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.622606 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.622624 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.622637 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:47Z","lastTransitionTime":"2026-03-18T12:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.725451 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.725507 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.725524 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.725544 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.725571 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:47Z","lastTransitionTime":"2026-03-18T12:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.827740 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.827810 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.827824 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.827844 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.827859 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:47Z","lastTransitionTime":"2026-03-18T12:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.930969 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.931036 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.931053 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.931080 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:47 crc kubenswrapper[4975]: I0318 12:11:47.931096 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:47Z","lastTransitionTime":"2026-03-18T12:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.015903 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.015967 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.015999 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:48 crc kubenswrapper[4975]: E0318 12:11:48.016079 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:48 crc kubenswrapper[4975]: E0318 12:11:48.016134 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:48 crc kubenswrapper[4975]: E0318 12:11:48.016512 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.016632 4975 scope.go:117] "RemoveContainer" containerID="44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc" Mar 18 12:11:48 crc kubenswrapper[4975]: E0318 12:11:48.016788 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.033573 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.033609 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.033618 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.033631 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.033641 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:48Z","lastTransitionTime":"2026-03-18T12:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.136226 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.136308 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.136317 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.136332 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.136344 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:48Z","lastTransitionTime":"2026-03-18T12:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.238626 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.238677 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.238689 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.238707 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.238722 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:48Z","lastTransitionTime":"2026-03-18T12:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.341293 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.341331 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.341343 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.341359 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.341370 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:48Z","lastTransitionTime":"2026-03-18T12:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.442900 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.442940 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.442949 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.442964 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.442973 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:48Z","lastTransitionTime":"2026-03-18T12:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.545832 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.545884 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.545896 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.545917 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.545930 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:48Z","lastTransitionTime":"2026-03-18T12:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.648824 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.648885 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.648899 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.648916 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.648927 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:48Z","lastTransitionTime":"2026-03-18T12:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.750659 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.750703 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.750713 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.750738 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.750750 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:48Z","lastTransitionTime":"2026-03-18T12:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.841083 4975 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.853209 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.853240 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.853247 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.853260 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.853269 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:48Z","lastTransitionTime":"2026-03-18T12:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.955825 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.955890 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.955914 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.955930 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:48 crc kubenswrapper[4975]: I0318 12:11:48.955941 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:48Z","lastTransitionTime":"2026-03-18T12:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.057727 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.057782 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.057791 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.057803 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.057813 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:49Z","lastTransitionTime":"2026-03-18T12:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.160273 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.160311 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.160320 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.160334 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.160344 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:49Z","lastTransitionTime":"2026-03-18T12:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.262686 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.262723 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.262734 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.262748 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.262758 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:49Z","lastTransitionTime":"2026-03-18T12:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.364565 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.364608 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.364622 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.364638 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.364649 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:49Z","lastTransitionTime":"2026-03-18T12:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.467139 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.467194 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.467207 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.467227 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.467244 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:49Z","lastTransitionTime":"2026-03-18T12:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.569938 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.569996 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.570013 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.570037 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.570072 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:49Z","lastTransitionTime":"2026-03-18T12:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.673013 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.673076 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.673093 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.673115 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.673132 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:49Z","lastTransitionTime":"2026-03-18T12:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.775838 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.775938 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.775955 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.775977 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.775995 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:49Z","lastTransitionTime":"2026-03-18T12:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.878123 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.878164 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.878175 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.878190 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.878201 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:49Z","lastTransitionTime":"2026-03-18T12:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.980800 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.980858 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.980934 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.980964 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:49 crc kubenswrapper[4975]: I0318 12:11:49.980985 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:49Z","lastTransitionTime":"2026-03-18T12:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.015709 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:50 crc kubenswrapper[4975]: E0318 12:11:50.015971 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.016061 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:50 crc kubenswrapper[4975]: E0318 12:11:50.016257 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.016359 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:50 crc kubenswrapper[4975]: E0318 12:11:50.016508 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.083699 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.083749 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.083767 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.083794 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.083824 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:50Z","lastTransitionTime":"2026-03-18T12:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.186031 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.186068 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.186079 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.186094 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.186106 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:50Z","lastTransitionTime":"2026-03-18T12:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.288923 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.288980 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.288995 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.289014 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.289028 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:50Z","lastTransitionTime":"2026-03-18T12:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.391707 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.391782 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.391800 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.391823 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.391844 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:50Z","lastTransitionTime":"2026-03-18T12:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.400250 4975 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.493900 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.493953 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.493963 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.493978 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.493986 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:50Z","lastTransitionTime":"2026-03-18T12:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.596154 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.596185 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.596193 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.596206 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.596214 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:50Z","lastTransitionTime":"2026-03-18T12:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.698824 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.698887 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.698899 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.698915 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.698926 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:50Z","lastTransitionTime":"2026-03-18T12:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.715531 4975 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.801560 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.801598 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.801606 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.801621 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.801630 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:50Z","lastTransitionTime":"2026-03-18T12:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.903863 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.903908 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.903917 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.903932 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:50 crc kubenswrapper[4975]: I0318 12:11:50.903941 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:50Z","lastTransitionTime":"2026-03-18T12:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.007598 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.007645 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.007656 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.007674 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.007685 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:51Z","lastTransitionTime":"2026-03-18T12:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.110451 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.110488 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.110499 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.110513 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.110522 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:51Z","lastTransitionTime":"2026-03-18T12:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.213752 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.213810 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.213827 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.213849 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.213872 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:51Z","lastTransitionTime":"2026-03-18T12:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.316556 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.316604 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.316616 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.316631 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.316642 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:51Z","lastTransitionTime":"2026-03-18T12:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.419000 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.419060 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.419072 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.419091 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.419102 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:51Z","lastTransitionTime":"2026-03-18T12:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.521525 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.521571 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.521581 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.521596 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.521605 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:51Z","lastTransitionTime":"2026-03-18T12:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.624138 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.624174 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.624183 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.624197 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.624206 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:51Z","lastTransitionTime":"2026-03-18T12:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.726698 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.726753 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.726771 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.726791 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.726801 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:51Z","lastTransitionTime":"2026-03-18T12:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.829459 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.829513 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.829526 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.829545 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.829557 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:51Z","lastTransitionTime":"2026-03-18T12:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.932224 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.932260 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.932271 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.932288 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:51 crc kubenswrapper[4975]: I0318 12:11:51.932299 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:51Z","lastTransitionTime":"2026-03-18T12:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.016100 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.016100 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:52 crc kubenswrapper[4975]: E0318 12:11:52.016260 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.016121 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:52 crc kubenswrapper[4975]: E0318 12:11:52.016378 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:52 crc kubenswrapper[4975]: E0318 12:11:52.016630 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.035242 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.035284 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.035297 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.035314 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.035326 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:52Z","lastTransitionTime":"2026-03-18T12:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.137895 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.137956 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.137974 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.138000 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.138017 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:52Z","lastTransitionTime":"2026-03-18T12:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.240131 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.240167 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.240180 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.240195 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.240206 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:52Z","lastTransitionTime":"2026-03-18T12:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.343292 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.343345 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.343358 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.343378 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.343392 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:52Z","lastTransitionTime":"2026-03-18T12:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.446040 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.446208 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.446224 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.446244 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.446257 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:52Z","lastTransitionTime":"2026-03-18T12:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.549041 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.549111 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.549133 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.549159 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.549178 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:52Z","lastTransitionTime":"2026-03-18T12:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.651957 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.652003 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.652014 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.652030 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.652040 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:52Z","lastTransitionTime":"2026-03-18T12:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.754317 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.754355 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.754363 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.754377 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.754387 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:52Z","lastTransitionTime":"2026-03-18T12:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.856110 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.856158 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.856169 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.856188 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.856234 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:52Z","lastTransitionTime":"2026-03-18T12:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.958949 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.958995 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.959006 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.959023 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:52 crc kubenswrapper[4975]: I0318 12:11:52.959035 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:52Z","lastTransitionTime":"2026-03-18T12:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.061942 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.062013 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.062037 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.062067 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.062087 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:53Z","lastTransitionTime":"2026-03-18T12:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.164954 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.165011 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.165041 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.165064 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.165081 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:53Z","lastTransitionTime":"2026-03-18T12:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.268021 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.268068 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.268085 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.268107 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.268122 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:53Z","lastTransitionTime":"2026-03-18T12:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.370660 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.370696 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.370707 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.370724 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.370736 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:53Z","lastTransitionTime":"2026-03-18T12:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.472576 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.472641 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.472664 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.472691 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.472709 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:53Z","lastTransitionTime":"2026-03-18T12:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.575262 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.575307 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.575318 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.575336 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.575346 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:53Z","lastTransitionTime":"2026-03-18T12:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.678247 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.678285 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.678298 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.678315 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.678326 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:53Z","lastTransitionTime":"2026-03-18T12:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.781762 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.781832 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.781851 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.781913 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.781936 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:53Z","lastTransitionTime":"2026-03-18T12:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.884229 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.884278 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.884290 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.884306 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.884317 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:53Z","lastTransitionTime":"2026-03-18T12:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.987229 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.987274 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.987288 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.987307 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:53 crc kubenswrapper[4975]: I0318 12:11:53.987318 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:53Z","lastTransitionTime":"2026-03-18T12:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.015951 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.016014 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:54 crc kubenswrapper[4975]: E0318 12:11:54.016090 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.016152 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:54 crc kubenswrapper[4975]: E0318 12:11:54.016292 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:54 crc kubenswrapper[4975]: E0318 12:11:54.016380 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.089445 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.089486 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.089496 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.089511 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.089525 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:54Z","lastTransitionTime":"2026-03-18T12:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.191528 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.191561 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.191590 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.191603 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.191613 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:54Z","lastTransitionTime":"2026-03-18T12:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.294132 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.294199 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.294209 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.294222 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.294231 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:54Z","lastTransitionTime":"2026-03-18T12:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.396642 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.396708 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.396731 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.396769 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.396793 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:54Z","lastTransitionTime":"2026-03-18T12:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.501188 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.501227 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.501238 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.501255 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.501265 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:54Z","lastTransitionTime":"2026-03-18T12:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.603902 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.603991 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.604003 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.604021 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.604033 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:54Z","lastTransitionTime":"2026-03-18T12:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.706843 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.706903 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.706919 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.706942 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.706953 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:54Z","lastTransitionTime":"2026-03-18T12:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.809771 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.809803 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.809812 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.809825 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.809834 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:54Z","lastTransitionTime":"2026-03-18T12:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.842999 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-5kbr4"] Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.843689 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-5kbr4" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.845254 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.845399 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.846102 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.856966 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.875358 4975 
status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf
2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.884199 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.891735 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.900099 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.908047 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.911705 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.911751 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.911762 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.911778 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.911789 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:54Z","lastTransitionTime":"2026-03-18T12:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.915924 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.924273 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.932797 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.940719 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.973424 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae-hosts-file\") pod \"node-resolver-5kbr4\" (UID: \"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\") " pod="openshift-dns/node-resolver-5kbr4" Mar 18 12:11:54 crc kubenswrapper[4975]: I0318 12:11:54.973609 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrc82\" (UniqueName: \"kubernetes.io/projected/06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae-kube-api-access-hrc82\") pod 
\"node-resolver-5kbr4\" (UID: \"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\") " pod="openshift-dns/node-resolver-5kbr4" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.014109 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.014147 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.014158 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.014174 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.014183 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:55Z","lastTransitionTime":"2026-03-18T12:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.032510 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.050565 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.062103 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.073325 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.074633 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae-hosts-file\") pod \"node-resolver-5kbr4\" (UID: \"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\") " pod="openshift-dns/node-resolver-5kbr4" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.074707 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrc82\" (UniqueName: \"kubernetes.io/projected/06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae-kube-api-access-hrc82\") pod \"node-resolver-5kbr4\" (UID: \"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\") " pod="openshift-dns/node-resolver-5kbr4" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.074811 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae-hosts-file\") pod \"node-resolver-5kbr4\" (UID: \"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\") " pod="openshift-dns/node-resolver-5kbr4" Mar 18 12:11:55 crc kubenswrapper[4975]: 
I0318 12:11:55.084525 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.093051 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.095753 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrc82\" (UniqueName: \"kubernetes.io/projected/06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae-kube-api-access-hrc82\") pod \"node-resolver-5kbr4\" (UID: \"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\") " pod="openshift-dns/node-resolver-5kbr4" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.101559 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.116572 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.116625 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.116636 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.116654 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.116668 4975 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:55Z","lastTransitionTime":"2026-03-18T12:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.120735 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.132663 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers 
with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.153133 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.158352 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-5kbr4" Mar 18 12:11:55 crc kubenswrapper[4975]: W0318 12:11:55.172536 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06dd5a0b_b3ef_4f11_83f7_7e48c2a119ae.slice/crio-f87257fff7cdb2bdc7ee489676e059e51b92096fea7c2ebd10df53b350645f56 WatchSource:0}: Error finding container f87257fff7cdb2bdc7ee489676e059e51b92096fea7c2ebd10df53b350645f56: Status 404 returned error can't find the container with id f87257fff7cdb2bdc7ee489676e059e51b92096fea7c2ebd10df53b350645f56 Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.209665 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-kvdzt"] Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.210243 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-n9j7f"] Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.210405 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.210449 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.212902 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.213071 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.213108 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.213302 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-vwgkw"] Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.213484 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.213787 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.213796 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.214561 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.214758 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.215209 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.215225 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.215405 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.215496 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.215682 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.219119 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.219225 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.219316 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.219396 4975 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.219473 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:55Z","lastTransitionTime":"2026-03-18T12:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.222252 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.232491 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.243821 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.263625 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\
",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.276235 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/59dd8f35-75c5-42d7-b11a-06586d1d5a1b-mcd-auth-proxy-config\") pod \"machine-config-daemon-kvdzt\" (UID: \"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.276317 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-cnibin\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.276343 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/60e8a8fd-753d-433d-acf1-fa78d1cc2184-system-cni-dir\") pod \"multus-additional-cni-plugins-vwgkw\" (UID: \"60e8a8fd-753d-433d-acf1-fa78d1cc2184\") " pod="openshift-multus/multus-additional-cni-plugins-vwgkw" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.276388 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-system-cni-dir\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.276414 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-host-run-multus-certs\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.276433 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-host-var-lib-cni-multus\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.276511 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/60e8a8fd-753d-433d-acf1-fa78d1cc2184-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vwgkw\" (UID: \"60e8a8fd-753d-433d-acf1-fa78d1cc2184\") " pod="openshift-multus/multus-additional-cni-plugins-vwgkw" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.276547 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59dd8f35-75c5-42d7-b11a-06586d1d5a1b-proxy-tls\") pod \"machine-config-daemon-kvdzt\" (UID: \"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.276597 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-hostroot\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.276622 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/60e8a8fd-753d-433d-acf1-fa78d1cc2184-cnibin\") pod \"multus-additional-cni-plugins-vwgkw\" (UID: \"60e8a8fd-753d-433d-acf1-fa78d1cc2184\") " pod="openshift-multus/multus-additional-cni-plugins-vwgkw" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.276671 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/59dd8f35-75c5-42d7-b11a-06586d1d5a1b-rootfs\") pod \"machine-config-daemon-kvdzt\" (UID: \"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.276694 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-host-var-lib-kubelet\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.276795 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzgnr\" (UniqueName: \"kubernetes.io/projected/add6c8de-77cd-42e7-bf06-d2333b9392ea-kube-api-access-zzgnr\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.276820 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/60e8a8fd-753d-433d-acf1-fa78d1cc2184-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vwgkw\" (UID: \"60e8a8fd-753d-433d-acf1-fa78d1cc2184\") " pod="openshift-multus/multus-additional-cni-plugins-vwgkw" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.276840 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-multus-cni-dir\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.276865 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntpvc\" (UniqueName: \"kubernetes.io/projected/60e8a8fd-753d-433d-acf1-fa78d1cc2184-kube-api-access-ntpvc\") pod \"multus-additional-cni-plugins-vwgkw\" (UID: \"60e8a8fd-753d-433d-acf1-fa78d1cc2184\") " pod="openshift-multus/multus-additional-cni-plugins-vwgkw" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.276901 4975 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/add6c8de-77cd-42e7-bf06-d2333b9392ea-cni-binary-copy\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.276947 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-host-run-k8s-cni-cncf-io\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.276976 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/60e8a8fd-753d-433d-acf1-fa78d1cc2184-cni-binary-copy\") pod \"multus-additional-cni-plugins-vwgkw\" (UID: \"60e8a8fd-753d-433d-acf1-fa78d1cc2184\") " pod="openshift-multus/multus-additional-cni-plugins-vwgkw" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.276996 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/add6c8de-77cd-42e7-bf06-d2333b9392ea-multus-daemon-config\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.277015 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-etc-kubernetes\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.277032 4975 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/60e8a8fd-753d-433d-acf1-fa78d1cc2184-os-release\") pod \"multus-additional-cni-plugins-vwgkw\" (UID: \"60e8a8fd-753d-433d-acf1-fa78d1cc2184\") " pod="openshift-multus/multus-additional-cni-plugins-vwgkw" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.277052 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97fm9\" (UniqueName: \"kubernetes.io/projected/59dd8f35-75c5-42d7-b11a-06586d1d5a1b-kube-api-access-97fm9\") pod \"machine-config-daemon-kvdzt\" (UID: \"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.277067 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-os-release\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.277083 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-multus-socket-dir-parent\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.277102 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-host-run-netns\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: 
I0318 12:11:55.277118 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-host-var-lib-cni-bin\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.277134 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-multus-conf-dir\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.277706 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.291647 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d6
83a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.305545 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.317206 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.321947 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.322002 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.322016 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.322035 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.322050 4975 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:55Z","lastTransitionTime":"2026-03-18T12:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.327192 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.336325 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.347524 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.357025 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.374525 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.378184 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/60e8a8fd-753d-433d-acf1-fa78d1cc2184-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vwgkw\" (UID: \"60e8a8fd-753d-433d-acf1-fa78d1cc2184\") " pod="openshift-multus/multus-additional-cni-plugins-vwgkw" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.378232 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-multus-cni-dir\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.378257 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntpvc\" (UniqueName: \"kubernetes.io/projected/60e8a8fd-753d-433d-acf1-fa78d1cc2184-kube-api-access-ntpvc\") pod \"multus-additional-cni-plugins-vwgkw\" (UID: \"60e8a8fd-753d-433d-acf1-fa78d1cc2184\") " 
pod="openshift-multus/multus-additional-cni-plugins-vwgkw" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.378278 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/add6c8de-77cd-42e7-bf06-d2333b9392ea-cni-binary-copy\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.378307 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-host-run-k8s-cni-cncf-io\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.378334 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/60e8a8fd-753d-433d-acf1-fa78d1cc2184-cni-binary-copy\") pod \"multus-additional-cni-plugins-vwgkw\" (UID: \"60e8a8fd-753d-433d-acf1-fa78d1cc2184\") " pod="openshift-multus/multus-additional-cni-plugins-vwgkw" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.378353 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/add6c8de-77cd-42e7-bf06-d2333b9392ea-multus-daemon-config\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.378377 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-etc-kubernetes\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc 
kubenswrapper[4975]: I0318 12:11:55.378398 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/60e8a8fd-753d-433d-acf1-fa78d1cc2184-os-release\") pod \"multus-additional-cni-plugins-vwgkw\" (UID: \"60e8a8fd-753d-433d-acf1-fa78d1cc2184\") " pod="openshift-multus/multus-additional-cni-plugins-vwgkw" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.378421 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97fm9\" (UniqueName: \"kubernetes.io/projected/59dd8f35-75c5-42d7-b11a-06586d1d5a1b-kube-api-access-97fm9\") pod \"machine-config-daemon-kvdzt\" (UID: \"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.378438 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-os-release\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.378457 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-multus-socket-dir-parent\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.378463 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-multus-cni-dir\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.378512 4975 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-host-run-netns\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.378551 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-etc-kubernetes\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.378614 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/60e8a8fd-753d-433d-acf1-fa78d1cc2184-os-release\") pod \"multus-additional-cni-plugins-vwgkw\" (UID: \"60e8a8fd-753d-433d-acf1-fa78d1cc2184\") " pod="openshift-multus/multus-additional-cni-plugins-vwgkw" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.378731 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-multus-socket-dir-parent\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.378795 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-os-release\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.378475 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-host-run-netns\") pod 
\"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.378900 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-host-var-lib-cni-bin\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.378923 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-multus-conf-dir\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.378954 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/59dd8f35-75c5-42d7-b11a-06586d1d5a1b-mcd-auth-proxy-config\") pod \"machine-config-daemon-kvdzt\" (UID: \"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.378975 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-cnibin\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.378994 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/60e8a8fd-753d-433d-acf1-fa78d1cc2184-system-cni-dir\") pod \"multus-additional-cni-plugins-vwgkw\" (UID: \"60e8a8fd-753d-433d-acf1-fa78d1cc2184\") " 
pod="openshift-multus/multus-additional-cni-plugins-vwgkw" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.379014 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-system-cni-dir\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.379037 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-host-run-multus-certs\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.379058 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-host-var-lib-cni-multus\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.379077 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/60e8a8fd-753d-433d-acf1-fa78d1cc2184-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vwgkw\" (UID: \"60e8a8fd-753d-433d-acf1-fa78d1cc2184\") " pod="openshift-multus/multus-additional-cni-plugins-vwgkw" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.379103 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59dd8f35-75c5-42d7-b11a-06586d1d5a1b-proxy-tls\") pod \"machine-config-daemon-kvdzt\" (UID: \"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 
12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.379120 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-hostroot\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.379141 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/60e8a8fd-753d-433d-acf1-fa78d1cc2184-cnibin\") pod \"multus-additional-cni-plugins-vwgkw\" (UID: \"60e8a8fd-753d-433d-acf1-fa78d1cc2184\") " pod="openshift-multus/multus-additional-cni-plugins-vwgkw" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.379158 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzgnr\" (UniqueName: \"kubernetes.io/projected/add6c8de-77cd-42e7-bf06-d2333b9392ea-kube-api-access-zzgnr\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.379181 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/59dd8f35-75c5-42d7-b11a-06586d1d5a1b-rootfs\") pod \"machine-config-daemon-kvdzt\" (UID: \"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.379199 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-host-var-lib-kubelet\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.379235 4975 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/add6c8de-77cd-42e7-bf06-d2333b9392ea-cni-binary-copy\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.379244 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-host-var-lib-kubelet\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.379264 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-host-var-lib-cni-bin\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.379284 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-multus-conf-dir\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.379319 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-host-run-k8s-cni-cncf-io\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.379952 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/60e8a8fd-753d-433d-acf1-fa78d1cc2184-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-vwgkw\" (UID: \"60e8a8fd-753d-433d-acf1-fa78d1cc2184\") " pod="openshift-multus/multus-additional-cni-plugins-vwgkw" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.380040 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/59dd8f35-75c5-42d7-b11a-06586d1d5a1b-mcd-auth-proxy-config\") pod \"machine-config-daemon-kvdzt\" (UID: \"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.380333 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-cnibin\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.380375 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/60e8a8fd-753d-433d-acf1-fa78d1cc2184-system-cni-dir\") pod \"multus-additional-cni-plugins-vwgkw\" (UID: \"60e8a8fd-753d-433d-acf1-fa78d1cc2184\") " pod="openshift-multus/multus-additional-cni-plugins-vwgkw" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.380416 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-system-cni-dir\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.380442 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-host-run-multus-certs\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " 
pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.380469 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-host-var-lib-cni-multus\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.380495 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/60e8a8fd-753d-433d-acf1-fa78d1cc2184-cnibin\") pod \"multus-additional-cni-plugins-vwgkw\" (UID: \"60e8a8fd-753d-433d-acf1-fa78d1cc2184\") " pod="openshift-multus/multus-additional-cni-plugins-vwgkw" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.380724 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/add6c8de-77cd-42e7-bf06-d2333b9392ea-multus-daemon-config\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.380801 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/add6c8de-77cd-42e7-bf06-d2333b9392ea-hostroot\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.381136 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/59dd8f35-75c5-42d7-b11a-06586d1d5a1b-rootfs\") pod \"machine-config-daemon-kvdzt\" (UID: \"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.383612 4975 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/60e8a8fd-753d-433d-acf1-fa78d1cc2184-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vwgkw\" (UID: \"60e8a8fd-753d-433d-acf1-fa78d1cc2184\") " pod="openshift-multus/multus-additional-cni-plugins-vwgkw" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.385759 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/60e8a8fd-753d-433d-acf1-fa78d1cc2184-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vwgkw\" (UID: \"60e8a8fd-753d-433d-acf1-fa78d1cc2184\") " pod="openshift-multus/multus-additional-cni-plugins-vwgkw" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.387726 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59dd8f35-75c5-42d7-b11a-06586d1d5a1b-proxy-tls\") pod \"machine-config-daemon-kvdzt\" (UID: \"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.387714 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.390857 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.390915 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.390928 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.390946 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.390958 4975 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:55Z","lastTransitionTime":"2026-03-18T12:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.395289 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntpvc\" (UniqueName: \"kubernetes.io/projected/60e8a8fd-753d-433d-acf1-fa78d1cc2184-kube-api-access-ntpvc\") pod \"multus-additional-cni-plugins-vwgkw\" (UID: \"60e8a8fd-753d-433d-acf1-fa78d1cc2184\") " pod="openshift-multus/multus-additional-cni-plugins-vwgkw" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.398550 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97fm9\" (UniqueName: \"kubernetes.io/projected/59dd8f35-75c5-42d7-b11a-06586d1d5a1b-kube-api-access-97fm9\") pod \"machine-config-daemon-kvdzt\" (UID: \"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\") " pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.402063 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzgnr\" (UniqueName: \"kubernetes.io/projected/add6c8de-77cd-42e7-bf06-d2333b9392ea-kube-api-access-zzgnr\") pod \"multus-n9j7f\" (UID: \"add6c8de-77cd-42e7-bf06-d2333b9392ea\") " pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.405112 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: E0318 12:11:55.405531 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.411082 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.411122 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.411131 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.411148 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.411159 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:55Z","lastTransitionTime":"2026-03-18T12:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.411276 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef"} Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.411353 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0"} Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.413492 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5kbr4" event={"ID":"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae","Type":"ContainerStarted","Data":"522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b"} Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.413539 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5kbr4" event={"ID":"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae","Type":"ContainerStarted","Data":"f87257fff7cdb2bdc7ee489676e059e51b92096fea7c2ebd10df53b350645f56"} Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.419445 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: E0318 12:11:55.422985 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.426749 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.426804 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.426812 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.426826 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.426846 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:55Z","lastTransitionTime":"2026-03-18T12:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.432322 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: E0318 12:11:55.437829 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.442764 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.442802 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.442817 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.442858 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.442872 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:55Z","lastTransitionTime":"2026-03-18T12:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.443113 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.452963 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: E0318 12:11:55.453732 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.457772 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.457886 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.457902 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.457921 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.457934 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:55Z","lastTransitionTime":"2026-03-18T12:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.462959 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: E0318 12:11:55.470271 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: E0318 12:11:55.470605 4975 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.472821 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.472865 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.472898 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.472919 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.472986 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:55Z","lastTransitionTime":"2026-03-18T12:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.475218 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.485015 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.495500 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.506741 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.517229 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.528442 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.529513 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-n9j7f" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.537846 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.538639 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.541747 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.548819 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.564793 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.576895 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-k8v6h"] Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.577828 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.581334 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.581380 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.581393 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.581412 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.581423 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:55Z","lastTransitionTime":"2026-03-18T12:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.587695 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.588062 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.588352 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.588571 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.588731 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.589038 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.589753 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.589855 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.601390 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.610993 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.622448 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.639437 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.649268 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.658277 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.665003 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.682970 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b0d0be67-e739-4dd7-abe4-3986a330a037-ovn-node-metrics-cert\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.683026 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-run-ovn\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.683059 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-kubelet\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.683080 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b0d0be67-e739-4dd7-abe4-3986a330a037-ovnkube-script-lib\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc 
kubenswrapper[4975]: I0318 12:11:55.683100 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-766zq\" (UniqueName: \"kubernetes.io/projected/b0d0be67-e739-4dd7-abe4-3986a330a037-kube-api-access-766zq\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.683121 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-var-lib-openvswitch\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.683141 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-systemd-units\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.683158 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-etc-openvswitch\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.683177 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-node-log\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.683206 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-cni-netd\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.683230 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-run-netns\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.683251 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-run-systemd\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.683335 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-slash\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.683390 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b0d0be67-e739-4dd7-abe4-3986a330a037-ovnkube-config\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.683410 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b0d0be67-e739-4dd7-abe4-3986a330a037-env-overrides\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.683430 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-cni-bin\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.683456 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.683519 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-run-ovn-kubernetes\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.683551 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-run-openvswitch\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.683572 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-log-socket\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.687597 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.687631 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.687641 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.687657 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.687670 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:55Z","lastTransitionTime":"2026-03-18T12:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.697307 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.736079 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.774276 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.784269 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b0d0be67-e739-4dd7-abe4-3986a330a037-ovn-node-metrics-cert\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.784318 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-run-ovn\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 
18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.784352 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-kubelet\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.784374 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b0d0be67-e739-4dd7-abe4-3986a330a037-ovnkube-script-lib\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.784396 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-766zq\" (UniqueName: \"kubernetes.io/projected/b0d0be67-e739-4dd7-abe4-3986a330a037-kube-api-access-766zq\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.784416 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-var-lib-openvswitch\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.784436 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-systemd-units\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 
12:11:55.784457 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-etc-openvswitch\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.784469 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-run-ovn\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.784533 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-node-log\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.784537 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-kubelet\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.784479 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-node-log\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.784995 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-cni-netd\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.785015 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-var-lib-openvswitch\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.785083 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-etc-openvswitch\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.785081 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-systemd-units\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.785024 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-run-netns\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.785097 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-run-netns\") pod 
\"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.785109 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-cni-netd\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.785138 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-run-systemd\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.785155 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-slash\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.785178 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b0d0be67-e739-4dd7-abe4-3986a330a037-ovnkube-config\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.785193 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b0d0be67-e739-4dd7-abe4-3986a330a037-env-overrides\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.785206 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-cni-bin\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.785220 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.785247 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-run-ovn-kubernetes\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.785265 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-run-openvswitch\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.785280 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-log-socket\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.785318 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-log-socket\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.785347 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b0d0be67-e739-4dd7-abe4-3986a330a037-ovnkube-script-lib\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.785407 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-run-systemd\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.785446 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.785475 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-cni-bin\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc 
kubenswrapper[4975]: I0318 12:11:55.785507 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-run-ovn-kubernetes\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.785539 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-run-openvswitch\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.785580 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-slash\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.785737 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b0d0be67-e739-4dd7-abe4-3986a330a037-env-overrides\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.785837 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b0d0be67-e739-4dd7-abe4-3986a330a037-ovnkube-config\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.787253 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b0d0be67-e739-4dd7-abe4-3986a330a037-ovn-node-metrics-cert\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.789599 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.789643 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.789655 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.789673 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.789684 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:55Z","lastTransitionTime":"2026-03-18T12:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.821736 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-766zq\" (UniqueName: \"kubernetes.io/projected/b0d0be67-e739-4dd7-abe4-3986a330a037-kube-api-access-766zq\") pod \"ovnkube-node-k8v6h\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.835943 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.874217 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.891917 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.891989 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.891998 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.892018 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.892026 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:55Z","lastTransitionTime":"2026-03-18T12:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.913153 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.984893 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.993446 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.999699 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.999741 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.999750 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:55 crc kubenswrapper[4975]: I0318 12:11:55.999765 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:55.999774 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:55Z","lastTransitionTime":"2026-03-18T12:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.000085 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.016008 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:56 crc kubenswrapper[4975]: E0318 12:11:56.016153 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.016653 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:56 crc kubenswrapper[4975]: E0318 12:11:56.016788 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.016975 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:56 crc kubenswrapper[4975]: E0318 12:11:56.017061 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.042043 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:56Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.187989 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.188318 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.188430 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.188530 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.188609 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:56Z","lastTransitionTime":"2026-03-18T12:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.197992 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:56Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.209442 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:56Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.219766 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:56Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.287312 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc0881
44db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:56Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.294594 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.294629 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.294640 4975 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.294657 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.294695 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:56Z","lastTransitionTime":"2026-03-18T12:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.301082 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265
a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-18T12:11:56Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.338904 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:56Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.397235 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.397279 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.397289 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.397305 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.397315 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:56Z","lastTransitionTime":"2026-03-18T12:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.417770 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerStarted","Data":"cfd29c8d9ccd4721705fd54efca524b74b6bd018512e056aa0085552d847e87b"} Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.427262 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerStarted","Data":"5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2"} Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.427311 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerStarted","Data":"2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2efe02fe95b50df416422be5b"} Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.427332 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerStarted","Data":"1ba8561f38000bf75a1f29aad12a7eed3a21ce10a594750a9bd2bfc09346cc80"} Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.428924 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" event={"ID":"60e8a8fd-753d-433d-acf1-fa78d1cc2184","Type":"ContainerStarted","Data":"54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d"} Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.428958 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" 
event={"ID":"60e8a8fd-753d-433d-acf1-fa78d1cc2184","Type":"ContainerStarted","Data":"09011964d3e3b318302498f978991822746bd17ac3f14045ba3bd82c57b5f7e3"} Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.430692 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n9j7f" event={"ID":"add6c8de-77cd-42e7-bf06-d2333b9392ea","Type":"ContainerStarted","Data":"484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543"} Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.430726 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n9j7f" event={"ID":"add6c8de-77cd-42e7-bf06-d2333b9392ea","Type":"ContainerStarted","Data":"6d2740b15dd1c34cc07cf6229c51aee82307e2c86c1d4a10282802758a012f31"} Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.438760 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:56Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.450209 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:56Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.462106 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:56Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.486394 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:56Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.499488 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.499549 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.499566 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.499986 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.500037 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:56Z","lastTransitionTime":"2026-03-18T12:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.501780 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:56Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.583121 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:56Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.598255 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:56Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.602787 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.602833 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.602844 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.602859 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.602872 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:56Z","lastTransitionTime":"2026-03-18T12:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.612371 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:56Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.632715 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:56Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.705207 4975 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.705233 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.705241 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.705253 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.705276 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:56Z","lastTransitionTime":"2026-03-18T12:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.706002 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:56Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.716853 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf
3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:56Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.753808 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:56Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.793617 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:56Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.807846 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.807915 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.807927 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:56 crc 
kubenswrapper[4975]: I0318 12:11:56.807940 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.807950 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:56Z","lastTransitionTime":"2026-03-18T12:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.839616 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:56Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.883465 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"dat
a-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441
ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:56Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.909979 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.910024 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.910035 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 
12:11:56.910052 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.910065 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:56Z","lastTransitionTime":"2026-03-18T12:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.914563 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:56Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.956735 4975 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:56Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:56 crc kubenswrapper[4975]: I0318 12:11:56.996493 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:56Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.012019 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.012053 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.012062 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.012075 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.012085 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:57Z","lastTransitionTime":"2026-03-18T12:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.035355 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.076269 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.113529 4975 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.120605 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.120652 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.120665 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.120683 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.120696 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:57Z","lastTransitionTime":"2026-03-18T12:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.156190 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.198121 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.224112 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.224167 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.224180 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.224197 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.224209 4975 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:57Z","lastTransitionTime":"2026-03-18T12:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.236135 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.282937 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.319198 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.326414 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.326453 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.326462 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.326477 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.326486 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:57Z","lastTransitionTime":"2026-03-18T12:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.360674 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.399994 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.428983 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:57 crc 
kubenswrapper[4975]: I0318 12:11:57.429037 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.429049 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.429066 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.429087 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:57Z","lastTransitionTime":"2026-03-18T12:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.433687 4975 generic.go:334] "Generic (PLEG): container finished" podID="60e8a8fd-753d-433d-acf1-fa78d1cc2184" containerID="54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d" exitCode=0 Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.433760 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" event={"ID":"60e8a8fd-753d-433d-acf1-fa78d1cc2184","Type":"ContainerDied","Data":"54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d"} Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.435348 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89"} Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.437858 4975 generic.go:334] 
"Generic (PLEG): container finished" podID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerID="64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2" exitCode=0 Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.437928 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerDied","Data":"64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2"} Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.446643 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.491166 4975 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.516096 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.530992 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.531019 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.531027 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:57 crc 
kubenswrapper[4975]: I0318 12:11:57.531039 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.531049 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:57Z","lastTransitionTime":"2026-03-18T12:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.569276 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.594739 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.632957 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.632988 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.633000 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 
12:11:57.633015 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.633027 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:57Z","lastTransitionTime":"2026-03-18T12:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.638047 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.673764 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.723677 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.735321 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.735360 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.735369 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.735383 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.735392 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:57Z","lastTransitionTime":"2026-03-18T12:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.759315 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.798316 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.838108 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.838150 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.838160 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.838176 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.838190 4975 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:57Z","lastTransitionTime":"2026-03-18T12:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.840962 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.876802 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.918811 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.941336 4975 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.941379 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.941390 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.941406 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.941418 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:57Z","lastTransitionTime":"2026-03-18T12:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.954168 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:57 crc kubenswrapper[4975]: I0318 12:11:57.994960 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:57Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.016191 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.016258 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:11:58 crc kubenswrapper[4975]: E0318 12:11:58.016292 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:11:58 crc kubenswrapper[4975]: E0318 12:11:58.016364 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.016410 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:11:58 crc kubenswrapper[4975]: E0318 12:11:58.016451 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.034209 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:58Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.043910 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.043941 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.043950 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.043964 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.043973 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:58Z","lastTransitionTime":"2026-03-18T12:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.077169 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:58Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.113424 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:58Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.147143 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.147193 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.147211 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.147235 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.147253 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:58Z","lastTransitionTime":"2026-03-18T12:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.155184 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:58Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.212932 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:58Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.233683 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:58Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.250237 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.250274 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.250283 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:58 crc 
kubenswrapper[4975]: I0318 12:11:58.250297 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.250306 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:58Z","lastTransitionTime":"2026-03-18T12:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.282840 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:58Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.324673 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:58Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.353796 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.353845 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.353858 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 
12:11:58.353903 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.353916 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:58Z","lastTransitionTime":"2026-03-18T12:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.382414 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:58Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.415333 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:58Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.449366 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:58Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.453744 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" event={"ID":"60e8a8fd-753d-433d-acf1-fa78d1cc2184","Type":"ContainerStarted","Data":"ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757"} Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.455251 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.455280 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.455290 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.455305 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.455314 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:58Z","lastTransitionTime":"2026-03-18T12:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.458705 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerStarted","Data":"14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e"} Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.458755 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerStarted","Data":"11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d"} Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.458770 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerStarted","Data":"a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337"} Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.458781 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerStarted","Data":"9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731"} Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.458792 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerStarted","Data":"de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c"} Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.475979 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:58Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.516726 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:58Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.555839 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:58Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.560619 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.560655 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.560666 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.560682 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.560691 4975 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:58Z","lastTransitionTime":"2026-03-18T12:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.600820 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:58Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.648209 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:58Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.662826 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.662915 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.662930 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:58 crc 
kubenswrapper[4975]: I0318 12:11:58.662945 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.662957 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:58Z","lastTransitionTime":"2026-03-18T12:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.689097 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:58Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.715858 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:58Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.758090 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:58Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.765285 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.765325 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.765334 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.765348 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.765360 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:58Z","lastTransitionTime":"2026-03-18T12:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.793659 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:58Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.840265 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f429
28e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092
272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:58Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.868641 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.868699 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.868712 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.868732 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.868750 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:58Z","lastTransitionTime":"2026-03-18T12:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.878121 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:58Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.915649 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:58Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.956275 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:58Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.970724 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.970768 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.970776 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.970791 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:58 crc kubenswrapper[4975]: 
I0318 12:11:58.970801 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:58Z","lastTransitionTime":"2026-03-18T12:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:58 crc kubenswrapper[4975]: I0318 12:11:58.994564 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:58Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.035286 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:59Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.072646 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:59Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.073658 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.073698 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.073720 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.073745 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.073757 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:59Z","lastTransitionTime":"2026-03-18T12:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.176235 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.176271 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.176281 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.176296 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.176307 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:59Z","lastTransitionTime":"2026-03-18T12:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.278696 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.278748 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.278759 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.278776 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.278789 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:59Z","lastTransitionTime":"2026-03-18T12:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.381668 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.381978 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.382068 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.382147 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.382231 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:59Z","lastTransitionTime":"2026-03-18T12:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.465474 4975 generic.go:334] "Generic (PLEG): container finished" podID="60e8a8fd-753d-433d-acf1-fa78d1cc2184" containerID="ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757" exitCode=0 Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.465567 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" event={"ID":"60e8a8fd-753d-433d-acf1-fa78d1cc2184","Type":"ContainerDied","Data":"ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757"} Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.469835 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerStarted","Data":"3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2"} Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.484362 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.484521 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.484584 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.484644 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.484699 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:59Z","lastTransitionTime":"2026-03-18T12:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.488208 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:59Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.500686 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:59Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.514051 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:59Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.525137 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:59Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.535772 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac
-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:59Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.551560 4975 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:59Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.564437 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:59Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.583514 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:59Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.587636 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.587682 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.587696 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.587714 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.587727 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:59Z","lastTransitionTime":"2026-03-18T12:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.596757 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:59Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.610605 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:59Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.624124 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:59Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.642563 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:59Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.655208 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:59Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.670526 4975 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:11:59Z is after 2025-08-24T17:21:41Z" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.690806 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.690867 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.690900 4975 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.690921 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.690936 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:59Z","lastTransitionTime":"2026-03-18T12:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.793416 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.793454 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.793463 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.793480 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.793490 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:59Z","lastTransitionTime":"2026-03-18T12:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.895701 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.895734 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.895746 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.895761 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.895770 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:59Z","lastTransitionTime":"2026-03-18T12:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.997996 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.998058 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.998073 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.998091 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:11:59 crc kubenswrapper[4975]: I0318 12:11:59.998102 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:11:59Z","lastTransitionTime":"2026-03-18T12:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.016128 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:00 crc kubenswrapper[4975]: E0318 12:12:00.016321 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.016682 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:00 crc kubenswrapper[4975]: E0318 12:12:00.016735 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.016776 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:00 crc kubenswrapper[4975]: E0318 12:12:00.016827 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.017379 4975 scope.go:117] "RemoveContainer" containerID="44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc" Mar 18 12:12:00 crc kubenswrapper[4975]: E0318 12:12:00.017658 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.100500 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.100818 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.100829 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.100844 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.100854 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:00Z","lastTransitionTime":"2026-03-18T12:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.203021 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.203058 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.203067 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.203090 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.203123 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:00Z","lastTransitionTime":"2026-03-18T12:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.307153 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.307216 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.307238 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.307257 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.307276 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:00Z","lastTransitionTime":"2026-03-18T12:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.410019 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.410074 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.410089 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.410107 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.410119 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:00Z","lastTransitionTime":"2026-03-18T12:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.473503 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27"} Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.477288 4975 generic.go:334] "Generic (PLEG): container finished" podID="60e8a8fd-753d-433d-acf1-fa78d1cc2184" containerID="cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609" exitCode=0 Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.477314 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" event={"ID":"60e8a8fd-753d-433d-acf1-fa78d1cc2184","Type":"ContainerDied","Data":"cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609"} Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.497441 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.513250 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.513306 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.513317 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.513332 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.513342 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:00Z","lastTransitionTime":"2026-03-18T12:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.513998 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.526225 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.553819 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.602494 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.613688 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.615398 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.615458 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.615472 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.615491 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.615504 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:00Z","lastTransitionTime":"2026-03-18T12:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.626174 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.638413 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf
3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.651789 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.664234 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.684154 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.699788 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.717816 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.717848 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.717858 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 
12:12:00.717884 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.717893 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:00Z","lastTransitionTime":"2026-03-18T12:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.720681 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.734483 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.750989 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.763280 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.776910 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.789611 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.801647 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.815782 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.819537 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 
12:12:00.819709 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.819844 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.819967 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.820050 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:00Z","lastTransitionTime":"2026-03-18T12:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.831733 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.850384 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.863833 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.878435 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.893111 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.914045 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.922318 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.922361 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.922373 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.922400 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.922412 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:00Z","lastTransitionTime":"2026-03-18T12:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.928642 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:00 crc kubenswrapper[4975]: I0318 12:12:00.949027 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:00Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 
12:12:01.024489 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.025216 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.025418 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.025565 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.025675 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:01Z","lastTransitionTime":"2026-03-18T12:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.128569 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.128602 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.128617 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.128632 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.128641 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:01Z","lastTransitionTime":"2026-03-18T12:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.230718 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.230766 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.230778 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.230796 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.230851 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:01Z","lastTransitionTime":"2026-03-18T12:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.334137 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.334201 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.334219 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.334242 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.334258 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:01Z","lastTransitionTime":"2026-03-18T12:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.436775 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.436894 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.436924 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.436957 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.436982 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:01Z","lastTransitionTime":"2026-03-18T12:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.486425 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerStarted","Data":"d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9"} Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.489183 4975 generic.go:334] "Generic (PLEG): container finished" podID="60e8a8fd-753d-433d-acf1-fa78d1cc2184" containerID="3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9" exitCode=0 Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.489227 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" event={"ID":"60e8a8fd-753d-433d-acf1-fa78d1cc2184","Type":"ContainerDied","Data":"3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9"} Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.514738 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.533163 4975 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.539106 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.539142 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.539151 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.539166 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.539175 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:01Z","lastTransitionTime":"2026-03-18T12:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.548791 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.560386 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.573832 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.592186 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.603187 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.612948 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.624279 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.638498 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.642852 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.642921 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.642933 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.643317 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.643349 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:01Z","lastTransitionTime":"2026-03-18T12:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.650424 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.666345 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.674303 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-ksjpq"] Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.674643 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ksjpq" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.677880 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.678020 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.678099 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.678378 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.678909 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 12:12:01 crc 
kubenswrapper[4975]: I0318 12:12:01.692756 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.705406 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.718419 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.730737 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.746239 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:01 crc 
kubenswrapper[4975]: I0318 12:12:01.746313 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.746326 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.746344 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.746357 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:01Z","lastTransitionTime":"2026-03-18T12:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.752935 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.759476 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/380bec4c-fbbe-461f-9a80-56472145eca1-host\") pod \"node-ca-ksjpq\" (UID: \"380bec4c-fbbe-461f-9a80-56472145eca1\") " pod="openshift-image-registry/node-ca-ksjpq" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.759559 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/380bec4c-fbbe-461f-9a80-56472145eca1-serviceca\") pod \"node-ca-ksjpq\" (UID: \"380bec4c-fbbe-461f-9a80-56472145eca1\") " pod="openshift-image-registry/node-ca-ksjpq" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.759618 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmvr6\" (UniqueName: \"kubernetes.io/projected/380bec4c-fbbe-461f-9a80-56472145eca1-kube-api-access-rmvr6\") pod \"node-ca-ksjpq\" (UID: \"380bec4c-fbbe-461f-9a80-56472145eca1\") " pod="openshift-image-registry/node-ca-ksjpq" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.767494 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773
257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.783657 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.799667 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.813003 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.836004 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.847560 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.848791 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.848824 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.848833 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.848848 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.848858 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:01Z","lastTransitionTime":"2026-03-18T12:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.857189 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.860322 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.860442 4975 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/380bec4c-fbbe-461f-9a80-56472145eca1-serviceca\") pod \"node-ca-ksjpq\" (UID: \"380bec4c-fbbe-461f-9a80-56472145eca1\") " pod="openshift-image-registry/node-ca-ksjpq" Mar 18 12:12:01 crc kubenswrapper[4975]: E0318 12:12:01.860524 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:33.860498939 +0000 UTC m=+139.574899518 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.860590 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.860639 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:01 crc 
kubenswrapper[4975]: I0318 12:12:01.860662 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmvr6\" (UniqueName: \"kubernetes.io/projected/380bec4c-fbbe-461f-9a80-56472145eca1-kube-api-access-rmvr6\") pod \"node-ca-ksjpq\" (UID: \"380bec4c-fbbe-461f-9a80-56472145eca1\") " pod="openshift-image-registry/node-ca-ksjpq" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.860688 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/380bec4c-fbbe-461f-9a80-56472145eca1-host\") pod \"node-ca-ksjpq\" (UID: \"380bec4c-fbbe-461f-9a80-56472145eca1\") " pod="openshift-image-registry/node-ca-ksjpq" Mar 18 12:12:01 crc kubenswrapper[4975]: E0318 12:12:01.860749 4975 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.860771 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/380bec4c-fbbe-461f-9a80-56472145eca1-host\") pod \"node-ca-ksjpq\" (UID: \"380bec4c-fbbe-461f-9a80-56472145eca1\") " pod="openshift-image-registry/node-ca-ksjpq" Mar 18 12:12:01 crc kubenswrapper[4975]: E0318 12:12:01.860771 4975 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:12:01 crc kubenswrapper[4975]: E0318 12:12:01.860832 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:12:33.860815567 +0000 UTC m=+139.575216146 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:12:01 crc kubenswrapper[4975]: E0318 12:12:01.860852 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:12:33.860841558 +0000 UTC m=+139.575242127 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.861602 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/380bec4c-fbbe-461f-9a80-56472145eca1-serviceca\") pod \"node-ca-ksjpq\" (UID: \"380bec4c-fbbe-461f-9a80-56472145eca1\") " pod="openshift-image-registry/node-ca-ksjpq" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.873644 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.905114 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmvr6\" (UniqueName: \"kubernetes.io/projected/380bec4c-fbbe-461f-9a80-56472145eca1-kube-api-access-rmvr6\") pod \"node-ca-ksjpq\" (UID: \"380bec4c-fbbe-461f-9a80-56472145eca1\") " pod="openshift-image-registry/node-ca-ksjpq" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.934966 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.952249 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.952284 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.952292 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.952307 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.952317 4975 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:01Z","lastTransitionTime":"2026-03-18T12:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.961904 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.961980 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:01 crc kubenswrapper[4975]: E0318 12:12:01.962076 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:12:01 crc kubenswrapper[4975]: E0318 12:12:01.962094 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:12:01 crc kubenswrapper[4975]: E0318 12:12:01.962103 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:12:01 crc kubenswrapper[4975]: E0318 
12:12:01.962115 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:12:01 crc kubenswrapper[4975]: E0318 12:12:01.962143 4975 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:12:01 crc kubenswrapper[4975]: E0318 12:12:01.962117 4975 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:12:01 crc kubenswrapper[4975]: E0318 12:12:01.962201 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 12:12:33.962187283 +0000 UTC m=+139.676587862 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:12:01 crc kubenswrapper[4975]: E0318 12:12:01.962218 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 12:12:33.962211983 +0000 UTC m=+139.676612562 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.972702 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:01Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:01 crc kubenswrapper[4975]: I0318 12:12:01.987550 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-ksjpq" Mar 18 12:12:02 crc kubenswrapper[4975]: W0318 12:12:02.003049 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod380bec4c_fbbe_461f_9a80_56472145eca1.slice/crio-1857e6abd2228606bfafcb3da586130cb75e21af591f9bde0615a6ed0b25a4bf WatchSource:0}: Error finding container 1857e6abd2228606bfafcb3da586130cb75e21af591f9bde0615a6ed0b25a4bf: Status 404 returned error can't find the container with id 1857e6abd2228606bfafcb3da586130cb75e21af591f9bde0615a6ed0b25a4bf Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.015750 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.015818 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.015767 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:02 crc kubenswrapper[4975]: E0318 12:12:02.015921 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:02 crc kubenswrapper[4975]: E0318 12:12:02.016032 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:02 crc kubenswrapper[4975]: E0318 12:12:02.016133 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.026166 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.055219 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.055256 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.055264 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.055281 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.055290 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:02Z","lastTransitionTime":"2026-03-18T12:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.157579 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.157906 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.157917 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.157934 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.157946 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:02Z","lastTransitionTime":"2026-03-18T12:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.261830 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.261925 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.261940 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.261957 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.261970 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:02Z","lastTransitionTime":"2026-03-18T12:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.364480 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.364539 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.364551 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.364574 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.364587 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:02Z","lastTransitionTime":"2026-03-18T12:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.467118 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.467270 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.467300 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.467331 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.467356 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:02Z","lastTransitionTime":"2026-03-18T12:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.496456 4975 generic.go:334] "Generic (PLEG): container finished" podID="60e8a8fd-753d-433d-acf1-fa78d1cc2184" containerID="85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16" exitCode=0 Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.496541 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" event={"ID":"60e8a8fd-753d-433d-acf1-fa78d1cc2184","Type":"ContainerDied","Data":"85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16"} Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.499325 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ksjpq" event={"ID":"380bec4c-fbbe-461f-9a80-56472145eca1","Type":"ContainerStarted","Data":"79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808daf0a66fd9c"} Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.499415 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ksjpq" event={"ID":"380bec4c-fbbe-461f-9a80-56472145eca1","Type":"ContainerStarted","Data":"1857e6abd2228606bfafcb3da586130cb75e21af591f9bde0615a6ed0b25a4bf"} Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.515258 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d6
83a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.533141 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.547202 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.563242 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.571835 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.571979 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.572339 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.572617 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.572660 4975 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:02Z","lastTransitionTime":"2026-03-18T12:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.580600 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.595101 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.607030 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.625459 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.647469 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.660237 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.671836 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.675518 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.675562 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.675578 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.675596 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.675607 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:02Z","lastTransitionTime":"2026-03-18T12:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.684625 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.710695 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f429
28e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092
272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.722210 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.737366 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.757847 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.770060 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.777742 4975 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.778021 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.778108 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.778197 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.778214 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:02Z","lastTransitionTime":"2026-03-18T12:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.785318 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.797489 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.813087 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.852902 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.880583 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.880648 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.880667 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.880711 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.880743 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:02Z","lastTransitionTime":"2026-03-18T12:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.893266 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.934943 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808daf0a66fd9c\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.972662 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:02Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.983301 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.983330 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.983340 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.983354 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:02 crc kubenswrapper[4975]: I0318 12:12:02.983365 4975 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:02Z","lastTransitionTime":"2026-03-18T12:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.019696 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:03Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.052176 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:03Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.086022 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.086072 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.086086 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:03 crc 
kubenswrapper[4975]: I0318 12:12:03.086103 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.086116 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:03Z","lastTransitionTime":"2026-03-18T12:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.097525 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:03Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.135596 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:03Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.177819 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:03Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.187956 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.187992 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.188004 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.188019 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.188031 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:03Z","lastTransitionTime":"2026-03-18T12:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.214764 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:03Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.290005 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.290325 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.290337 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.290351 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.290360 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:03Z","lastTransitionTime":"2026-03-18T12:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.393973 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.394031 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.394048 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.394071 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.394087 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:03Z","lastTransitionTime":"2026-03-18T12:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.496608 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.496653 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.496665 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.496683 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.496697 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:03Z","lastTransitionTime":"2026-03-18T12:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.506672 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerStarted","Data":"16d9f5fc862f5e0708e8791049ad3f9df5c5eb709dbc15fc00f0f661bdbe1e91"} Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.506945 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.506963 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.514854 4975 generic.go:334] "Generic (PLEG): container finished" podID="60e8a8fd-753d-433d-acf1-fa78d1cc2184" containerID="8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f" exitCode=0 Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.514914 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" event={"ID":"60e8a8fd-753d-433d-acf1-fa78d1cc2184","Type":"ContainerDied","Data":"8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f"} Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.521382 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:03Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.538976 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.540048 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:03Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.550618 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:03Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.559321 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:03Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.568049 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808daf0a66fd9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:03Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.577834 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:03Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:03 crc 
kubenswrapper[4975]: I0318 12:12:03.595456 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:03Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.604591 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.604635 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.604645 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.604660 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.604670 4975 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:03Z","lastTransitionTime":"2026-03-18T12:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.609601 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-18T12:12:03Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.630135 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16d9f5fc862f5e0708e8791049ad3f9df5c5eb709dbc15fc00f0f661bdbe1e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:03Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.643066 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:03Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.655111 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:03Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.695918 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:03Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.708497 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:03 crc 
kubenswrapper[4975]: I0318 12:12:03.708533 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.708542 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.708557 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.708566 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:03Z","lastTransitionTime":"2026-03-18T12:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.738822 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:03Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.773495 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:03Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.811317 4975 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.811345 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.811353 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.811365 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.811374 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:03Z","lastTransitionTime":"2026-03-18T12:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.816213 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:03Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.855240 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:03Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.895364 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:03Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.914095 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:03 crc 
kubenswrapper[4975]: I0318 12:12:03.914136 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.914147 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.914163 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.914173 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:03Z","lastTransitionTime":"2026-03-18T12:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.937431 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:03Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:03 crc kubenswrapper[4975]: I0318 12:12:03.975393 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:03Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.015138 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:04Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.015345 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:04 crc kubenswrapper[4975]: E0318 12:12:04.015432 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.015514 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:04 crc kubenswrapper[4975]: E0318 12:12:04.015619 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.015732 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:04 crc kubenswrapper[4975]: E0318 12:12:04.015789 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.016206 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.016232 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.016242 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.016255 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.016266 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:04Z","lastTransitionTime":"2026-03-18T12:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.059089 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:04Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.093805 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:04Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.118235 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.118307 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.118319 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.118336 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.118350 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:04Z","lastTransitionTime":"2026-03-18T12:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.133070 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:04Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.174336 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:04Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.214472 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808daf0a66fd9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:04Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.221065 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.221134 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.221150 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.221199 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.221211 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:04Z","lastTransitionTime":"2026-03-18T12:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.261265 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:04Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.296595 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:04Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.323498 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.323537 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.323548 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.323564 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.323576 4975 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:04Z","lastTransitionTime":"2026-03-18T12:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.335628 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-18T12:12:04Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.379736 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16d9f5fc862f5e0708e8791049ad3f9df5c5eb709dbc15fc00f0f661bdbe1e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:04Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.413781 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e1
8d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:04Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:04 crc 
kubenswrapper[4975]: I0318 12:12:04.425504 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.425548 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.425559 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.425573 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.425582 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:04Z","lastTransitionTime":"2026-03-18T12:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.523498 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" event={"ID":"60e8a8fd-753d-433d-acf1-fa78d1cc2184","Type":"ContainerStarted","Data":"df8aae3de55433120292275b85026b54b486e56da8826b4ef9d5d6e01f7d3c58"} Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.523594 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.528252 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.528305 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.528320 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.528340 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.528355 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:04Z","lastTransitionTime":"2026-03-18T12:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.544610 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:04Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.546694 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.561556 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8
909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:04Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.572509 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:12:04Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.583067 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:04Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.631214 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.631267 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.631279 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.631297 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.631318 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:04Z","lastTransitionTime":"2026-03-18T12:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.632902 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808daf0a66fd9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:04Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.666111 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:04Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.694530 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:04Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.733508 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.733549 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.733558 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.733576 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.733587 4975 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:04Z","lastTransitionTime":"2026-03-18T12:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.745967 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-18T12:12:04Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.779064 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16d9f5fc862f5e0708e8791049ad3f9df5c5eb709dbc15fc00f0f661bdbe1e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:04Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.813669 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:04Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.835302 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.835339 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.835349 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.835393 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.835404 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:04Z","lastTransitionTime":"2026-03-18T12:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.855836 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:04Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.895486 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:04Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.937762 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:04 crc 
kubenswrapper[4975]: I0318 12:12:04.937813 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.937828 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.937848 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.937890 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:04Z","lastTransitionTime":"2026-03-18T12:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.941590 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:04Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:04 crc kubenswrapper[4975]: I0318 12:12:04.975521 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:04Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.016054 4975 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df8aae3de55433120292275b85026b54b486e56da8826b4ef9d5d6e01f7d3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exit
Code\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b0
5ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\
\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.040207 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.040246 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.040257 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.040275 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.040285 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:05Z","lastTransitionTime":"2026-03-18T12:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.056754 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.093242 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.134829 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808daf0a66fd9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.142909 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.142939 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.142948 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.142996 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.143009 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:05Z","lastTransitionTime":"2026-03-18T12:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.174596 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.215068 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.245641 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.246198 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.246228 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.246247 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.246258 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:05Z","lastTransitionTime":"2026-03-18T12:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.253098 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.298898 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16d9f5fc862f5e0708e8791049ad3f9df5c5eb709dbc15fc00f0f661bdbe1e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.337136 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e1
8d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc 
kubenswrapper[4975]: I0318 12:12:05.356398 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.356435 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.356445 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.356460 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.356472 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:05Z","lastTransitionTime":"2026-03-18T12:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.376132 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.420452 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.457600 4975 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.459084 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.459126 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.459139 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.459158 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.459170 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:05Z","lastTransitionTime":"2026-03-18T12:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.503362 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.546403 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.561517 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.561552 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.561560 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.561575 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.561584 4975 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:05Z","lastTransitionTime":"2026-03-18T12:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.606326 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df8aae3de55433120292275b85026b54b486e56da8826b4ef9d5d6e01f7d3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.627298 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.656751 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.663540 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.663576 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.663584 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.663596 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.663611 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:05Z","lastTransitionTime":"2026-03-18T12:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.694285 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.736040 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.745519 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.745555 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.745565 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.745580 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.745589 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:05Z","lastTransitionTime":"2026-03-18T12:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:05 crc kubenswrapper[4975]: E0318 12:12:05.757494 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.761729 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.761761 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.761771 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.761784 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.761792 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:05Z","lastTransitionTime":"2026-03-18T12:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:05 crc kubenswrapper[4975]: E0318 12:12:05.773391 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.774405 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.777635 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.777672 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.777682 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.777699 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.777708 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:05Z","lastTransitionTime":"2026-03-18T12:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:05 crc kubenswrapper[4975]: E0318 12:12:05.792763 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.795681 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.795722 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.795734 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.795751 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.795762 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:05Z","lastTransitionTime":"2026-03-18T12:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:05 crc kubenswrapper[4975]: E0318 12:12:05.808046 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.812197 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.812243 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.812253 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.812268 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.812276 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:05Z","lastTransitionTime":"2026-03-18T12:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.815595 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808daf0a66fd9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: E0318 12:12:05.823479 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: E0318 12:12:05.823593 4975 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.824934 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.824972 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.824980 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.824994 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.825004 4975 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:05Z","lastTransitionTime":"2026-03-18T12:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.852794 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.894269 4975 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.926973 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.927009 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.927018 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.927031 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.927039 4975 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:05Z","lastTransitionTime":"2026-03-18T12:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.934338 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:05 crc kubenswrapper[4975]: I0318 12:12:05.982572 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16d9f5fc862f5e0708e8791049ad3f9df5c5eb709dbc15fc00f0f661bdbe1e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:05Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.015851 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.015971 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.015995 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:06 crc kubenswrapper[4975]: E0318 12:12:06.016150 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:06 crc kubenswrapper[4975]: E0318 12:12:06.016238 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:06 crc kubenswrapper[4975]: E0318 12:12:06.016512 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.020099 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:06Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.029117 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.029155 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.029168 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.029184 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.029195 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:06Z","lastTransitionTime":"2026-03-18T12:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.034145 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.075800 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:06Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.114852 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:06Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.132204 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:06 crc 
kubenswrapper[4975]: I0318 12:12:06.132247 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.132256 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.132272 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.132283 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:06Z","lastTransitionTime":"2026-03-18T12:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.161745 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:06Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.198850 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:06Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.233793 4975 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.233826 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.233834 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.233847 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.233856 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:06Z","lastTransitionTime":"2026-03-18T12:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.236621 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df8aae3de55433120292275b85026b54b486e56da8826b4ef9d5d6e01f7d3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:06Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.335751 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.335789 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.335799 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.335814 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.335829 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:06Z","lastTransitionTime":"2026-03-18T12:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.438179 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.438509 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.438598 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.438676 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.438741 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:06Z","lastTransitionTime":"2026-03-18T12:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.533429 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8v6h_b0d0be67-e739-4dd7-abe4-3986a330a037/ovnkube-controller/0.log" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.537977 4975 generic.go:334] "Generic (PLEG): container finished" podID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerID="16d9f5fc862f5e0708e8791049ad3f9df5c5eb709dbc15fc00f0f661bdbe1e91" exitCode=1 Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.538085 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerDied","Data":"16d9f5fc862f5e0708e8791049ad3f9df5c5eb709dbc15fc00f0f661bdbe1e91"} Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.538826 4975 scope.go:117] "RemoveContainer" containerID="16d9f5fc862f5e0708e8791049ad3f9df5c5eb709dbc15fc00f0f661bdbe1e91" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.542607 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.542642 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.542688 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.542710 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.543021 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:06Z","lastTransitionTime":"2026-03-18T12:12:06Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.558011 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4279db4a-0510-49ca-b9f2-bf035e5209e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e60b84ddfc2cd2530a7ff0385ea9e95dfadb45a06ba9d98bf350f123b283d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6770c00fe0f8e7a8a33f0907f2d5a44805722dae64edc5a03193fd785cb1bb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1660cc319374fc5a436bc3744321fc3053f5325f451ac74d27bd35a3f1e2e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"image\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:06Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.572234 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:06Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.596507 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df8aae3de55433120292275b85026b54b486e56da8826b4ef9d5d6e01f7d3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b271
5592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:06Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.614058 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f04
9a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:06Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.630328 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:12:06Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.640163 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:06Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.646726 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.646781 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.646796 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.646827 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.646898 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:06Z","lastTransitionTime":"2026-03-18T12:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.653128 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808daf0a66fd9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:06Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.667548 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:06Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.682447 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:06Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.693464 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:06Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.712040 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16d9f5fc862f5e0708e8791049ad3f9df5c5eb709dbc15fc00f0f661bdbe1e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16d9f5fc862f5e0708e8791049ad3f9df5c5eb709dbc15fc00f0f661bdbe1e91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:06Z\\\",\\\"message\\\":\\\" 6779 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:12:06.160630 6779 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:12:06.161456 6779 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 12:12:06.161472 6779 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 12:12:06.161508 6779 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 12:12:06.161537 6779 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 12:12:06.161585 6779 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0318 12:12:06.161594 6779 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0318 12:12:06.161606 6779 factory.go:656] Stopping watch factory\\\\nI0318 12:12:06.161616 6779 ovnkube.go:599] Stopped ovnkube\\\\nI0318 12:12:06.161638 6779 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 12:12:06.161667 6779 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 12:12:06.161673 6779 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 12:12:06.161680 6779 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 12:12:06.161687 6779 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 
12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:06Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.722638 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:06Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.749881 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.749919 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.749931 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.749946 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.749957 4975 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:06Z","lastTransitionTime":"2026-03-18T12:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.756196 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:06Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.795579 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:06Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.835814 4975 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:06Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.852268 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.852331 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.852341 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.852357 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.852367 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:06Z","lastTransitionTime":"2026-03-18T12:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.874400 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:06Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.956168 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.956206 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.956221 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.956239 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:06 crc kubenswrapper[4975]: I0318 12:12:06.956250 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:06Z","lastTransitionTime":"2026-03-18T12:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.059718 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.059792 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.059808 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.059832 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.059849 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:07Z","lastTransitionTime":"2026-03-18T12:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.163309 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.163354 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.163366 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.163384 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.163396 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:07Z","lastTransitionTime":"2026-03-18T12:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.265783 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.265826 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.265837 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.265852 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.265890 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:07Z","lastTransitionTime":"2026-03-18T12:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.368818 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.368923 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.368938 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.368959 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.368973 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:07Z","lastTransitionTime":"2026-03-18T12:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.472501 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.472547 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.472556 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.472572 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.472581 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:07Z","lastTransitionTime":"2026-03-18T12:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.542901 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8v6h_b0d0be67-e739-4dd7-abe4-3986a330a037/ovnkube-controller/0.log" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.546166 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerStarted","Data":"dd5c31f80f8c0479402324657c407465b315408f7c4949887d56d8629e961720"} Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.546609 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.558270 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.571668 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.575052 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.575082 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.575091 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.575104 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.575116 4975 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:07Z","lastTransitionTime":"2026-03-18T12:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.585724 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.609219 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk"] Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.609697 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.611920 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5c31f80f8c0479402324657c407465b315408f7c4949887d56d8629e961720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16d9f5fc862f5e0708e8791049ad3f9df5c5eb709dbc15fc00f0f661bdbe1e91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:06Z\\\",\\\"message\\\":\\\" 6779 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:12:06.160630 6779 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:12:06.161456 6779 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 
12:12:06.161472 6779 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 12:12:06.161508 6779 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 12:12:06.161537 6779 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 12:12:06.161585 6779 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0318 12:12:06.161594 6779 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0318 12:12:06.161606 6779 factory.go:656] Stopping watch factory\\\\nI0318 12:12:06.161616 6779 ovnkube.go:599] Stopped ovnkube\\\\nI0318 12:12:06.161638 6779 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 12:12:06.161667 6779 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 12:12:06.161673 6779 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 12:12:06.161680 6779 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 12:12:06.161687 6779 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 
12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.612186 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.612385 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.630817 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.648020 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.663802 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.677615 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:07 crc 
kubenswrapper[4975]: I0318 12:12:07.677678 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.677690 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.677712 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.677731 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:07Z","lastTransitionTime":"2026-03-18T12:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.682318 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df8aae3de55433120292275b85026b54b486e56da8826b4ef9d5d6e01f7d3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b271
5592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.705042 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f04
9a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.728115 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2af4c18e-c13c-45f8-b7cb-9ccfaa845a93-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7tcxk\" (UID: \"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.728181 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2af4c18e-c13c-45f8-b7cb-9ccfaa845a93-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7tcxk\" (UID: \"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.728425 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlfm7\" (UniqueName: \"kubernetes.io/projected/2af4c18e-c13c-45f8-b7cb-9ccfaa845a93-kube-api-access-rlfm7\") pod \"ovnkube-control-plane-749d76644c-7tcxk\" (UID: \"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.728485 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2af4c18e-c13c-45f8-b7cb-9ccfaa845a93-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7tcxk\" (UID: \"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.729949 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4279db4a-0510-49ca-b9f2-bf035e5209e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e60b84ddfc2cd2530a7ff0385ea9e95dfadb45a06ba9d98bf350f123b283d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6770c00fe0f8e7a8a33f0907f2d5a44805722dae64edc5a03193fd785cb1bb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1660cc319374fc5a436bc3744321fc3053f5325f451ac74d27bd35a3f1e2e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.746580 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.756474 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808daf0a66fd9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.769109 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.781041 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.781086 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.781096 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.781114 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.781127 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:07Z","lastTransitionTime":"2026-03-18T12:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.787214 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.797469 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.806304 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.816260 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.829729 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.829987 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlfm7\" (UniqueName: 
\"kubernetes.io/projected/2af4c18e-c13c-45f8-b7cb-9ccfaa845a93-kube-api-access-rlfm7\") pod \"ovnkube-control-plane-749d76644c-7tcxk\" (UID: \"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.830039 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2af4c18e-c13c-45f8-b7cb-9ccfaa845a93-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7tcxk\" (UID: \"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.830068 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2af4c18e-c13c-45f8-b7cb-9ccfaa845a93-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7tcxk\" (UID: \"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.830088 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2af4c18e-c13c-45f8-b7cb-9ccfaa845a93-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7tcxk\" (UID: \"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.830884 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2af4c18e-c13c-45f8-b7cb-9ccfaa845a93-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7tcxk\" (UID: \"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" Mar 18 12:12:07 crc 
kubenswrapper[4975]: I0318 12:12:07.830944 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2af4c18e-c13c-45f8-b7cb-9ccfaa845a93-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7tcxk\" (UID: \"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.835801 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2af4c18e-c13c-45f8-b7cb-9ccfaa845a93-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7tcxk\" (UID: \"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.845092 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.849262 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlfm7\" (UniqueName: \"kubernetes.io/projected/2af4c18e-c13c-45f8-b7cb-9ccfaa845a93-kube-api-access-rlfm7\") pod \"ovnkube-control-plane-749d76644c-7tcxk\" (UID: \"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.865381 4975 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5c31f80f8c0479402324657c407465b315408f7c4949887d56d8629e961720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16d9f5fc862f5e0708e8791049ad3f9df5c5eb709dbc15fc00f0f661bdbe1e91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:06Z\\\",\\\"message\\\":\\\" 6779 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:12:06.160630 6779 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:12:06.161456 6779 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 
12:12:06.161472 6779 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 12:12:06.161508 6779 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 12:12:06.161537 6779 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 12:12:06.161585 6779 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0318 12:12:06.161594 6779 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0318 12:12:06.161606 6779 factory.go:656] Stopping watch factory\\\\nI0318 12:12:06.161616 6779 ovnkube.go:599] Stopped ovnkube\\\\nI0318 12:12:06.161638 6779 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 12:12:06.161667 6779 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 12:12:06.161673 6779 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 12:12:06.161680 6779 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 12:12:06.161687 6779 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 
12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.879653 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.883664 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.883711 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:07 crc 
kubenswrapper[4975]: I0318 12:12:07.883722 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.883740 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.883751 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:07Z","lastTransitionTime":"2026-03-18T12:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.891765 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.904931 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.923724 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.925964 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.944495 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4279db4a-0510-49ca-b9f2-bf035e5209e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e60b84ddfc2cd2530a7ff0385ea9e95dfadb45a06ba9d98bf350f123b283d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6770c00fe0f8e7a8a33f0907f2d5a44805722dae64edc5a03193fd785cb1bb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1660cc319374fc5a436bc3744321fc3053f5325f451ac74d27bd35a3f1e2e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4975]: W0318 12:12:07.948597 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2af4c18e_c13c_45f8_b7cb_9ccfaa845a93.slice/crio-d55a80b098f33dd79e8c978e78414dd7f830c941c4b1f827982d3cedf5ffc7c9 WatchSource:0}: Error finding container d55a80b098f33dd79e8c978e78414dd7f830c941c4b1f827982d3cedf5ffc7c9: Status 404 returned error can't find the container with id d55a80b098f33dd79e8c978e78414dd7f830c941c4b1f827982d3cedf5ffc7c9 Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.976711 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:07Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.986461 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.986500 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.986508 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.986523 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:07 crc kubenswrapper[4975]: I0318 12:12:07.986535 4975 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:07Z","lastTransitionTime":"2026-03-18T12:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.016453 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.016522 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:08 crc kubenswrapper[4975]: E0318 12:12:08.016881 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:08 crc kubenswrapper[4975]: E0318 12:12:08.016995 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.017077 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:08 crc kubenswrapper[4975]: E0318 12:12:08.017221 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.017211 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df8aae3de55433120292275b85026b54b486e56da8826b4ef9d5d6e01f7d3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:08Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.055365 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d6
83a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:08Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.090478 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.090523 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.090535 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.090552 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.090564 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:08Z","lastTransitionTime":"2026-03-18T12:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.097402 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\
" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:08Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.144117 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountP
ath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:08Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.173202 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:08Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.192817 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.192890 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.192904 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.192922 4975 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.192934 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:08Z","lastTransitionTime":"2026-03-18T12:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.213268 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808daf0a66fd9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:08Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.253012 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tcxk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:08Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.296371 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.297155 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.297178 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.297205 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.297219 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:08Z","lastTransitionTime":"2026-03-18T12:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.329149 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-587nk"] Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.329612 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:08 crc kubenswrapper[4975]: E0318 12:12:08.329668 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.341806 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:08Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.362903 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:08Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.393366 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5c31f80f8c0479402324657c407465b315408f7c4949887d56d8629e961720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16d9f5fc862f5e0708e8791049ad3f9df5c5eb709dbc15fc00f0f661bdbe1e91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:06Z\\\",\\\"message\\\":\\\" 6779 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:12:06.160630 6779 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:12:06.161456 6779 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 
12:12:06.161472 6779 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 12:12:06.161508 6779 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 12:12:06.161537 6779 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 12:12:06.161585 6779 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0318 12:12:06.161594 6779 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0318 12:12:06.161606 6779 factory.go:656] Stopping watch factory\\\\nI0318 12:12:06.161616 6779 ovnkube.go:599] Stopped ovnkube\\\\nI0318 12:12:06.161638 6779 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 12:12:06.161667 6779 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 12:12:06.161673 6779 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 12:12:06.161680 6779 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 12:12:06.161687 6779 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 
12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:08Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.400496 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.400543 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.400553 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.400571 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.400582 4975 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:08Z","lastTransitionTime":"2026-03-18T12:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.414961 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:08Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.436466 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fbv27\" (UniqueName: \"kubernetes.io/projected/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-kube-api-access-fbv27\") pod \"network-metrics-daemon-587nk\" (UID: \"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\") " pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.436544 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs\") pod \"network-metrics-daemon-587nk\" (UID: \"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\") " pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.456752 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:08Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.495551 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:08Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.503618 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:08 crc 
kubenswrapper[4975]: I0318 12:12:08.503669 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.503680 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.503703 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.503718 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:08Z","lastTransitionTime":"2026-03-18T12:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.533727 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-587nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-587nk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:08Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.537338 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbv27\" (UniqueName: \"kubernetes.io/projected/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-kube-api-access-fbv27\") pod \"network-metrics-daemon-587nk\" (UID: \"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\") " pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.537428 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs\") pod \"network-metrics-daemon-587nk\" (UID: \"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\") " pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:08 crc kubenswrapper[4975]: E0318 12:12:08.537639 4975 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:12:08 crc kubenswrapper[4975]: E0318 12:12:08.537735 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs podName:a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:09.037712094 +0000 UTC m=+114.752112673 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs") pod "network-metrics-daemon-587nk" (UID: "a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.551637 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" event={"ID":"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93","Type":"ContainerStarted","Data":"8b58da26549359a89879871868c713dc68e7a40393d43324424dfa14ba1755ac"} Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.551703 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" event={"ID":"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93","Type":"ContainerStarted","Data":"e8a5c5b885305b00496b54f4c8176777b42f17b6773f8b16c2bf6fc923b29cce"} Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.551722 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" event={"ID":"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93","Type":"ContainerStarted","Data":"d55a80b098f33dd79e8c978e78414dd7f830c941c4b1f827982d3cedf5ffc7c9"} Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.553320 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8v6h_b0d0be67-e739-4dd7-abe4-3986a330a037/ovnkube-controller/1.log" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.553968 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8v6h_b0d0be67-e739-4dd7-abe4-3986a330a037/ovnkube-controller/0.log" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.556500 4975 generic.go:334] "Generic (PLEG): container finished" podID="b0d0be67-e739-4dd7-abe4-3986a330a037" 
containerID="dd5c31f80f8c0479402324657c407465b315408f7c4949887d56d8629e961720" exitCode=1 Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.556545 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerDied","Data":"dd5c31f80f8c0479402324657c407465b315408f7c4949887d56d8629e961720"} Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.556595 4975 scope.go:117] "RemoveContainer" containerID="16d9f5fc862f5e0708e8791049ad3f9df5c5eb709dbc15fc00f0f661bdbe1e91" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.557188 4975 scope.go:117] "RemoveContainer" containerID="dd5c31f80f8c0479402324657c407465b315408f7c4949887d56d8629e961720" Mar 18 12:12:08 crc kubenswrapper[4975]: E0318 12:12:08.557410 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-k8v6h_openshift-ovn-kubernetes(b0d0be67-e739-4dd7-abe4-3986a330a037)\"" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.585093 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbv27\" (UniqueName: \"kubernetes.io/projected/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-kube-api-access-fbv27\") pod \"network-metrics-daemon-587nk\" (UID: \"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\") " pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.596411 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:08Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.606360 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.606412 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.606420 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.606436 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.606446 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:08Z","lastTransitionTime":"2026-03-18T12:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.634893 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4279db4a-0510-49ca-b9f2-bf035e5209e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e60b84ddfc2cd2530a7ff0385ea9e95dfadb45a06ba9d98bf350f123b283d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\
\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6770c00fe0f8e7a8a33f0907f2d5a44805722dae64edc5a03193fd785cb1bb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1660cc319374fc5a436bc3744321fc3053f5325f451ac74d27bd35a3f1e2e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8
a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:08Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.677393 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:08Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.709616 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.709661 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.709695 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.709710 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.709722 4975 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:08Z","lastTransitionTime":"2026-03-18T12:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.717740 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df8aae3de55433120292275b85026b54b486e56da8826b4ef9d5d6e01f7d3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:08Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.762502 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:08Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.796664 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:08Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.812220 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.812265 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.812278 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.812294 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.812305 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:08Z","lastTransitionTime":"2026-03-18T12:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.836386 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:08Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.873024 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:08Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.912966 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808daf0a66fd9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:08Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.914341 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.914387 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.914397 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.914412 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.914422 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:08Z","lastTransitionTime":"2026-03-18T12:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.956117 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tcxk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:08Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:08 crc kubenswrapper[4975]: I0318 12:12:08.997169 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:08Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.018932 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.018996 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.019009 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.019030 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.019046 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:09Z","lastTransitionTime":"2026-03-18T12:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.039962 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.043203 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs\") pod \"network-metrics-daemon-587nk\" (UID: \"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\") " pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:09 crc kubenswrapper[4975]: E0318 12:12:09.043340 4975 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:12:09 crc kubenswrapper[4975]: E0318 12:12:09.043402 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs podName:a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:10.043386882 +0000 UTC m=+115.757787461 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs") pod "network-metrics-daemon-587nk" (UID: "a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.074584 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4279db4a-0510-49ca-b9f2-bf035e5209e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e60b84ddfc2cd2530a7ff0385ea9e95dfadb45a06ba9d98bf350f123b283d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6770c00fe0f8e7a8a33f0907f2d5a44805722dae64edc5a03193fd785cb1bb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1660cc319374fc5a436bc3744321fc3053f5325f451ac74d27bd35a3f1e2e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b
6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.114745 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.121635 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.121679 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.121687 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.121701 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.121710 4975 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:09Z","lastTransitionTime":"2026-03-18T12:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.154166 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df8aae3de55433120292275b85026b54b486e56da8826b4ef9d5d6e01f7d3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.193262 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8a5c5b885305b00496b54f4c8176777b42f17b6773f8b16c2bf6fc923b29cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b58da26549359a89879871868c713dc68e7a
40393d43324424dfa14ba1755ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tcxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.224389 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.224634 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.224647 4975 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.224665 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.224678 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:09Z","lastTransitionTime":"2026-03-18T12:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.234492 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.274037 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.313979 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.327292 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.327330 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.327342 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.327358 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.327369 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:09Z","lastTransitionTime":"2026-03-18T12:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.353554 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.393784 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808daf0a66fd9c\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.430549 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.430603 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.430613 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 
12:12:09.430638 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.430652 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:09Z","lastTransitionTime":"2026-03-18T12:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.435607 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.476108 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.514919 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.533400 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.533432 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.533440 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:09 crc 
kubenswrapper[4975]: I0318 12:12:09.533454 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.533462 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:09Z","lastTransitionTime":"2026-03-18T12:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.561195 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8v6h_b0d0be67-e739-4dd7-abe4-3986a330a037/ovnkube-controller/1.log" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.562432 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5c31f80f8c0479402324657c407465b315408f7c4949887d56d8629e961720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16d9f5fc862f5e0708e8791049ad3f9df5c5eb709dbc15fc00f0f661bdbe1e91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:06Z\\\",\\\"message\\\":\\\" 6779 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:12:06.160630 6779 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:12:06.161456 6779 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0318 12:12:06.161472 6779 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0318 12:12:06.161508 6779 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0318 12:12:06.161537 6779 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0318 12:12:06.161585 6779 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0318 12:12:06.161594 6779 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0318 12:12:06.161606 6779 factory.go:656] Stopping watch factory\\\\nI0318 12:12:06.161616 6779 ovnkube.go:599] Stopped ovnkube\\\\nI0318 12:12:06.161638 6779 handler.go:208] Removed *v1.Node event handler 2\\\\nI0318 12:12:06.161667 6779 handler.go:208] Removed *v1.Node event handler 7\\\\nI0318 12:12:06.161673 6779 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0318 12:12:06.161680 6779 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0318 12:12:06.161687 6779 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0318 12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5c31f80f8c0479402324657c407465b315408f7c4949887d56d8629e961720\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"message\\\":\\\"Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 12:12:07.932656 6925 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0318 12:12:07.932323 6925 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error 
occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e19
21cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.564695 4975 scope.go:117] "RemoveContainer" containerID="dd5c31f80f8c0479402324657c407465b315408f7c4949887d56d8629e961720" Mar 18 12:12:09 crc kubenswrapper[4975]: E0318 12:12:09.565123 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-k8v6h_openshift-ovn-kubernetes(b0d0be67-e739-4dd7-abe4-3986a330a037)\"" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.594604 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.634748 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.635723 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.635762 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.635772 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:09 crc 
kubenswrapper[4975]: I0318 12:12:09.635785 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.635796 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:09Z","lastTransitionTime":"2026-03-18T12:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.676159 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.713308 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-587nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-587nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc 
kubenswrapper[4975]: I0318 12:12:09.738400 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.738453 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.738473 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.738492 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.738504 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:09Z","lastTransitionTime":"2026-03-18T12:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.754895 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.795023 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.834673 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.841202 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.841251 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.841263 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.841281 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.841292 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:09Z","lastTransitionTime":"2026-03-18T12:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.873953 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.912324 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808daf0a66fd9c\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.943747 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.943794 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.943805 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 
12:12:09.943822 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.943834 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:09Z","lastTransitionTime":"2026-03-18T12:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.954347 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8a5c5b885305b00496b54f4c8176777b42f17b6773f8b16c2bf6fc923b29cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f
4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b58da26549359a89879871868c713dc68e7a40393d43324424dfa14ba1755ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:07Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tcxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:09 crc kubenswrapper[4975]: I0318 12:12:09.993626 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:09Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.016415 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.016439 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.016515 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.016415 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:10 crc kubenswrapper[4975]: E0318 12:12:10.016542 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:10 crc kubenswrapper[4975]: E0318 12:12:10.016609 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:10 crc kubenswrapper[4975]: E0318 12:12:10.016689 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:12:10 crc kubenswrapper[4975]: E0318 12:12:10.016752 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.034995 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.045435 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.045471 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.045484 4975 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.045501 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.045513 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:10Z","lastTransitionTime":"2026-03-18T12:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.053220 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs\") pod \"network-metrics-daemon-587nk\" (UID: \"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\") " pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:10 crc kubenswrapper[4975]: E0318 12:12:10.053343 4975 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:12:10 crc kubenswrapper[4975]: E0318 12:12:10.053439 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs podName:a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:12.053422847 +0000 UTC m=+117.767823426 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs") pod "network-metrics-daemon-587nk" (UID: "a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.075574 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc 
kubenswrapper[4975]: I0318 12:12:10.120566 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5c31f80f8c0479402324657c407465b315408f7c4949887d56d8629e961720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5c31f80f8c0479402324657c407465b315408f7c4949887d56d8629e961720\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"message\\\":\\\"Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 12:12:07.932656 6925 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0318 12:12:07.932323 6925 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-k8v6h_openshift-ovn-kubernetes(b0d0be67-e739-4dd7-abe4-3986a330a037)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f
03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.147285 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.147326 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.147337 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.147352 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.147363 4975 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:10Z","lastTransitionTime":"2026-03-18T12:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.155221 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.194027 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.233911 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.249093 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:10 crc 
kubenswrapper[4975]: I0318 12:12:10.249139 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.249150 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.249168 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.249180 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:10Z","lastTransitionTime":"2026-03-18T12:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.273569 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-587nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-587nk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.340567 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277a
c753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.351832 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.351890 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.351899 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.351914 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.351926 4975 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:10Z","lastTransitionTime":"2026-03-18T12:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.365659 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4279db4a-0510-49ca-b9f2-bf035e5209e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e60b84ddfc2cd2530a7ff0385ea9e95dfadb45a06ba9d98bf350f123b283d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6770c00fe0f8e7a8a33f0907f2d5a44805722dae64edc5a03193fd785cb1bb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1660cc319374fc5a436bc3744321fc3053f5325f451ac74d27bd35a3f1e2e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.396264 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.438156 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df8aae3de55433120292275b85026b54b486e56da8826b4ef9d5d6e01f7d3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b271
5592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:10Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.454264 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.454304 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.454337 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.454354 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.454380 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:10Z","lastTransitionTime":"2026-03-18T12:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.556795 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.556854 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.556882 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.556897 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.556905 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:10Z","lastTransitionTime":"2026-03-18T12:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.659689 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.659735 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.659746 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.659763 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.659773 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:10Z","lastTransitionTime":"2026-03-18T12:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.762526 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.762561 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.762572 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.762588 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.762599 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:10Z","lastTransitionTime":"2026-03-18T12:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.864777 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.864816 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.864825 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.864838 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.864847 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:10Z","lastTransitionTime":"2026-03-18T12:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.968046 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.968120 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.968132 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.968150 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:10 crc kubenswrapper[4975]: I0318 12:12:10.968166 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:10Z","lastTransitionTime":"2026-03-18T12:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.016546 4975 scope.go:117] "RemoveContainer" containerID="44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.070530 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.070566 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.070579 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.070594 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.070605 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:11Z","lastTransitionTime":"2026-03-18T12:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.172694 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.172744 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.172757 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.172776 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.172789 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:11Z","lastTransitionTime":"2026-03-18T12:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.278425 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.278510 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.278521 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.278537 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.278549 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:11Z","lastTransitionTime":"2026-03-18T12:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.380187 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.380264 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.380289 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.380317 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.380338 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:11Z","lastTransitionTime":"2026-03-18T12:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.482568 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.482611 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.482621 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.482636 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.482647 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:11Z","lastTransitionTime":"2026-03-18T12:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.576600 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.578394 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e"} Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.579252 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.584792 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.584831 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.584839 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.584852 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.584877 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:11Z","lastTransitionTime":"2026-03-18T12:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.590081 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:11Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.600117 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808daf0a66fd9c\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:11Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.611154 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8a5c5b885305b00496b54f4c8176777b42f17b6773f8b16c2bf6fc923b29cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b58da26549359a89879871868c713dc68e7a
40393d43324424dfa14ba1755ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tcxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:11Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.625462 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:11Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.637982 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:11Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.648906 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:11Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.667804 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5c31f80f8c0479402324657c407465b315408f7c4949887d56d8629e961720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5c31f80f8c0479402324657c407465b315408f7c4949887d56d8629e961720\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"message\\\":\\\"Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 12:12:07.932656 6925 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0318 12:12:07.932323 6925 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-k8v6h_openshift-ovn-kubernetes(b0d0be67-e739-4dd7-abe4-3986a330a037)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f
03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:11Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.677600 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:11Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.687484 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.687522 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.687532 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.687547 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.687557 4975 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:11Z","lastTransitionTime":"2026-03-18T12:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.691343 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:11Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.701722 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:11Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.710972 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-587nk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-587nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:11Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:11 crc 
kubenswrapper[4975]: I0318 12:12:11.721435 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:11Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.733685 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:11Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.747963 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:11Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.760494 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:11Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.775417 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df8aae3de55433120292275b85026b54b486e56da8826b4ef9d5d6e01f7d3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b271
5592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:11Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.789839 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.789911 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.789924 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.789940 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.789950 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:11Z","lastTransitionTime":"2026-03-18T12:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.794373 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:11Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.807600 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4279db4a-0510-49ca-b9f2-bf035e5209e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e60b84ddfc2cd2530a7ff0385ea9e95dfadb45a06ba9d98bf350f123b283d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6770c00fe0f8e7a8a33f0907f2d5a44805722dae64edc5a03193fd785cb1bb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1660cc319374fc5a436bc3744321fc3053f5325f451ac74d27bd35a3f1e2e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:11Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.892628 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.892671 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.892682 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.892697 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.892708 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:11Z","lastTransitionTime":"2026-03-18T12:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.994989 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.995045 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.995056 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.995072 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:11 crc kubenswrapper[4975]: I0318 12:12:11.995083 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:11Z","lastTransitionTime":"2026-03-18T12:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.016351 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.016394 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.016407 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:12 crc kubenswrapper[4975]: E0318 12:12:12.016500 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.016587 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:12 crc kubenswrapper[4975]: E0318 12:12:12.016712 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:12:12 crc kubenswrapper[4975]: E0318 12:12:12.016739 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:12 crc kubenswrapper[4975]: E0318 12:12:12.016803 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.078225 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs\") pod \"network-metrics-daemon-587nk\" (UID: \"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\") " pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:12 crc kubenswrapper[4975]: E0318 12:12:12.078396 4975 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:12:12 crc kubenswrapper[4975]: E0318 12:12:12.078503 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs podName:a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:16.078479353 +0000 UTC m=+121.792880012 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs") pod "network-metrics-daemon-587nk" (UID: "a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.097736 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.097785 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.097799 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.097820 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.097836 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:12Z","lastTransitionTime":"2026-03-18T12:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.200901 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.200984 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.201009 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.201040 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.201062 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:12Z","lastTransitionTime":"2026-03-18T12:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.303583 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.303631 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.303643 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.303660 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.303670 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:12Z","lastTransitionTime":"2026-03-18T12:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.405398 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.405441 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.405451 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.405466 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.405476 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:12Z","lastTransitionTime":"2026-03-18T12:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.508189 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.508229 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.508240 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.508254 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.508265 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:12Z","lastTransitionTime":"2026-03-18T12:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.610359 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.610417 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.610432 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.610449 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.610465 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:12Z","lastTransitionTime":"2026-03-18T12:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.712241 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.712282 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.712294 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.712307 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.712316 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:12Z","lastTransitionTime":"2026-03-18T12:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.814596 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.814664 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.814682 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.814699 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.814712 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:12Z","lastTransitionTime":"2026-03-18T12:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.917152 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.917204 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.917215 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.917233 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:12 crc kubenswrapper[4975]: I0318 12:12:12.917245 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:12Z","lastTransitionTime":"2026-03-18T12:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.019657 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.019686 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.019695 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.019714 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.019726 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:13Z","lastTransitionTime":"2026-03-18T12:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.122142 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.122196 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.122208 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.122223 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.122232 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:13Z","lastTransitionTime":"2026-03-18T12:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.225021 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.225079 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.225087 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.225106 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.225115 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:13Z","lastTransitionTime":"2026-03-18T12:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.327954 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.327990 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.328000 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.328016 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.328026 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:13Z","lastTransitionTime":"2026-03-18T12:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.430651 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.431041 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.431055 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.431072 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.431086 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:13Z","lastTransitionTime":"2026-03-18T12:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.533367 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.533410 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.533422 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.533436 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.533450 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:13Z","lastTransitionTime":"2026-03-18T12:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.635372 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.635419 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.635431 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.635446 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.635458 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:13Z","lastTransitionTime":"2026-03-18T12:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.737298 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.737347 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.737358 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.737374 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.737386 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:13Z","lastTransitionTime":"2026-03-18T12:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.839975 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.840015 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.840026 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.840042 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.840054 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:13Z","lastTransitionTime":"2026-03-18T12:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.942246 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.942280 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.942288 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.942301 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:13 crc kubenswrapper[4975]: I0318 12:12:13.942310 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:13Z","lastTransitionTime":"2026-03-18T12:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.016166 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:14 crc kubenswrapper[4975]: E0318 12:12:14.016322 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.016590 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:14 crc kubenswrapper[4975]: E0318 12:12:14.016669 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.016750 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.016748 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:14 crc kubenswrapper[4975]: E0318 12:12:14.016892 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:12:14 crc kubenswrapper[4975]: E0318 12:12:14.016990 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.044343 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.044393 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.044403 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.044416 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.044427 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:14Z","lastTransitionTime":"2026-03-18T12:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.146584 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.146621 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.146634 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.146655 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.146671 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:14Z","lastTransitionTime":"2026-03-18T12:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.250591 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.250653 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.250666 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.250684 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.250696 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:14Z","lastTransitionTime":"2026-03-18T12:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.353252 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.353304 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.353321 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.353343 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.353359 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:14Z","lastTransitionTime":"2026-03-18T12:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.455854 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.455972 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.455993 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.456018 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.456034 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:14Z","lastTransitionTime":"2026-03-18T12:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.558970 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.559027 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.559060 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.559076 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.559087 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:14Z","lastTransitionTime":"2026-03-18T12:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.661260 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.661305 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.661316 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.661331 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.661342 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:14Z","lastTransitionTime":"2026-03-18T12:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.764192 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.764236 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.764246 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.764260 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.764270 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:14Z","lastTransitionTime":"2026-03-18T12:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.866118 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.866158 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.866166 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.866179 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:14 crc kubenswrapper[4975]: I0318 12:12:14.866188 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:14Z","lastTransitionTime":"2026-03-18T12:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:14 crc kubenswrapper[4975]: E0318 12:12:14.966435 4975 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.027536 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4279db4a-0510-49ca-b9f2-bf035e5209e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e60b84ddfc2cd2530a7ff0385ea9e95dfadb45a06ba9d98bf350f123b283d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPa
th\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6770c00fe0f8e7a8a33f0907f2d5a44805722dae64edc5a03193fd785cb1bb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1660cc319374fc5a436bc3744321fc3053f5325f451ac74d27bd35a3f1e2e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:15Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.038559 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:15Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.053222 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df8aae3de55433120292275b85026b54b486e56da8826b4ef9d5d6e01f7d3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b271
5592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:15Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.072281 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f04
9a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:15Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.085625 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:12:15Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.096417 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:15Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.108315 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808d
af0a66fd9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:15Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:15 crc kubenswrapper[4975]: E0318 12:12:15.120323 4975 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.120928 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8a5c5b885305b00496b54f4c8176777b42f17b6773f8b16c2bf6fc923b29cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b58da26549359a89879871868c713dc68e7a40393d43324424dfa14ba1755ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tcxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:15Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.135488 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:15Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.149683 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:15Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.162344 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:15Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.188805 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd5c31f80f8c0479402324657c407465b315408f7c4949887d56d8629e961720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5c31f80f8c0479402324657c407465b315408f7c4949887d56d8629e961720\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"message\\\":\\\"Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 12:12:07.932656 6925 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0318 12:12:07.932323 6925 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-k8v6h_openshift-ovn-kubernetes(b0d0be67-e739-4dd7-abe4-3986a330a037)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f
03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:15Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.200498 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:15Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.213701 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:15Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.227941 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:15Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.238783 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-587nk" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-587nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:15Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:15 crc 
kubenswrapper[4975]: I0318 12:12:15.254719 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:15Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.267387 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:15Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.924039 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.924084 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.924098 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:15 crc 
kubenswrapper[4975]: I0318 12:12:15.924115 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.924128 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:15Z","lastTransitionTime":"2026-03-18T12:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:15 crc kubenswrapper[4975]: E0318 12:12:15.937091 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"message\\\":\\\"kubelet has 
sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff54
7002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cd
fc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"nam
es\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:15Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.940554 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.940587 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.940599 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.940613 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.940623 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:15Z","lastTransitionTime":"2026-03-18T12:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:15 crc kubenswrapper[4975]: E0318 12:12:15.953446 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:15Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.957592 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.957655 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.957667 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.957686 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.957698 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:15Z","lastTransitionTime":"2026-03-18T12:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:15 crc kubenswrapper[4975]: E0318 12:12:15.968805 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:15Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.972300 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.972334 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.972342 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.972355 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.972366 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:15Z","lastTransitionTime":"2026-03-18T12:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:15 crc kubenswrapper[4975]: E0318 12:12:15.984171 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:15Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.987892 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.987948 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.987959 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.987974 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:15 crc kubenswrapper[4975]: I0318 12:12:15.987984 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:15Z","lastTransitionTime":"2026-03-18T12:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:16 crc kubenswrapper[4975]: E0318 12:12:16.005135 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:16Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:16 crc kubenswrapper[4975]: E0318 12:12:16.005282 4975 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:12:16 crc kubenswrapper[4975]: I0318 12:12:16.016631 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:16 crc kubenswrapper[4975]: I0318 12:12:16.016663 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:16 crc kubenswrapper[4975]: I0318 12:12:16.016690 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:16 crc kubenswrapper[4975]: I0318 12:12:16.016699 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:16 crc kubenswrapper[4975]: E0318 12:12:16.016759 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:16 crc kubenswrapper[4975]: E0318 12:12:16.017028 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:16 crc kubenswrapper[4975]: E0318 12:12:16.017131 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:12:16 crc kubenswrapper[4975]: E0318 12:12:16.017191 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:16 crc kubenswrapper[4975]: I0318 12:12:16.123232 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs\") pod \"network-metrics-daemon-587nk\" (UID: \"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\") " pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:16 crc kubenswrapper[4975]: E0318 12:12:16.123454 4975 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:12:16 crc kubenswrapper[4975]: E0318 12:12:16.123535 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs podName:a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:24.123512914 +0000 UTC m=+129.837913533 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs") pod "network-metrics-daemon-587nk" (UID: "a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:12:18 crc kubenswrapper[4975]: I0318 12:12:18.015906 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:18 crc kubenswrapper[4975]: I0318 12:12:18.015919 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:18 crc kubenswrapper[4975]: E0318 12:12:18.016049 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:18 crc kubenswrapper[4975]: I0318 12:12:18.016114 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:18 crc kubenswrapper[4975]: I0318 12:12:18.015919 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:18 crc kubenswrapper[4975]: E0318 12:12:18.016193 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:18 crc kubenswrapper[4975]: E0318 12:12:18.017266 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:18 crc kubenswrapper[4975]: E0318 12:12:18.017360 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:12:19 crc kubenswrapper[4975]: I0318 12:12:19.030302 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 18 12:12:20 crc kubenswrapper[4975]: I0318 12:12:20.015828 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:20 crc kubenswrapper[4975]: I0318 12:12:20.015834 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:20 crc kubenswrapper[4975]: I0318 12:12:20.015962 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:20 crc kubenswrapper[4975]: E0318 12:12:20.016071 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:20 crc kubenswrapper[4975]: I0318 12:12:20.016621 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:20 crc kubenswrapper[4975]: E0318 12:12:20.016715 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:20 crc kubenswrapper[4975]: E0318 12:12:20.016822 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:12:20 crc kubenswrapper[4975]: E0318 12:12:20.016965 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:20 crc kubenswrapper[4975]: I0318 12:12:20.017679 4975 scope.go:117] "RemoveContainer" containerID="dd5c31f80f8c0479402324657c407465b315408f7c4949887d56d8629e961720" Mar 18 12:12:20 crc kubenswrapper[4975]: E0318 12:12:20.121644 4975 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 12:12:20 crc kubenswrapper[4975]: I0318 12:12:20.607104 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8v6h_b0d0be67-e739-4dd7-abe4-3986a330a037/ovnkube-controller/1.log" Mar 18 12:12:20 crc kubenswrapper[4975]: I0318 12:12:20.609662 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerStarted","Data":"1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474"} Mar 18 12:12:20 crc kubenswrapper[4975]: I0318 12:12:20.610141 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:12:20 crc kubenswrapper[4975]: I0318 12:12:20.620997 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808daf0a66fd9c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:20Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:20 crc kubenswrapper[4975]: I0318 12:12:20.634527 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8a5c5b885305b00496b54f4c8176777b42f17b6773f8b16c2bf6fc923b29cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b58da26549359a89879871868c713dc68e7a
40393d43324424dfa14ba1755ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tcxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:20Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:20 crc kubenswrapper[4975]: I0318 12:12:20.657081 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:20Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:20 crc kubenswrapper[4975]: I0318 12:12:20.670699 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:20Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:20 crc kubenswrapper[4975]: I0318 12:12:20.683154 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:20Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:20 crc kubenswrapper[4975]: I0318 12:12:20.693103 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:20Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:20 crc kubenswrapper[4975]: I0318 12:12:20.703136 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:20Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:20 crc kubenswrapper[4975]: I0318 12:12:20.714660 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:20Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:20 crc kubenswrapper[4975]: I0318 12:12:20.727463 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:20Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:20 crc kubenswrapper[4975]: I0318 12:12:20.745776 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5c31f80f8c0479402324657c407465b315408f7c4949887d56d8629e961720\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"message\\\":\\\"Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 12:12:07.932656 6925 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0318 12:12:07.932323 6925 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error 
occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"n
ame\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:20Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:20 crc kubenswrapper[4975]: I0318 12:12:20.757803 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:20Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:20 crc kubenswrapper[4975]: I0318 12:12:20.771569 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:20Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:20 crc kubenswrapper[4975]: I0318 12:12:20.784444 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:20Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:20 crc kubenswrapper[4975]: I0318 12:12:20.796770 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-587nk" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-587nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:20Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:20 crc 
kubenswrapper[4975]: I0318 12:12:20.817163 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df8aae3de55433120292275b85026b54b486e56da8826b4ef9d5d6e01f7d3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e2
3256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:20Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:20 crc kubenswrapper[4975]: I0318 12:12:20.844348 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"rest
artCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/
log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"container
ID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a564
6fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:20Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:20 crc kubenswrapper[4975]: I0318 12:12:20.857169 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56efd9a4-4479-4621-b2d4-572ba3a4e7cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6baed74837a87ab6340d3036ba532bab8aabfe3c751e3af2a483308255eb1e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd74d0ae78901f57f45ddede99b5b55165d3d2a7103eef8371891521fd6b3c46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 12:10:17.099277 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 12:10:17.101321 1 observer_polling.go:159] Starting file observer\\\\nI0318 12:10:17.148684 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 12:10:17.153358 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0318 12:10:47.611677 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://798454b532a3067c649acc0c06fb5228660cfcfee74c2a47182434203623128a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a28602986a572f19d3fe21e7033ff58efe255822d5bfd421c86004edb5693d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fffbb26bf332dce194e9fe134fc0e9eccf21c6c833e26e4c7aaaed7d211ae0cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:20Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:20 crc kubenswrapper[4975]: I0318 12:12:20.869546 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4279db4a-0510-49ca-b9f2-bf035e5209e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e60b84ddfc2cd2530a7ff0385ea9e95dfadb45a06ba9d98bf350f123b283d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6770c00fe0f8e7a8a33f0907f2d5a44805722dae64edc5a03193fd785cb1bb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1660cc319374fc5a436bc3744321fc3053f5325f451ac74d27bd35a3f1e2e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:20Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:20 crc kubenswrapper[4975]: I0318 12:12:20.881319 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:20Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:21 crc kubenswrapper[4975]: I0318 12:12:21.614695 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8v6h_b0d0be67-e739-4dd7-abe4-3986a330a037/ovnkube-controller/2.log" Mar 18 12:12:21 crc kubenswrapper[4975]: I0318 12:12:21.615912 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8v6h_b0d0be67-e739-4dd7-abe4-3986a330a037/ovnkube-controller/1.log" Mar 18 12:12:21 crc kubenswrapper[4975]: I0318 12:12:21.618568 4975 generic.go:334] "Generic (PLEG): container finished" podID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerID="1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474" exitCode=1 Mar 18 12:12:21 crc kubenswrapper[4975]: I0318 12:12:21.618656 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" 
event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerDied","Data":"1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474"} Mar 18 12:12:21 crc kubenswrapper[4975]: I0318 12:12:21.618979 4975 scope.go:117] "RemoveContainer" containerID="dd5c31f80f8c0479402324657c407465b315408f7c4949887d56d8629e961720" Mar 18 12:12:21 crc kubenswrapper[4975]: I0318 12:12:21.619376 4975 scope.go:117] "RemoveContainer" containerID="1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474" Mar 18 12:12:21 crc kubenswrapper[4975]: E0318 12:12:21.619562 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-k8v6h_openshift-ovn-kubernetes(b0d0be67-e739-4dd7-abe4-3986a330a037)\"" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" Mar 18 12:12:21 crc kubenswrapper[4975]: I0318 12:12:21.632490 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:21Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:21 crc kubenswrapper[4975]: I0318 12:12:21.644743 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:21Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:21 crc kubenswrapper[4975]: I0318 12:12:21.654684 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-587nk" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-587nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:21Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:21 crc 
kubenswrapper[4975]: I0318 12:12:21.668671 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:21Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:21 crc kubenswrapper[4975]: I0318 12:12:21.681541 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56efd9a4-4479-4621-b2d4-572ba3a4e7cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6baed74837a87ab6340d3036ba532bab8aabfe3c751e3af2a483308255eb1e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd74d0ae78901f57f45ddede99b5b55165d3d2a7103eef8371891521fd6b3c46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 12:10:17.099277 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 12:10:17.101321 1 observer_polling.go:159] Starting file observer\\\\nI0318 12:10:17.148684 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 12:10:17.153358 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0318 12:10:47.611677 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://798454b532a3067c649acc0c06fb5228660cfcfee74c2a47182434203623128a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a28602986a572f19d3fe21e7033ff58efe255822d5bfd421c86004edb5693d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fffbb26bf332dce194e9fe134fc0e9eccf21c6c833e26e4c7aaaed7d211ae0cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:21Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:21 crc kubenswrapper[4975]: I0318 12:12:21.692027 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4279db4a-0510-49ca-b9f2-bf035e5209e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e60b84ddfc2cd2530a7ff0385ea9e95dfadb45a06ba9d98bf350f123b283d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6770c00fe0f8e7a8a33f0907f2d5a44805722dae64edc5a03193fd785cb1bb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1660cc319374fc5a436bc3744321fc3053f5325f451ac74d27bd35a3f1e2e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:21Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:21 crc kubenswrapper[4975]: I0318 12:12:21.706224 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:21Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:21 crc kubenswrapper[4975]: I0318 12:12:21.720005 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df8aae3de55433120292275b85026b54b486e56da8826b4ef9d5d6e01f7d3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b271
5592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:21Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:21 crc kubenswrapper[4975]: I0318 12:12:21.746759 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f04
9a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:21Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:21 crc kubenswrapper[4975]: I0318 12:12:21.759972 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:21Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:21 crc kubenswrapper[4975]: I0318 12:12:21.771448 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:21Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:21 crc kubenswrapper[4975]: I0318 12:12:21.781919 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:21Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:21 crc kubenswrapper[4975]: I0318 12:12:21.792767 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808daf0a66fd9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:21Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:21 crc kubenswrapper[4975]: I0318 12:12:21.803035 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8a5c5b885305b00496b54f4c8176777b42f17b6773f8b16c2bf6fc923b29cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b58da26549359a89879871868c713dc68e7a40393d43324424dfa14ba1755ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tcxk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:21Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:21 crc kubenswrapper[4975]: I0318 12:12:21.816329 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:21Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:21 crc kubenswrapper[4975]: I0318 12:12:21.828805 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:21Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:21 crc kubenswrapper[4975]: I0318 12:12:21.843096 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:21Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:21 crc kubenswrapper[4975]: I0318 12:12:21.860403 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd5c31f80f8c0479402324657c407465b315408f7c4949887d56d8629e961720\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"message\\\":\\\"Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0318 12:12:07.932656 6925 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0318 12:12:07.932323 6925 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"message\\\":\\\"1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007508b07 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: 
openshift-apiserver-operator,},ClusterIP:10.217.4.38,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.38],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF0318 12:12:20.859074 7184 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal 
e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:21Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:21 crc kubenswrapper[4975]: I0318 12:12:21.871274 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:21Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:22 crc kubenswrapper[4975]: I0318 12:12:22.015623 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:22 crc kubenswrapper[4975]: I0318 12:12:22.015716 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:22 crc kubenswrapper[4975]: I0318 12:12:22.015770 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:22 crc kubenswrapper[4975]: E0318 12:12:22.015914 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:22 crc kubenswrapper[4975]: E0318 12:12:22.016009 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:22 crc kubenswrapper[4975]: E0318 12:12:22.016128 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:22 crc kubenswrapper[4975]: I0318 12:12:22.016292 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:22 crc kubenswrapper[4975]: E0318 12:12:22.016396 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:12:22 crc kubenswrapper[4975]: I0318 12:12:22.622838 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8v6h_b0d0be67-e739-4dd7-abe4-3986a330a037/ovnkube-controller/2.log" Mar 18 12:12:22 crc kubenswrapper[4975]: I0318 12:12:22.625957 4975 scope.go:117] "RemoveContainer" containerID="1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474" Mar 18 12:12:22 crc kubenswrapper[4975]: E0318 12:12:22.626142 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-k8v6h_openshift-ovn-kubernetes(b0d0be67-e739-4dd7-abe4-3986a330a037)\"" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" Mar 18 12:12:22 crc kubenswrapper[4975]: I0318 12:12:22.638119 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808daf0a66fd9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:22Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:22 crc kubenswrapper[4975]: I0318 12:12:22.660178 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8a5c5b885305b00496b54f4c8176777b42f17b6773f8b16c2bf6fc923b29cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b58da26549359a89879871868c713dc68e7a40393d43324424dfa14ba1755ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tcxk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:22Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:22 crc kubenswrapper[4975]: I0318 12:12:22.674161 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:22Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:22 crc kubenswrapper[4975]: I0318 12:12:22.685651 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:22Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:22 crc kubenswrapper[4975]: I0318 12:12:22.696385 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:12:22Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:22 crc kubenswrapper[4975]: I0318 12:12:22.705889 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:22Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:22 crc kubenswrapper[4975]: I0318 12:12:22.716048 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:22Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:22 crc kubenswrapper[4975]: I0318 12:12:22.728452 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:22Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:22 crc kubenswrapper[4975]: I0318 12:12:22.740228 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:22Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:22 crc kubenswrapper[4975]: I0318 12:12:22.757005 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"message\\\":\\\"1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007508b07 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: 
openshift-apiserver-operator,},ClusterIP:10.217.4.38,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.38],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF0318 12:12:20.859074 7184 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-k8v6h_openshift-ovn-kubernetes(b0d0be67-e739-4dd7-abe4-3986a330a037)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f
03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:22Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:22 crc kubenswrapper[4975]: I0318 12:12:22.767507 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:22Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:22 crc kubenswrapper[4975]: I0318 12:12:22.779164 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:22Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:22 crc kubenswrapper[4975]: I0318 12:12:22.791002 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:22Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:22 crc kubenswrapper[4975]: I0318 12:12:22.802643 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-587nk" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-587nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:22Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:22 crc 
kubenswrapper[4975]: I0318 12:12:22.817167 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df8aae3de55433120292275b85026b54b486e56da8826b4ef9d5d6e01f7d3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e2
3256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:22Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:22 crc kubenswrapper[4975]: I0318 12:12:22.839050 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"rest
artCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/
log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"container
ID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a564
6fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:22Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:22 crc kubenswrapper[4975]: I0318 12:12:22.854382 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56efd9a4-4479-4621-b2d4-572ba3a4e7cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6baed74837a87ab6340d3036ba532bab8aabfe3c751e3af2a483308255eb1e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd74d0ae78901f57f45ddede99b5b55165d3d2a7103eef8371891521fd6b3c46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 12:10:17.099277 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 12:10:17.101321 1 observer_polling.go:159] Starting file observer\\\\nI0318 12:10:17.148684 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 12:10:17.153358 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0318 12:10:47.611677 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://798454b532a3067c649acc0c06fb5228660cfcfee74c2a47182434203623128a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a28602986a572f19d3fe21e7033ff58efe255822d5bfd421c86004edb5693d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fffbb26bf332dce194e9fe134fc0e9eccf21c6c833e26e4c7aaaed7d211ae0cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:22Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:22 crc kubenswrapper[4975]: I0318 12:12:22.865486 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4279db4a-0510-49ca-b9f2-bf035e5209e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e60b84ddfc2cd2530a7ff0385ea9e95dfadb45a06ba9d98bf350f123b283d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6770c00fe0f8e7a8a33f0907f2d5a44805722dae64edc5a03193fd785cb1bb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1660cc319374fc5a436bc3744321fc3053f5325f451ac74d27bd35a3f1e2e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:22Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:22 crc kubenswrapper[4975]: I0318 12:12:22.877412 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:22Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:24 crc kubenswrapper[4975]: I0318 12:12:24.016382 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:24 crc kubenswrapper[4975]: I0318 12:12:24.016486 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:24 crc kubenswrapper[4975]: E0318 12:12:24.016607 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:24 crc kubenswrapper[4975]: I0318 12:12:24.016512 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:24 crc kubenswrapper[4975]: E0318 12:12:24.016652 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:24 crc kubenswrapper[4975]: I0318 12:12:24.016492 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:24 crc kubenswrapper[4975]: E0318 12:12:24.016741 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:24 crc kubenswrapper[4975]: E0318 12:12:24.016804 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:12:24 crc kubenswrapper[4975]: I0318 12:12:24.200029 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs\") pod \"network-metrics-daemon-587nk\" (UID: \"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\") " pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:24 crc kubenswrapper[4975]: E0318 12:12:24.200174 4975 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:12:24 crc kubenswrapper[4975]: E0318 12:12:24.200242 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs podName:a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b nodeName:}" failed. No retries permitted until 2026-03-18 12:12:40.200225986 +0000 UTC m=+145.914626565 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs") pod "network-metrics-daemon-587nk" (UID: "a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:12:25 crc kubenswrapper[4975]: I0318 12:12:25.028220 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:25 crc kubenswrapper[4975]: I0318 12:12:25.039685 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:25 crc kubenswrapper[4975]: I0318 12:12:25.051917 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:12:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:25 crc kubenswrapper[4975]: I0318 12:12:25.060487 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:25 crc kubenswrapper[4975]: I0318 12:12:25.069212 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808d
af0a66fd9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:25 crc kubenswrapper[4975]: I0318 12:12:25.079095 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8a5c5b885305b00496b54f4c8176777b42f17b6773f8b16c2bf6fc923b29cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b58da26549359a89879871868c713dc68e7a
40393d43324424dfa14ba1755ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tcxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:25 crc kubenswrapper[4975]: I0318 12:12:25.088522 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:25 crc kubenswrapper[4975]: I0318 12:12:25.098191 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:25 crc kubenswrapper[4975]: I0318 12:12:25.109486 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:25 crc kubenswrapper[4975]: E0318 12:12:25.124360 4975 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 12:12:25 crc kubenswrapper[4975]: I0318 12:12:25.132236 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"message\\\":\\\"1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007508b07 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: 
openshift-apiserver-operator,},ClusterIP:10.217.4.38,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.38],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF0318 12:12:20.859074 7184 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-k8v6h_openshift-ovn-kubernetes(b0d0be67-e739-4dd7-abe4-3986a330a037)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f
03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:25 crc kubenswrapper[4975]: I0318 12:12:25.146996 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:25 crc kubenswrapper[4975]: I0318 12:12:25.158993 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:25 crc kubenswrapper[4975]: I0318 12:12:25.170776 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:25 crc kubenswrapper[4975]: I0318 12:12:25.192529 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-587nk" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-587nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:25 crc 
kubenswrapper[4975]: I0318 12:12:25.223604 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:25 crc kubenswrapper[4975]: I0318 12:12:25.240683 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56efd9a4-4479-4621-b2d4-572ba3a4e7cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6baed74837a87ab6340d3036ba532bab8aabfe3c751e3af2a483308255eb1e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd74d0ae78901f57f45ddede99b5b55165d3d2a7103eef8371891521fd6b3c46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 12:10:17.099277 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 12:10:17.101321 1 observer_polling.go:159] Starting file observer\\\\nI0318 12:10:17.148684 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 12:10:17.153358 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0318 12:10:47.611677 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://798454b532a3067c649acc0c06fb5228660cfcfee74c2a47182434203623128a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a28602986a572f19d3fe21e7033ff58efe255822d5bfd421c86004edb5693d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fffbb26bf332dce194e9fe134fc0e9eccf21c6c833e26e4c7aaaed7d211ae0cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:25 crc kubenswrapper[4975]: I0318 12:12:25.251514 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4279db4a-0510-49ca-b9f2-bf035e5209e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e60b84ddfc2cd2530a7ff0385ea9e95dfadb45a06ba9d98bf350f123b283d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6770c00fe0f8e7a8a33f0907f2d5a44805722dae64edc5a03193fd785cb1bb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1660cc319374fc5a436bc3744321fc3053f5325f451ac74d27bd35a3f1e2e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:25 crc kubenswrapper[4975]: I0318 12:12:25.261607 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:25 crc kubenswrapper[4975]: I0318 12:12:25.273400 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df8aae3de55433120292275b85026b54b486e56da8826b4ef9d5d6e01f7d3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b271
5592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:25Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.015743 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.015824 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:26 crc kubenswrapper[4975]: E0318 12:12:26.015881 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.015905 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.015937 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:26 crc kubenswrapper[4975]: E0318 12:12:26.016011 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:26 crc kubenswrapper[4975]: E0318 12:12:26.015969 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:26 crc kubenswrapper[4975]: E0318 12:12:26.016114 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.278545 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.278586 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.278597 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.278612 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.278622 4975 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:26Z","lastTransitionTime":"2026-03-18T12:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:26 crc kubenswrapper[4975]: E0318 12:12:26.290724 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:26Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.295353 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.295390 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.295401 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.295417 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.295429 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:26Z","lastTransitionTime":"2026-03-18T12:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:26 crc kubenswrapper[4975]: E0318 12:12:26.306954 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:26Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.309992 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.310034 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.310044 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.310059 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.310068 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:26Z","lastTransitionTime":"2026-03-18T12:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:26 crc kubenswrapper[4975]: E0318 12:12:26.329675 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:26Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.334050 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.334113 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.334143 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.334166 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.334175 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:26Z","lastTransitionTime":"2026-03-18T12:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:26 crc kubenswrapper[4975]: E0318 12:12:26.351216 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:26Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.354548 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.354598 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.354612 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.354627 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:26 crc kubenswrapper[4975]: I0318 12:12:26.354639 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:26Z","lastTransitionTime":"2026-03-18T12:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:26 crc kubenswrapper[4975]: E0318 12:12:26.373292 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:26Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:26 crc kubenswrapper[4975]: E0318 12:12:26.373568 4975 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:12:28 crc kubenswrapper[4975]: I0318 12:12:28.015596 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:28 crc kubenswrapper[4975]: I0318 12:12:28.015612 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:28 crc kubenswrapper[4975]: I0318 12:12:28.015652 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:28 crc kubenswrapper[4975]: I0318 12:12:28.015709 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:28 crc kubenswrapper[4975]: E0318 12:12:28.016811 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:28 crc kubenswrapper[4975]: E0318 12:12:28.016691 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:12:28 crc kubenswrapper[4975]: E0318 12:12:28.016899 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:28 crc kubenswrapper[4975]: E0318 12:12:28.016576 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:30 crc kubenswrapper[4975]: I0318 12:12:30.015409 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:30 crc kubenswrapper[4975]: I0318 12:12:30.015440 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:30 crc kubenswrapper[4975]: E0318 12:12:30.015553 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:30 crc kubenswrapper[4975]: I0318 12:12:30.015577 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:30 crc kubenswrapper[4975]: E0318 12:12:30.015699 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:12:30 crc kubenswrapper[4975]: I0318 12:12:30.015716 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:30 crc kubenswrapper[4975]: E0318 12:12:30.015780 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:30 crc kubenswrapper[4975]: E0318 12:12:30.015886 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:30 crc kubenswrapper[4975]: I0318 12:12:30.115096 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:12:30 crc kubenswrapper[4975]: E0318 12:12:30.125465 4975 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 12:12:30 crc kubenswrapper[4975]: I0318 12:12:30.129163 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4975]: I0318 12:12:30.143559 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4975]: I0318 12:12:30.155431 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4975]: I0318 12:12:30.166922 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808daf0a66fd9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4975]: I0318 12:12:30.177714 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8a5c5b885305b00496b54f4c8176777b42f17b6773f8b16c2bf6fc923b29cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b58da26549359a89879871868c713dc68e7a40393d43324424dfa14ba1755ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tcxk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4975]: I0318 12:12:30.189638 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e\\\",\\\"image\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1
756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4975]: I0318 12:12:30.202005 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4975]: I0318 12:12:30.216000 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4975]: I0318 12:12:30.235108 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"message\\\":\\\"1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007508b07 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: 
openshift-apiserver-operator,},ClusterIP:10.217.4.38,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.38],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF0318 12:12:20.859074 7184 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-k8v6h_openshift-ovn-kubernetes(b0d0be67-e739-4dd7-abe4-3986a330a037)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f
03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4975]: I0318 12:12:30.245665 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4975]: I0318 12:12:30.258492 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4975]: I0318 12:12:30.271027 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4975]: I0318 12:12:30.283470 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-587nk" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-587nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc 
kubenswrapper[4975]: I0318 12:12:30.295442 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4975]: I0318 12:12:30.307219 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56efd9a4-4479-4621-b2d4-572ba3a4e7cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6baed74837a87ab6340d3036ba532bab8aabfe3c751e3af2a483308255eb1e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd74d0ae78901f57f45ddede99b5b55165d3d2a7103eef8371891521fd6b3c46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 12:10:17.099277 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 12:10:17.101321 1 observer_polling.go:159] Starting file observer\\\\nI0318 12:10:17.148684 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 12:10:17.153358 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0318 12:10:47.611677 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://798454b532a3067c649acc0c06fb5228660cfcfee74c2a47182434203623128a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a28602986a572f19d3fe21e7033ff58efe255822d5bfd421c86004edb5693d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fffbb26bf332dce194e9fe134fc0e9eccf21c6c833e26e4c7aaaed7d211ae0cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4975]: I0318 12:12:30.318141 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4279db4a-0510-49ca-b9f2-bf035e5209e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e60b84ddfc2cd2530a7ff0385ea9e95dfadb45a06ba9d98bf350f123b283d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6770c00fe0f8e7a8a33f0907f2d5a44805722dae64edc5a03193fd785cb1bb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1660cc319374fc5a436bc3744321fc3053f5325f451ac74d27bd35a3f1e2e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4975]: I0318 12:12:30.330779 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4975]: I0318 12:12:30.344463 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df8aae3de55433120292275b85026b54b486e56da8826b4ef9d5d6e01f7d3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b271
5592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:30 crc kubenswrapper[4975]: I0318 12:12:30.363449 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f04
9a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:30Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:32 crc kubenswrapper[4975]: I0318 12:12:32.015898 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:32 crc kubenswrapper[4975]: I0318 12:12:32.015985 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:32 crc kubenswrapper[4975]: I0318 12:12:32.015919 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:32 crc kubenswrapper[4975]: E0318 12:12:32.016129 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:32 crc kubenswrapper[4975]: I0318 12:12:32.016159 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:32 crc kubenswrapper[4975]: E0318 12:12:32.016264 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:12:32 crc kubenswrapper[4975]: E0318 12:12:32.016376 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:32 crc kubenswrapper[4975]: E0318 12:12:32.016540 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:33 crc kubenswrapper[4975]: I0318 12:12:33.892140 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:12:33 crc kubenswrapper[4975]: I0318 12:12:33.892248 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:33 crc kubenswrapper[4975]: I0318 12:12:33.892275 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:33 crc kubenswrapper[4975]: E0318 12:12:33.892460 4975 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:12:33 crc kubenswrapper[4975]: E0318 12:12:33.892459 4975 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:12:33 crc kubenswrapper[4975]: E0318 12:12:33.892468 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:37.89242221 +0000 UTC m=+203.606822789 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:12:33 crc kubenswrapper[4975]: E0318 12:12:33.892601 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:13:37.892568354 +0000 UTC m=+203.606969153 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:12:33 crc kubenswrapper[4975]: E0318 12:12:33.892620 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:13:37.892611645 +0000 UTC m=+203.607012464 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:12:33 crc kubenswrapper[4975]: I0318 12:12:33.993639 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:33 crc kubenswrapper[4975]: E0318 12:12:33.993744 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:12:33 crc kubenswrapper[4975]: E0318 12:12:33.993766 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:12:33 crc kubenswrapper[4975]: E0318 
12:12:33.993776 4975 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:12:33 crc kubenswrapper[4975]: I0318 12:12:33.993855 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:33 crc kubenswrapper[4975]: E0318 12:12:33.993992 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 12:13:37.993971514 +0000 UTC m=+203.708372173 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:12:33 crc kubenswrapper[4975]: E0318 12:12:33.994023 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:12:33 crc kubenswrapper[4975]: E0318 12:12:33.994045 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:12:33 crc kubenswrapper[4975]: E0318 12:12:33.994058 4975 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:12:33 crc kubenswrapper[4975]: E0318 12:12:33.994101 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 12:13:37.994093327 +0000 UTC m=+203.708493906 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:12:34 crc kubenswrapper[4975]: I0318 12:12:34.015731 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:34 crc kubenswrapper[4975]: I0318 12:12:34.015794 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:34 crc kubenswrapper[4975]: E0318 12:12:34.015884 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:34 crc kubenswrapper[4975]: I0318 12:12:34.015919 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:34 crc kubenswrapper[4975]: I0318 12:12:34.015755 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:34 crc kubenswrapper[4975]: E0318 12:12:34.016002 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:34 crc kubenswrapper[4975]: E0318 12:12:34.016268 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:34 crc kubenswrapper[4975]: E0318 12:12:34.016415 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:12:34 crc kubenswrapper[4975]: I0318 12:12:34.016594 4975 scope.go:117] "RemoveContainer" containerID="1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474" Mar 18 12:12:34 crc kubenswrapper[4975]: E0318 12:12:34.016766 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-k8v6h_openshift-ovn-kubernetes(b0d0be67-e739-4dd7-abe4-3986a330a037)\"" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" Mar 18 12:12:35 crc kubenswrapper[4975]: I0318 12:12:35.028030 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8a5c5b885305b00496b54f4c8176777b42f17b6773f8b16c2bf
6fc923b29cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b58da26549359a89879871868c713dc68e7a40393d43324424dfa14ba1755ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"p
odIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tcxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:35 crc kubenswrapper[4975]: I0318 12:12:35.043297 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}]
,\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:35 crc kubenswrapper[4975]: I0318 12:12:35.054993 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:35 crc kubenswrapper[4975]: I0318 12:12:35.064773 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:35 crc kubenswrapper[4975]: I0318 12:12:35.074193 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:35 crc kubenswrapper[4975]: I0318 12:12:35.084485 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808daf0a66fd9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:35 crc kubenswrapper[4975]: I0318 12:12:35.093345 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:35 crc 
kubenswrapper[4975]: I0318 12:12:35.106993 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:35 crc kubenswrapper[4975]: I0318 12:12:35.116576 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:35 crc kubenswrapper[4975]: E0318 12:12:35.126173 4975 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 12:12:35 crc kubenswrapper[4975]: I0318 12:12:35.137355 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"message\\\":\\\"1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007508b07 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: 
openshift-apiserver-operator,},ClusterIP:10.217.4.38,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.38],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF0318 12:12:20.859074 7184 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-k8v6h_openshift-ovn-kubernetes(b0d0be67-e739-4dd7-abe4-3986a330a037)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f
03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:35 crc kubenswrapper[4975]: I0318 12:12:35.151664 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:35 crc kubenswrapper[4975]: I0318 12:12:35.163749 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:35 crc kubenswrapper[4975]: I0318 12:12:35.176166 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:35 crc kubenswrapper[4975]: I0318 12:12:35.186809 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-587nk" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-587nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:35 crc 
kubenswrapper[4975]: I0318 12:12:35.208232 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:35 crc kubenswrapper[4975]: I0318 12:12:35.220989 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56efd9a4-4479-4621-b2d4-572ba3a4e7cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6baed74837a87ab6340d3036ba532bab8aabfe3c751e3af2a483308255eb1e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd74d0ae78901f57f45ddede99b5b55165d3d2a7103eef8371891521fd6b3c46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 12:10:17.099277 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 12:10:17.101321 1 observer_polling.go:159] Starting file observer\\\\nI0318 12:10:17.148684 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 12:10:17.153358 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0318 12:10:47.611677 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://798454b532a3067c649acc0c06fb5228660cfcfee74c2a47182434203623128a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a28602986a572f19d3fe21e7033ff58efe255822d5bfd421c86004edb5693d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fffbb26bf332dce194e9fe134fc0e9eccf21c6c833e26e4c7aaaed7d211ae0cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:35 crc kubenswrapper[4975]: I0318 12:12:35.232416 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4279db4a-0510-49ca-b9f2-bf035e5209e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e60b84ddfc2cd2530a7ff0385ea9e95dfadb45a06ba9d98bf350f123b283d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6770c00fe0f8e7a8a33f0907f2d5a44805722dae64edc5a03193fd785cb1bb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1660cc319374fc5a436bc3744321fc3053f5325f451ac74d27bd35a3f1e2e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:35 crc kubenswrapper[4975]: I0318 12:12:35.244353 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:35 crc kubenswrapper[4975]: I0318 12:12:35.257513 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df8aae3de55433120292275b85026b54b486e56da8826b4ef9d5d6e01f7d3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b271
5592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:35Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.015744 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.015816 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.015766 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.015766 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:36 crc kubenswrapper[4975]: E0318 12:12:36.015893 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:36 crc kubenswrapper[4975]: E0318 12:12:36.015933 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:36 crc kubenswrapper[4975]: E0318 12:12:36.015983 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:36 crc kubenswrapper[4975]: E0318 12:12:36.016093 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.677724 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.677792 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.677804 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.677827 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.677842 4975 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:36Z","lastTransitionTime":"2026-03-18T12:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:36 crc kubenswrapper[4975]: E0318 12:12:36.693465 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:36Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.697971 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.698020 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.698034 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.698051 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.698063 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:36Z","lastTransitionTime":"2026-03-18T12:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:36 crc kubenswrapper[4975]: E0318 12:12:36.715634 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:36Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.719790 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.719842 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.719850 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.719906 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.719920 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:36Z","lastTransitionTime":"2026-03-18T12:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:36 crc kubenswrapper[4975]: E0318 12:12:36.732479 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:36Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.736265 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.736363 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.736381 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.736401 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.736415 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:36Z","lastTransitionTime":"2026-03-18T12:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:36 crc kubenswrapper[4975]: E0318 12:12:36.783953 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:36Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.791244 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.791307 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.791320 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.791343 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:36 crc kubenswrapper[4975]: I0318 12:12:36.791356 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:36Z","lastTransitionTime":"2026-03-18T12:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:36 crc kubenswrapper[4975]: E0318 12:12:36.806829 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:36Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:36 crc kubenswrapper[4975]: E0318 12:12:36.807019 4975 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:12:38 crc kubenswrapper[4975]: I0318 12:12:38.016294 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:38 crc kubenswrapper[4975]: E0318 12:12:38.016460 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:38 crc kubenswrapper[4975]: I0318 12:12:38.016294 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:38 crc kubenswrapper[4975]: I0318 12:12:38.016337 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:38 crc kubenswrapper[4975]: I0318 12:12:38.016319 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:38 crc kubenswrapper[4975]: E0318 12:12:38.016635 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:38 crc kubenswrapper[4975]: E0318 12:12:38.016697 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:12:38 crc kubenswrapper[4975]: E0318 12:12:38.016544 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:40 crc kubenswrapper[4975]: I0318 12:12:40.015750 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:40 crc kubenswrapper[4975]: I0318 12:12:40.015750 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:40 crc kubenswrapper[4975]: I0318 12:12:40.015770 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:40 crc kubenswrapper[4975]: E0318 12:12:40.015923 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:12:40 crc kubenswrapper[4975]: I0318 12:12:40.016042 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:40 crc kubenswrapper[4975]: E0318 12:12:40.016231 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:40 crc kubenswrapper[4975]: E0318 12:12:40.016337 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:40 crc kubenswrapper[4975]: E0318 12:12:40.016434 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:40 crc kubenswrapper[4975]: E0318 12:12:40.127334 4975 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:12:40 crc kubenswrapper[4975]: I0318 12:12:40.248195 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs\") pod \"network-metrics-daemon-587nk\" (UID: \"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\") " pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:40 crc kubenswrapper[4975]: E0318 12:12:40.248416 4975 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:12:40 crc kubenswrapper[4975]: E0318 12:12:40.249005 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs podName:a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:12.24897038 +0000 UTC m=+177.963370999 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs") pod "network-metrics-daemon-587nk" (UID: "a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:12:42 crc kubenswrapper[4975]: I0318 12:12:42.015591 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:42 crc kubenswrapper[4975]: I0318 12:12:42.015632 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:42 crc kubenswrapper[4975]: I0318 12:12:42.015601 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:42 crc kubenswrapper[4975]: E0318 12:12:42.015739 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:42 crc kubenswrapper[4975]: I0318 12:12:42.015821 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:42 crc kubenswrapper[4975]: E0318 12:12:42.015830 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:12:42 crc kubenswrapper[4975]: E0318 12:12:42.016580 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:42 crc kubenswrapper[4975]: E0318 12:12:42.016786 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.016041 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.016088 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.016110 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:44 crc kubenswrapper[4975]: E0318 12:12:44.016337 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:12:44 crc kubenswrapper[4975]: E0318 12:12:44.016381 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:44 crc kubenswrapper[4975]: E0318 12:12:44.016190 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.016637 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:44 crc kubenswrapper[4975]: E0318 12:12:44.016707 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.694743 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n9j7f_add6c8de-77cd-42e7-bf06-d2333b9392ea/kube-multus/0.log" Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.694827 4975 generic.go:334] "Generic (PLEG): container finished" podID="add6c8de-77cd-42e7-bf06-d2333b9392ea" containerID="484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543" exitCode=1 Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.694899 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n9j7f" event={"ID":"add6c8de-77cd-42e7-bf06-d2333b9392ea","Type":"ContainerDied","Data":"484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543"} Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.695394 4975 scope.go:117] "RemoveContainer" containerID="484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543" Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.718993 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.736888 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56efd9a4-4479-4621-b2d4-572ba3a4e7cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6baed74837a87ab6340d3036ba532bab8aabfe3c751e3af2a483308255eb1e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd74d0ae78901f57f45ddede99b5b55165d3d2a7103eef8371891521fd6b3c46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 12:10:17.099277 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 12:10:17.101321 1 observer_polling.go:159] Starting file observer\\\\nI0318 12:10:17.148684 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 12:10:17.153358 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0318 12:10:47.611677 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://798454b532a3067c649acc0c06fb5228660cfcfee74c2a47182434203623128a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a
28602986a572f19d3fe21e7033ff58efe255822d5bfd421c86004edb5693d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fffbb26bf332dce194e9fe134fc0e9eccf21c6c833e26e4c7aaaed7d211ae0cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.750895 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4279db4a-0510-49ca-b9f2-bf035e5209e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e60b84ddfc2cd2530a7ff0385ea9e95dfadb45a06ba9d98bf350f123b283d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6770c00fe0f8e7a8a33f0907f2d5a44805722dae64edc5a03193fd785cb1bb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1660cc319374fc5a436bc3744321fc3053f5325f451ac74d27bd35a3f1e2e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.763332 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.776495 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df8aae3de55433120292275b85026b54b486e56da8826b4ef9d5d6e01f7d3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b271
5592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.790076 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.804191 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.818031 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.828029 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.842115 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808daf0a66fd9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.856920 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8a5c5b885305b00496b54f4c8176777b42f17b6773f8b16c2bf6fc923b29cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b58da26549359a89879871868c713dc68e7a40393d43324424dfa14ba1755ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tcxk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.866369 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.879777 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.891669 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.909860 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"message\\\":\\\"1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007508b07 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: 
openshift-apiserver-operator,},ClusterIP:10.217.4.38,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.38],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF0318 12:12:20.859074 7184 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-k8v6h_openshift-ovn-kubernetes(b0d0be67-e739-4dd7-abe4-3986a330a037)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f
03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.920985 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.930347 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.944717 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:43Z\\\",\\\"message\\\":\\\"2026-03-18T12:11:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_43a459fa-bd26-4653-b2fd-f44308c1a7fb\\\\n2026-03-18T12:11:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_43a459fa-bd26-4653-b2fd-f44308c1a7fb to /host/opt/cni/bin/\\\\n2026-03-18T12:11:58Z [verbose] multus-daemon started\\\\n2026-03-18T12:11:58Z [verbose] Readiness Indicator file check\\\\n2026-03-18T12:12:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:44 crc kubenswrapper[4975]: I0318 12:12:44.955122 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-587nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-587nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:44Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc 
kubenswrapper[4975]: I0318 12:12:45.028200 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808daf0a66fd9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r
mvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.039846 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8a5c5b885305b00496b54f4c8176777b42f17b6773f8b16c2bf6fc923b29cce\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b58da26549359a89879871868c713dc68e7a40393d43324424dfa14ba1755ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tcxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.052069 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}]
,\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.064397 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.083514 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.094709 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.106405 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.118789 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: E0318 12:12:45.128193 4975 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.130229 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/
kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.154948 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"message\\\":\\\"1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007508b07 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: 
openshift-apiserver-operator,},ClusterIP:10.217.4.38,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.38],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF0318 12:12:20.859074 7184 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-k8v6h_openshift-ovn-kubernetes(b0d0be67-e739-4dd7-abe4-3986a330a037)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f
03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.169299 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.180491 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.193877 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:43Z\\\",\\\"message\\\":\\\"2026-03-18T12:11:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_43a459fa-bd26-4653-b2fd-f44308c1a7fb\\\\n2026-03-18T12:11:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_43a459fa-bd26-4653-b2fd-f44308c1a7fb to /host/opt/cni/bin/\\\\n2026-03-18T12:11:58Z [verbose] multus-daemon started\\\\n2026-03-18T12:11:58Z [verbose] Readiness Indicator file check\\\\n2026-03-18T12:12:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.203939 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-587nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-587nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc 
kubenswrapper[4975]: I0318 12:12:45.218986 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df8aae3de55433120292275b85026b54b486e56da8826b4ef9d5d6e01f7d3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e2
3256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.241082 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"rest
artCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/
log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"container
ID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a564
6fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.253158 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56efd9a4-4479-4621-b2d4-572ba3a4e7cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6baed74837a87ab6340d3036ba532bab8aabfe3c751e3af2a483308255eb1e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd74d0ae78901f57f45ddede99b5b55165d3d2a7103eef8371891521fd6b3c46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 12:10:17.099277 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 12:10:17.101321 1 observer_polling.go:159] Starting file observer\\\\nI0318 12:10:17.148684 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 12:10:17.153358 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0318 12:10:47.611677 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://798454b532a3067c649acc0c06fb5228660cfcfee74c2a47182434203623128a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a28602986a572f19d3fe21e7033ff58efe255822d5bfd421c86004edb5693d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fffbb26bf332dce194e9fe134fc0e9eccf21c6c833e26e4c7aaaed7d211ae0cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.264582 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4279db4a-0510-49ca-b9f2-bf035e5209e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e60b84ddfc2cd2530a7ff0385ea9e95dfadb45a06ba9d98bf350f123b283d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6770c00fe0f8e7a8a33f0907f2d5a44805722dae64edc5a03193fd785cb1bb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1660cc319374fc5a436bc3744321fc3053f5325f451ac74d27bd35a3f1e2e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.279053 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.700387 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n9j7f_add6c8de-77cd-42e7-bf06-d2333b9392ea/kube-multus/0.log" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.700440 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n9j7f" event={"ID":"add6c8de-77cd-42e7-bf06-d2333b9392ea","Type":"ContainerStarted","Data":"d611e36e7ad2ddcf4a3f09210f1890afd7f8254f6b0f716ab6236e7c786d485d"} Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.711069 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-587nk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-587nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc 
kubenswrapper[4975]: I0318 12:12:45.722672 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.736202 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.749809 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d611e36e7ad2ddcf4a3f09210f1890afd7f8254f6b0f716ab6236e7c786d485d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:43Z\\\",\\\"message\\\":\\\"2026-03-18T12:11:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_43a459fa-bd26-4653-b2fd-f44308c1a7fb\\\\n2026-03-18T12:11:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_43a459fa-bd26-4653-b2fd-f44308c1a7fb to /host/opt/cni/bin/\\\\n2026-03-18T12:11:58Z [verbose] multus-daemon started\\\\n2026-03-18T12:11:58Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T12:12:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.763453 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.779695 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df8aae3de55433120292275b85026b54b486e56da8826b4ef9d5d6e01f7d3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b271
5592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.797877 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f04
9a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.810368 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56efd9a4-4479-4621-b2d4-572ba3a4e7cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6baed74837a87ab6340d3036ba532bab8aabfe3c751e3af2a483308255eb1e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd74d0ae78901f57f45ddede99b5b55165d3d2a7103eef8371891521fd6b3c46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 12:10:17.099277 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 12:10:17.101321 1 observer_polling.go:159] Starting file observer\\\\nI0318 12:10:17.148684 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 12:10:17.153358 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0318 12:10:47.611677 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://798454b532a3067c649acc0c06fb5228660cfcfee74c2a47182434203623128a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a28602986a572f19d3fe21e7033ff58efe255822d5bfd421c86004edb5693d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fffbb26bf332dce194e9fe134fc0e9eccf21c6c833e26e4c7aaaed7d211ae0cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.821450 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4279db4a-0510-49ca-b9f2-bf035e5209e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e60b84ddfc2cd2530a7ff0385ea9e95dfadb45a06ba9d98bf350f123b283d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6770c00fe0f8e7a8a33f0907f2d5a44805722dae64edc5a03193fd785cb1bb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1660cc319374fc5a436bc3744321fc3053f5325f451ac74d27bd35a3f1e2e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.829984 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.839856 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808daf0a66fd9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.850799 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8a5c5b885305b00496b54f4c8176777b42f17b6773f8b16c2bf6fc923b29cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b58da26549359a89879871868c713dc68e7a
40393d43324424dfa14ba1755ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tcxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.864096 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c7
2641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.877196 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.890069 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.910388 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"message\\\":\\\"1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007508b07 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: 
openshift-apiserver-operator,},ClusterIP:10.217.4.38,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.38],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF0318 12:12:20.859074 7184 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-k8v6h_openshift-ovn-kubernetes(b0d0be67-e739-4dd7-abe4-3986a330a037)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f
03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.921169 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.932704 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:45 crc kubenswrapper[4975]: I0318 12:12:45.942783 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:45Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.054474 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.054543 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:46 crc kubenswrapper[4975]: E0318 12:12:46.054605 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:12:46 crc kubenswrapper[4975]: E0318 12:12:46.054665 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.054735 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:46 crc kubenswrapper[4975]: E0318 12:12:46.054785 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.054831 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:46 crc kubenswrapper[4975]: E0318 12:12:46.054903 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.055628 4975 scope.go:117] "RemoveContainer" containerID="1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474" Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.705228 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8v6h_b0d0be67-e739-4dd7-abe4-3986a330a037/ovnkube-controller/2.log" Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.707668 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerStarted","Data":"d7d45fb4a14c0d0d6fa25693a5c2b13c06796bd1711cb01eb9e5617914f2f6a1"} Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.708647 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.729727 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.744233 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56efd9a4-4479-4621-b2d4-572ba3a4e7cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6baed74837a87ab6340d3036ba532bab8aabfe3c751e3af2a483308255eb1e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd74d0ae78901f57f45ddede99b5b55165d3d2a7103eef8371891521fd6b3c46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0318 12:10:17.099277 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0318 12:10:17.101321 1 observer_polling.go:159] Starting file observer\\\\nI0318 12:10:17.148684 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 12:10:17.153358 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0318 12:10:47.611677 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://798454b532a3067c649acc0c06fb5228660cfcfee74c2a47182434203623128a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a
28602986a572f19d3fe21e7033ff58efe255822d5bfd421c86004edb5693d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fffbb26bf332dce194e9fe134fc0e9eccf21c6c833e26e4c7aaaed7d211ae0cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.755188 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4279db4a-0510-49ca-b9f2-bf035e5209e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e60b84ddfc2cd2530a7ff0385ea9e95dfadb45a06ba9d98bf350f123b283d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6770c00fe0f8e7a8a33f0907f2d5a44805722dae64edc5a03193fd785cb1bb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1660cc319374fc5a436bc3744321fc3053f5325f451ac74d27bd35a3f1e2e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.766533 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.780379 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df8aae3de55433120292275b85026b54b486e56da8826b4ef9d5d6e01f7d3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b271
5592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.793950 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.805757 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.816559 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.825739 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.834105 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808daf0a66fd9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.844097 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8a5c5b885305b00496b54f4c8176777b42f17b6773f8b16c2bf6fc923b29cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b58da26549359a89879871868c713dc68e7a40393d43324424dfa14ba1755ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tcxk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.856309 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.866244 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.876200 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.896240 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d45fb4a14c0d0d6fa25693a5c2b13c06796bd1711cb01eb9e5617914f2f6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"message\\\":\\\"1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007508b07 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: 
openshift-apiserver-operator,},ClusterIP:10.217.4.38,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.38],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF0318 12:12:20.859074 7184 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal 
e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.910128 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.921853 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.935261 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d611e36e7ad2ddcf4a3f09210f1890afd7f8254f6b0f716ab6236e7c786d485d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:43Z\\\",\\\"message\\\":\\\"2026-03-18T12:11:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_43a459fa-bd26-4653-b2fd-f44308c1a7fb\\\\n2026-03-18T12:11:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_43a459fa-bd26-4653-b2fd-f44308c1a7fb to /host/opt/cni/bin/\\\\n2026-03-18T12:11:58Z [verbose] multus-daemon started\\\\n2026-03-18T12:11:58Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T12:12:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:46 crc kubenswrapper[4975]: I0318 12:12:46.945812 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-587nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-587nk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:46Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.021470 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.021509 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.021517 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.021531 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.021540 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:47Z","lastTransitionTime":"2026-03-18T12:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:47 crc kubenswrapper[4975]: E0318 12:12:47.032847 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.036466 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.036504 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.036514 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.036530 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.036541 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:47Z","lastTransitionTime":"2026-03-18T12:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 12:12:47 crc kubenswrapper[4975]: E0318 12:12:47.049769 4975 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T12:12:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.053108 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.053144 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.053154 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.053169 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.053179 4975 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:47Z","lastTransitionTime":"2026-03-18T12:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7281f64d-d9d9-472a-a299-3ee193dcc38d\\\",\\\"systemUUID\\\":\\\"885d92fc-8d43-4f95-a548-3a5e1645d68d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:47 crc kubenswrapper[4975]: E0318 12:12:47.093949 4975 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.713435 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8v6h_b0d0be67-e739-4dd7-abe4-3986a330a037/ovnkube-controller/3.log" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.714262 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8v6h_b0d0be67-e739-4dd7-abe4-3986a330a037/ovnkube-controller/2.log" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.716907 4975 generic.go:334] "Generic (PLEG): container finished" podID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerID="d7d45fb4a14c0d0d6fa25693a5c2b13c06796bd1711cb01eb9e5617914f2f6a1" exitCode=1 Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.717006 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerDied","Data":"d7d45fb4a14c0d0d6fa25693a5c2b13c06796bd1711cb01eb9e5617914f2f6a1"} Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.717210 4975 scope.go:117] "RemoveContainer" containerID="1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.717981 4975 scope.go:117] "RemoveContainer" containerID="d7d45fb4a14c0d0d6fa25693a5c2b13c06796bd1711cb01eb9e5617914f2f6a1" Mar 18 12:12:47 crc kubenswrapper[4975]: E0318 12:12:47.718213 4975 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-k8v6h_openshift-ovn-kubernetes(b0d0be67-e739-4dd7-abe4-3986a330a037)\"" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.734884 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.751693 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.766685 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.799117 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d45fb4a14c0d0d6fa25693a5c2b13c06796bd1711cb01eb9e5617914f2f6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c4f3187e4c9dccc77ecfa5e027a8979df0bfb48e7cce4d47dcbfda0c474a474\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:21Z\\\",\\\"message\\\":\\\"1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007508b07 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: 
openshift-apiserver-operator,},ClusterIP:10.217.4.38,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.38],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF0318 12:12:20.859074 7184 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d45fb4a14c0d0d6fa25693a5c2b13c06796bd1711cb01eb9e5617914f2f6a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:46Z\\\",\\\"message\\\":\\\"hift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 12:12:46.852095 7515 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:12:46.848928 7515 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 
12:12:46.852282 7515 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:12:46.852328 7515 factory.go:656] Stopping watch factory\\\\nI0318 12:12:46.852415 7515 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:12:46.852545 7515 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:12:46.887182 7515 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0318 12:12:46.887221 7515 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0318 12:12:46.887317 7515 ovnkube.go:599] Stopped ovnkube\\\\nI0318 12:12:46.887344 7515 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 12:12:46.887444 7515 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\
\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.818453 4975 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.833160 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.847453 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d611e36e7ad2ddcf4a3f09210f1890afd7f8254f6b0f716ab6236e7c786d485d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:43Z\\\",\\\"message\\\":\\\"2026-03-18T12:11:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_43a459fa-bd26-4653-b2fd-f44308c1a7fb\\\\n2026-03-18T12:11:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_43a459fa-bd26-4653-b2fd-f44308c1a7fb to /host/opt/cni/bin/\\\\n2026-03-18T12:11:58Z [verbose] multus-daemon started\\\\n2026-03-18T12:11:58Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T12:12:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.859161 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-587nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-587nk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.876106 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df8aae3de55433120292275b85026b54b486e56da8826b4ef9d5d6e01f7d3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.903083 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f4
2928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6a
f7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.915820 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56efd9a4-4479-4621-b2d4-572ba3a4e7cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6baed74837a87ab6340d3036ba532bab8aabfe3c751e3af2a483308255eb1e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd74d0ae78901f57f45ddede99b5b55165d3d2a7103eef8371891521fd6b3c46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 12:10:17.099277 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 12:10:17.101321 1 observer_polling.go:159] Starting file observer\\\\nI0318 12:10:17.148684 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 12:10:17.153358 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0318 12:10:47.611677 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://798454b532a3067c649acc0c06fb5228660cfcfee74c2a47182434203623128a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a28602986a572f19d3fe21e7033ff58efe255822d5bfd421c86004edb5693d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fffbb26bf332dce194e9fe134fc0e9eccf21c6c833e26e4c7aaaed7d211ae0cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.929609 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4279db4a-0510-49ca-b9f2-bf035e5209e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e60b84ddfc2cd2530a7ff0385ea9e95dfadb45a06ba9d98bf350f123b283d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6770c00fe0f8e7a8a33f0907f2d5a44805722dae64edc5a03193fd785cb1bb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1660cc319374fc5a436bc3744321fc3053f5325f451ac74d27bd35a3f1e2e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.944709 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.960915 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808daf0a66fd9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.977168 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8a5c5b885305b00496b54f4c8176777b42f17b6773f8b16c2bf6fc923b29cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b58da26549359a89879871868c713dc68e7a40393d43324424dfa14ba1755ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tcxk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:47 crc kubenswrapper[4975]: I0318 12:12:47.992576 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e\\\",\\\"image\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1
756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:47Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.006361 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:48Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.015992 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.015992 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.016005 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.016239 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:48 crc kubenswrapper[4975]: E0318 12:12:48.016336 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:48 crc kubenswrapper[4975]: E0318 12:12:48.016261 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:48 crc kubenswrapper[4975]: E0318 12:12:48.016530 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:12:48 crc kubenswrapper[4975]: E0318 12:12:48.016541 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.017293 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T12:12:48Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.026631 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:48Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.722016 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8v6h_b0d0be67-e739-4dd7-abe4-3986a330a037/ovnkube-controller/3.log" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.727201 4975 scope.go:117] "RemoveContainer" containerID="d7d45fb4a14c0d0d6fa25693a5c2b13c06796bd1711cb01eb9e5617914f2f6a1" Mar 18 12:12:48 crc kubenswrapper[4975]: E0318 12:12:48.727472 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-k8v6h_openshift-ovn-kubernetes(b0d0be67-e739-4dd7-abe4-3986a330a037)\"" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.740006 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8a5c5b885305b00496b54f4c8176777b42f17b6773f8b16c2bf6fc923b29cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b58da26549359a89879871868c713dc68e7a
40393d43324424dfa14ba1755ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tcxk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:48Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.753376 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c7
2641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:48Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.765502 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:48Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.776488 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:48Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.787039 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:48Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.796552 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808daf0a66fd9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:48Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.808188 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:48Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:48 crc 
kubenswrapper[4975]: I0318 12:12:48.822749 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:48Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.834238 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:48Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.853378 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d45fb4a14c0d0d6fa25693a5c2b13c06796bd1711cb01eb9e5617914f2f6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d45fb4a14c0d0d6fa25693a5c2b13c06796bd1711cb01eb9e5617914f2f6a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:46Z\\\",\\\"message\\\":\\\"hift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 12:12:46.852095 7515 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:12:46.848928 7515 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 12:12:46.852282 7515 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:12:46.852328 7515 factory.go:656] Stopping watch factory\\\\nI0318 12:12:46.852415 7515 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:12:46.852545 7515 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:12:46.887182 7515 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0318 12:12:46.887221 7515 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0318 12:12:46.887317 7515 ovnkube.go:599] Stopped ovnkube\\\\nI0318 12:12:46.887344 7515 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 12:12:46.887444 7515 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-k8v6h_openshift-ovn-kubernetes(b0d0be67-e739-4dd7-abe4-3986a330a037)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f
03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:48Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.865787 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:48Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.877475 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:48Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.892498 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d611e36e7ad2ddcf4a3f09210f1890afd7f8254f6b0f716ab6236e7c786d485d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:43Z\\\",\\\"message\\\":\\\"2026-03-18T12:11:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_43a459fa-bd26-4653-b2fd-f44308c1a7fb\\\\n2026-03-18T12:11:58+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_43a459fa-bd26-4653-b2fd-f44308c1a7fb to /host/opt/cni/bin/\\\\n2026-03-18T12:11:58Z [verbose] multus-daemon started\\\\n2026-03-18T12:11:58Z [verbose] 
Readiness Indicator file check\\\\n2026-03-18T12:12:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:48Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.906272 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-587nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-587nk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:48Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.926948 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f
73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:48Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.941835 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56efd9a4-4479-4621-b2d4-572ba3a4e7cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6baed74837a87ab6340d3036ba532bab8aabfe3c751e3af2a483308255eb1e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd74d0ae78901f57f45ddede99b5b55165d3d2a7103eef8371891521fd6b3c46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 12:10:17.099277 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 12:10:17.101321 1 observer_polling.go:159] Starting file observer\\\\nI0318 12:10:17.148684 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 12:10:17.153358 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0318 12:10:47.611677 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://798454b532a3067c649acc0c06fb5228660cfcfee74c2a47182434203623128a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a28602986a572f19d3fe21e7033ff58efe255822d5bfd421c86004edb5693d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fffbb26bf332dce194e9fe134fc0e9eccf21c6c833e26e4c7aaaed7d211ae0cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:48Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.954590 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4279db4a-0510-49ca-b9f2-bf035e5209e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e60b84ddfc2cd2530a7ff0385ea9e95dfadb45a06ba9d98bf350f123b283d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6770c00fe0f8e7a8a33f0907f2d5a44805722dae64edc5a03193fd785cb1bb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1660cc319374fc5a436bc3744321fc3053f5325f451ac74d27bd35a3f1e2e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:48Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.966551 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:48Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:48 crc kubenswrapper[4975]: I0318 12:12:48.981033 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df8aae3de55433120292275b85026b54b486e56da8826b4ef9d5d6e01f7d3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b271
5592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:48Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:50 crc kubenswrapper[4975]: I0318 12:12:50.015994 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:50 crc kubenswrapper[4975]: E0318 12:12:50.016579 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:50 crc kubenswrapper[4975]: I0318 12:12:50.016031 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:50 crc kubenswrapper[4975]: E0318 12:12:50.016781 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:12:50 crc kubenswrapper[4975]: I0318 12:12:50.016003 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:50 crc kubenswrapper[4975]: E0318 12:12:50.017037 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:50 crc kubenswrapper[4975]: I0318 12:12:50.016117 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:50 crc kubenswrapper[4975]: E0318 12:12:50.017210 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:50 crc kubenswrapper[4975]: E0318 12:12:50.129742 4975 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:12:52 crc kubenswrapper[4975]: I0318 12:12:52.015579 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:52 crc kubenswrapper[4975]: I0318 12:12:52.015711 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:52 crc kubenswrapper[4975]: E0318 12:12:52.015804 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:52 crc kubenswrapper[4975]: I0318 12:12:52.015651 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:52 crc kubenswrapper[4975]: I0318 12:12:52.015670 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:52 crc kubenswrapper[4975]: E0318 12:12:52.015936 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:12:52 crc kubenswrapper[4975]: E0318 12:12:52.016079 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:52 crc kubenswrapper[4975]: E0318 12:12:52.016151 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:54 crc kubenswrapper[4975]: I0318 12:12:54.015732 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:54 crc kubenswrapper[4975]: I0318 12:12:54.015792 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:54 crc kubenswrapper[4975]: I0318 12:12:54.015764 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:54 crc kubenswrapper[4975]: E0318 12:12:54.015913 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:12:54 crc kubenswrapper[4975]: I0318 12:12:54.015948 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:54 crc kubenswrapper[4975]: E0318 12:12:54.015983 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:54 crc kubenswrapper[4975]: E0318 12:12:54.016049 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:54 crc kubenswrapper[4975]: E0318 12:12:54.016100 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:55 crc kubenswrapper[4975]: I0318 12:12:55.031613 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea15fad9-b279-4a0b-8d36-277ed56d8e30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e990b37ec147dca709852c241e819920061b2ef1814cf7d5d5c44600dce87ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b94d7856e995376bb6394f6fbafa820cc076ff9f6fc5dc599df4928c75f3163\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:55 crc kubenswrapper[4975]: I0318 12:12:55.052621 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:55 crc kubenswrapper[4975]: I0318 12:12:55.067735 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59dd8f35-75c5-42d7-b11a-06586d1d5a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db7cb5f97f525c6150e71f8642df5f9d7d0db3497cf7af83374da2d7adbc0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2
efe02fe95b50df416422be5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97fm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvdzt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:55 crc kubenswrapper[4975]: I0318 12:12:55.089530 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d0be67-e739-4dd7-abe4-3986a330a037\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7d45fb4a14c0d0d6fa25693a5c2b13c06796bd1711cb01eb9e5617914f2f6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d45fb4a14c0d0d6fa25693a5c2b13c06796bd1711cb01eb9e5617914f2f6a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:46Z\\\",\\\"message\\\":\\\"hift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 12:12:46.852095 7515 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:12:46.848928 7515 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0318 12:12:46.852282 7515 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:12:46.852328 7515 factory.go:656] Stopping watch factory\\\\nI0318 12:12:46.852415 7515 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:12:46.852545 7515 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0318 12:12:46.887182 7515 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0318 12:12:46.887221 7515 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0318 12:12:46.887317 7515 ovnkube.go:599] Stopped ovnkube\\\\nI0318 12:12:46.887344 7515 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0318 12:12:46.887444 7515 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-k8v6h_openshift-ovn-kubernetes(b0d0be67-e739-4dd7-abe4-3986a330a037)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64bfccf1601bad083f
03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-766zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k8v6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:55 crc kubenswrapper[4975]: I0318 12:12:55.105988 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:55 crc kubenswrapper[4975]: I0318 12:12:55.120404 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:55 crc kubenswrapper[4975]: E0318 12:12:55.131016 4975 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 18 12:12:55 crc kubenswrapper[4975]: I0318 12:12:55.134384 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9j7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"add6c8de-77cd-42e7-bf06-d2333b9392ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d611e36e7ad2ddcf4a3f09210f1890afd7f8254f6b0f716ab6236e7c786d485d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-18T12:12:43Z\\\",\\\"message\\\":\\\"2026-03-18T12:11:58+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_43a459fa-bd26-4653-b2fd-f44308c1a7fb\\\\n2026-03-18T12:11:58+00:00 [cnibincopy] Successfully moved files 
in /host/opt/cni/bin/upgrade_43a459fa-bd26-4653-b2fd-f44308c1a7fb to /host/opt/cni/bin/\\\\n2026-03-18T12:11:58Z [verbose] multus-daemon started\\\\n2026-03-18T12:11:58Z [verbose] Readiness Indicator file check\\\\n2026-03-18T12:12:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zzgnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9j7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:55 crc kubenswrapper[4975]: I0318 12:12:55.147832 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-587nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fbv27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-587nk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:55 crc kubenswrapper[4975]: I0318 12:12:55.166465 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4190fd88-9553-455a-946f-f59b234d14ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd9b2e75358644dabfcde6b9b405a88b55c83e5dc293a6512bbed91df64ee3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431089f070bf2950025cc9bb52ecb99c95c13abe15f2ef1c79cab83435258487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83e9b5d5c41e3ba822051f6c01b893ac723dea25249e30e4ff16c1cdf5041b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58a8741ee636b9f049a67277ac753e10f
73b730d0e0efc2c6ef6da44cd841018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66c057b7efe1b7c8b06de42d655bfe58782233d5692ee0c39eef92bb8dcd3c16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea9f176908377df1b2f1f69584e4642517ef1591e415f16156e456e8651ca4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e71265991468f7f022be69e0e6a3665877f10d0b703247d769c2fa782f5214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://c243d610023dcc6af7905a3f5ca5cc088144db31e29e61b4d269a126086caba7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:55 crc kubenswrapper[4975]: I0318 12:12:55.178809 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56efd9a4-4479-4621-b2d4-572ba3a4e7cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6baed74837a87ab6340d3036ba532bab8aabfe3c751e3af2a483308255eb1e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd74d0ae78901f57f45ddede99b5b55165d3d2a7103eef8371891521fd6b3c46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:10:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0318 12:10:17.099277 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0318 12:10:17.101321 1 observer_polling.go:159] Starting file observer\\\\nI0318 12:10:17.148684 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0318 12:10:17.153358 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0318 12:10:47.611677 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://798454b532a3067c649acc0c06fb5228660cfcfee74c2a47182434203623128a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00a28602986a572f19d3fe21e7033ff58efe255822d5bfd421c86004edb5693d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fffbb26bf332dce194e9fe134fc0e9eccf21c6c833e26e4c7aaaed7d211ae0cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:55 crc kubenswrapper[4975]: I0318 12:12:55.189733 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4279db4a-0510-49ca-b9f2-bf035e5209e2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e60b84ddfc2cd2530a7ff0385ea9e95dfadb45a06ba9d98bf350f123b283d26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6770c00fe0f8e7a8a33f0907f2d5a44805722dae64edc5a03193fd785cb1bb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1660cc319374fc5a436bc3744321fc3053f5325f451ac74d27bd35a3f1e2e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7fa3d0c3d82b6bdedf1af33553f4541ea499b8c907198d6081fba6b41523df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:55 crc kubenswrapper[4975]: I0318 12:12:55.201894 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df3eab9954031b418967c2e0bb2fad5667be68fc691d95d79a8e877d2971f7ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0c4f477108bcac9041ffaf6c4211dbd5d4eba989a819f5df6efcfe537093c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:55 crc kubenswrapper[4975]: I0318 12:12:55.220676 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60e8a8fd-753d-433d-acf1-fa78d1cc2184\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df8aae3de55433120292275b85026b54b486e56da8826b4ef9d5d6e01f7d3c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54729c276d957d263ff5fa0ac84d8ec1ed8c31162a22997d8191e8fa11e7531d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca9799c9d041e80d6be6e9eea83e23256b2b8b803097dd5f7631d9d990703757\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf11c899f73d90478062fc9c9129ab8ab464bfebc906a6c12bffe85d05251609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b271
5592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2715592a82973a337c0c3e4c523143e678a138fc2cb4b74b46864816602dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ffd5a0fe5b05ffe896fccaf7763e2246b1a55d7663274cc64a3bd9e32a7e16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980094204b7647ea6f4f170abbf199737773f2ba083a60532084d968c803e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntpvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwgkw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:55 crc kubenswrapper[4975]: I0318 12:12:55.236401 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c374ff0-a569-44d6-a341-482a4cf71b70\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:10:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-18T12:11:25Z\\\",\\\"message\\\":\\\"file observer\\\\nW0318 12:11:25.023388 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0318 12:11:25.023522 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0318 12:11:25.024316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-151379793/tls.crt::/tmp/serving-cert-151379793/tls.key\\\\\\\"\\\\nI0318 12:11:25.411854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0318 12:11:25.417554 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0318 12:11:25.417606 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0318 12:11:25.417634 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0318 12:11:25.417639 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0318 12:11:25.422745 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0318 12:11:25.422786 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422792 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0318 12:11:25.422797 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0318 12:11:25.422801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0318 12:11:25.422805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0318 12:11:25.422808 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0318 12:11:25.422759 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0318 12:11:25.425178 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-18T12:11:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:10:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-18T12:10:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-18T12:10:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:10:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:55 crc kubenswrapper[4975]: I0318 12:12:55.251621 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d9a6d6e451f5985fbb31f4772c1339d63f4dfe8b04cc3f241db4a0aacbc2c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:55 crc kubenswrapper[4975]: I0318 12:12:55.264461 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf9f6aad610ec81d39d8fe0ef795334d882ccf04a8520e13c94a3d21faf7d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:55 crc kubenswrapper[4975]: I0318 12:12:55.275143 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5kbr4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06dd5a0b-b3ef-4f11-83f7-7e48c2a119ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:11:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522137cf3146be8653747c24abe8890d46a8c9113b2b9ce8e3b7c9d2cc027f4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:11:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrc82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:11:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5kbr4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:55 crc kubenswrapper[4975]: I0318 12:12:55.285567 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ksjpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"380bec4c-fbbe-461f-9a80-56472145eca1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79ec7cc698fbecd370d85c2a810f9be1130b088902c907f3ea808daf0a66fd9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmvr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ksjpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:55 crc kubenswrapper[4975]: I0318 12:12:55.296530 4975 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af4c18e-c13c-45f8-b7cb-9ccfaa845a93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T12:12:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8a5c5b885305b00496b54f4c8176777b42f17b6773f8b16c2bf6fc923b29cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b58da26549359a89879871868c713dc68e7a40393d43324424dfa14ba1755ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T12:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rlfm7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T12:12:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7tcxk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T12:12:55Z is after 2025-08-24T17:21:41Z" Mar 18 12:12:56 crc kubenswrapper[4975]: I0318 12:12:56.047679 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:56 crc kubenswrapper[4975]: I0318 12:12:56.047705 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:56 crc kubenswrapper[4975]: I0318 12:12:56.047731 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:56 crc kubenswrapper[4975]: I0318 12:12:56.047855 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:56 crc kubenswrapper[4975]: E0318 12:12:56.048085 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:56 crc kubenswrapper[4975]: E0318 12:12:56.048209 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:56 crc kubenswrapper[4975]: E0318 12:12:56.048311 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:56 crc kubenswrapper[4975]: E0318 12:12:56.048448 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.108369 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.108417 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.108428 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.108443 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.108454 4975 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T12:12:57Z","lastTransitionTime":"2026-03-18T12:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.173674 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhw5"] Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.174031 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhw5" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.177597 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.177715 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.177827 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.177732 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.202086 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=86.202065412 podStartE2EDuration="1m26.202065412s" podCreationTimestamp="2026-03-18 12:11:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 
12:12:57.192349737 +0000 UTC m=+162.906750316" watchObservedRunningTime="2026-03-18 12:12:57.202065412 +0000 UTC m=+162.916465991" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.250408 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podStartSLOduration=114.250388372 podStartE2EDuration="1m54.250388372s" podCreationTimestamp="2026-03-18 12:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:57.250001392 +0000 UTC m=+162.964401971" watchObservedRunningTime="2026-03-18 12:12:57.250388372 +0000 UTC m=+162.964788951" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.333716 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-n9j7f" podStartSLOduration=114.333697316 podStartE2EDuration="1m54.333697316s" podCreationTimestamp="2026-03-18 12:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:57.333534962 +0000 UTC m=+163.047935541" watchObservedRunningTime="2026-03-18 12:12:57.333697316 +0000 UTC m=+163.048097895" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.367269 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d10b5f90-4424-4bc9-b7e5-34bbb499b4c9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6nhw5\" (UID: \"d10b5f90-4424-4bc9-b7e5-34bbb499b4c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhw5" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.367314 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/d10b5f90-4424-4bc9-b7e5-34bbb499b4c9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6nhw5\" (UID: \"d10b5f90-4424-4bc9-b7e5-34bbb499b4c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhw5" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.367344 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d10b5f90-4424-4bc9-b7e5-34bbb499b4c9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6nhw5\" (UID: \"d10b5f90-4424-4bc9-b7e5-34bbb499b4c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhw5" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.367381 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d10b5f90-4424-4bc9-b7e5-34bbb499b4c9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6nhw5\" (UID: \"d10b5f90-4424-4bc9-b7e5-34bbb499b4c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhw5" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.367402 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d10b5f90-4424-4bc9-b7e5-34bbb499b4c9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6nhw5\" (UID: \"d10b5f90-4424-4bc9-b7e5-34bbb499b4c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhw5" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.368187 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=79.368174296 podStartE2EDuration="1m19.368174296s" podCreationTimestamp="2026-03-18 12:11:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-18 12:12:57.36791927 +0000 UTC m=+163.082319859" watchObservedRunningTime="2026-03-18 12:12:57.368174296 +0000 UTC m=+163.082574875" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.380596 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=38.3805818 podStartE2EDuration="38.3805818s" podCreationTimestamp="2026-03-18 12:12:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:57.380249191 +0000 UTC m=+163.094649790" watchObservedRunningTime="2026-03-18 12:12:57.3805818 +0000 UTC m=+163.094982369" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.390412 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=51.390394797 podStartE2EDuration="51.390394797s" podCreationTimestamp="2026-03-18 12:12:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:57.389967097 +0000 UTC m=+163.104367696" watchObservedRunningTime="2026-03-18 12:12:57.390394797 +0000 UTC m=+163.104795376" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.428519 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=87.42849919 podStartE2EDuration="1m27.42849919s" podCreationTimestamp="2026-03-18 12:11:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:57.428320745 +0000 UTC m=+163.142721324" watchObservedRunningTime="2026-03-18 12:12:57.42849919 +0000 UTC m=+163.142899769" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.428887 4975 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-vwgkw" podStartSLOduration=114.428882249 podStartE2EDuration="1m54.428882249s" podCreationTimestamp="2026-03-18 12:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:57.415536962 +0000 UTC m=+163.129937541" watchObservedRunningTime="2026-03-18 12:12:57.428882249 +0000 UTC m=+163.143282828" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.460482 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5kbr4" podStartSLOduration=114.460468767 podStartE2EDuration="1m54.460468767s" podCreationTimestamp="2026-03-18 12:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:57.459905942 +0000 UTC m=+163.174306531" watchObservedRunningTime="2026-03-18 12:12:57.460468767 +0000 UTC m=+163.174869346" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.468368 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d10b5f90-4424-4bc9-b7e5-34bbb499b4c9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6nhw5\" (UID: \"d10b5f90-4424-4bc9-b7e5-34bbb499b4c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhw5" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.468415 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d10b5f90-4424-4bc9-b7e5-34bbb499b4c9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6nhw5\" (UID: \"d10b5f90-4424-4bc9-b7e5-34bbb499b4c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhw5" Mar 18 12:12:57 crc kubenswrapper[4975]: 
I0318 12:12:57.468440 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d10b5f90-4424-4bc9-b7e5-34bbb499b4c9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6nhw5\" (UID: \"d10b5f90-4424-4bc9-b7e5-34bbb499b4c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhw5" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.468477 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d10b5f90-4424-4bc9-b7e5-34bbb499b4c9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6nhw5\" (UID: \"d10b5f90-4424-4bc9-b7e5-34bbb499b4c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhw5" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.468497 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d10b5f90-4424-4bc9-b7e5-34bbb499b4c9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6nhw5\" (UID: \"d10b5f90-4424-4bc9-b7e5-34bbb499b4c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhw5" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.468546 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d10b5f90-4424-4bc9-b7e5-34bbb499b4c9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6nhw5\" (UID: \"d10b5f90-4424-4bc9-b7e5-34bbb499b4c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhw5" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.468577 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d10b5f90-4424-4bc9-b7e5-34bbb499b4c9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6nhw5\" (UID: 
\"d10b5f90-4424-4bc9-b7e5-34bbb499b4c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhw5" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.469509 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d10b5f90-4424-4bc9-b7e5-34bbb499b4c9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6nhw5\" (UID: \"d10b5f90-4424-4bc9-b7e5-34bbb499b4c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhw5" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.477448 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d10b5f90-4424-4bc9-b7e5-34bbb499b4c9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6nhw5\" (UID: \"d10b5f90-4424-4bc9-b7e5-34bbb499b4c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhw5" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.485275 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d10b5f90-4424-4bc9-b7e5-34bbb499b4c9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6nhw5\" (UID: \"d10b5f90-4424-4bc9-b7e5-34bbb499b4c9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhw5" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.485477 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ksjpq" podStartSLOduration=114.485290613 podStartE2EDuration="1m54.485290613s" podCreationTimestamp="2026-03-18 12:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:57.471398773 +0000 UTC m=+163.185799372" watchObservedRunningTime="2026-03-18 12:12:57.485290613 +0000 UTC m=+163.199691192" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 
12:12:57.485628 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7tcxk" podStartSLOduration=113.485623252 podStartE2EDuration="1m53.485623252s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:57.484895703 +0000 UTC m=+163.199296302" watchObservedRunningTime="2026-03-18 12:12:57.485623252 +0000 UTC m=+163.200023841" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.492367 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhw5" Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.588978 4975 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.596153 4975 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.755184 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhw5" event={"ID":"d10b5f90-4424-4bc9-b7e5-34bbb499b4c9","Type":"ContainerStarted","Data":"747cc2f59885a74bac4c05c90798ba9cb284ba75a905e91b84e3731528376626"} Mar 18 12:12:57 crc kubenswrapper[4975]: I0318 12:12:57.755248 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhw5" event={"ID":"d10b5f90-4424-4bc9-b7e5-34bbb499b4c9","Type":"ContainerStarted","Data":"9ba0beeb99eef18e37fddde28a800928ea3ac912c651a6c4e867d28e0837306b"} Mar 18 12:12:58 crc kubenswrapper[4975]: I0318 12:12:58.016185 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:12:58 crc kubenswrapper[4975]: I0318 12:12:58.016294 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:12:58 crc kubenswrapper[4975]: E0318 12:12:58.016316 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:12:58 crc kubenswrapper[4975]: I0318 12:12:58.016184 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:12:58 crc kubenswrapper[4975]: E0318 12:12:58.016425 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:12:58 crc kubenswrapper[4975]: I0318 12:12:58.016205 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:12:58 crc kubenswrapper[4975]: E0318 12:12:58.016580 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:12:58 crc kubenswrapper[4975]: E0318 12:12:58.016668 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:13:00 crc kubenswrapper[4975]: I0318 12:13:00.016262 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:00 crc kubenswrapper[4975]: E0318 12:13:00.016737 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:00 crc kubenswrapper[4975]: I0318 12:13:00.016360 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:13:00 crc kubenswrapper[4975]: E0318 12:13:00.016853 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:13:00 crc kubenswrapper[4975]: I0318 12:13:00.016423 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:00 crc kubenswrapper[4975]: E0318 12:13:00.016961 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:00 crc kubenswrapper[4975]: I0318 12:13:00.016306 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:00 crc kubenswrapper[4975]: E0318 12:13:00.017036 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:00 crc kubenswrapper[4975]: E0318 12:13:00.132343 4975 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:13:02 crc kubenswrapper[4975]: I0318 12:13:02.016046 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:02 crc kubenswrapper[4975]: I0318 12:13:02.016048 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:02 crc kubenswrapper[4975]: I0318 12:13:02.016058 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:13:02 crc kubenswrapper[4975]: I0318 12:13:02.016125 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:02 crc kubenswrapper[4975]: E0318 12:13:02.016254 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:02 crc kubenswrapper[4975]: E0318 12:13:02.016340 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:02 crc kubenswrapper[4975]: E0318 12:13:02.016412 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:13:02 crc kubenswrapper[4975]: E0318 12:13:02.016485 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:03 crc kubenswrapper[4975]: I0318 12:13:03.018154 4975 scope.go:117] "RemoveContainer" containerID="d7d45fb4a14c0d0d6fa25693a5c2b13c06796bd1711cb01eb9e5617914f2f6a1" Mar 18 12:13:03 crc kubenswrapper[4975]: E0318 12:13:03.018440 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-k8v6h_openshift-ovn-kubernetes(b0d0be67-e739-4dd7-abe4-3986a330a037)\"" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" Mar 18 12:13:04 crc kubenswrapper[4975]: I0318 12:13:04.016248 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:04 crc kubenswrapper[4975]: I0318 12:13:04.016314 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:04 crc kubenswrapper[4975]: I0318 12:13:04.016293 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:04 crc kubenswrapper[4975]: I0318 12:13:04.016257 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:13:04 crc kubenswrapper[4975]: E0318 12:13:04.016487 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:04 crc kubenswrapper[4975]: E0318 12:13:04.016727 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:13:04 crc kubenswrapper[4975]: E0318 12:13:04.016846 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:04 crc kubenswrapper[4975]: E0318 12:13:04.016978 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:05 crc kubenswrapper[4975]: E0318 12:13:05.132966 4975 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:13:06 crc kubenswrapper[4975]: I0318 12:13:06.016433 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:13:06 crc kubenswrapper[4975]: I0318 12:13:06.016564 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:06 crc kubenswrapper[4975]: I0318 12:13:06.016601 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:06 crc kubenswrapper[4975]: I0318 12:13:06.016628 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:06 crc kubenswrapper[4975]: E0318 12:13:06.017362 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:13:06 crc kubenswrapper[4975]: E0318 12:13:06.017596 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:06 crc kubenswrapper[4975]: E0318 12:13:06.017707 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:06 crc kubenswrapper[4975]: E0318 12:13:06.017766 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:08 crc kubenswrapper[4975]: I0318 12:13:08.016276 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:08 crc kubenswrapper[4975]: E0318 12:13:08.017022 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:08 crc kubenswrapper[4975]: I0318 12:13:08.016455 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:13:08 crc kubenswrapper[4975]: E0318 12:13:08.017094 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:13:08 crc kubenswrapper[4975]: I0318 12:13:08.016482 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:08 crc kubenswrapper[4975]: E0318 12:13:08.017141 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:08 crc kubenswrapper[4975]: I0318 12:13:08.016410 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:08 crc kubenswrapper[4975]: E0318 12:13:08.017185 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:10 crc kubenswrapper[4975]: I0318 12:13:10.016316 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:10 crc kubenswrapper[4975]: I0318 12:13:10.016405 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:13:10 crc kubenswrapper[4975]: I0318 12:13:10.016417 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:10 crc kubenswrapper[4975]: I0318 12:13:10.017018 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:10 crc kubenswrapper[4975]: E0318 12:13:10.017132 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:10 crc kubenswrapper[4975]: E0318 12:13:10.016549 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:13:10 crc kubenswrapper[4975]: E0318 12:13:10.016623 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:10 crc kubenswrapper[4975]: E0318 12:13:10.016476 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:10 crc kubenswrapper[4975]: E0318 12:13:10.134197 4975 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:13:12 crc kubenswrapper[4975]: I0318 12:13:12.015966 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:13:12 crc kubenswrapper[4975]: E0318 12:13:12.016107 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:13:12 crc kubenswrapper[4975]: I0318 12:13:12.015991 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:12 crc kubenswrapper[4975]: E0318 12:13:12.016195 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:12 crc kubenswrapper[4975]: I0318 12:13:12.015987 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:12 crc kubenswrapper[4975]: E0318 12:13:12.016250 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:12 crc kubenswrapper[4975]: I0318 12:13:12.015966 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:12 crc kubenswrapper[4975]: E0318 12:13:12.016293 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:12 crc kubenswrapper[4975]: I0318 12:13:12.310106 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs\") pod \"network-metrics-daemon-587nk\" (UID: \"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\") " pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:13:12 crc kubenswrapper[4975]: E0318 12:13:12.310416 4975 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:13:12 crc kubenswrapper[4975]: E0318 12:13:12.310584 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs podName:a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b nodeName:}" failed. No retries permitted until 2026-03-18 12:14:16.310545931 +0000 UTC m=+242.024946710 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs") pod "network-metrics-daemon-587nk" (UID: "a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 12:13:14 crc kubenswrapper[4975]: I0318 12:13:14.016420 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:13:14 crc kubenswrapper[4975]: E0318 12:13:14.016555 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:13:14 crc kubenswrapper[4975]: I0318 12:13:14.016603 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:14 crc kubenswrapper[4975]: I0318 12:13:14.016630 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:14 crc kubenswrapper[4975]: E0318 12:13:14.016905 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:14 crc kubenswrapper[4975]: I0318 12:13:14.017166 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:14 crc kubenswrapper[4975]: E0318 12:13:14.017277 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:14 crc kubenswrapper[4975]: E0318 12:13:14.017351 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:15 crc kubenswrapper[4975]: E0318 12:13:15.135034 4975 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:13:16 crc kubenswrapper[4975]: I0318 12:13:16.016486 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:16 crc kubenswrapper[4975]: E0318 12:13:16.016707 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:16 crc kubenswrapper[4975]: I0318 12:13:16.017157 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:16 crc kubenswrapper[4975]: E0318 12:13:16.017298 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:16 crc kubenswrapper[4975]: I0318 12:13:16.017520 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:13:16 crc kubenswrapper[4975]: I0318 12:13:16.017594 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:16 crc kubenswrapper[4975]: E0318 12:13:16.017768 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:13:16 crc kubenswrapper[4975]: E0318 12:13:16.018054 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:17 crc kubenswrapper[4975]: I0318 12:13:17.016594 4975 scope.go:117] "RemoveContainer" containerID="d7d45fb4a14c0d0d6fa25693a5c2b13c06796bd1711cb01eb9e5617914f2f6a1" Mar 18 12:13:17 crc kubenswrapper[4975]: E0318 12:13:17.016923 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-k8v6h_openshift-ovn-kubernetes(b0d0be67-e739-4dd7-abe4-3986a330a037)\"" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" Mar 18 12:13:18 crc kubenswrapper[4975]: I0318 12:13:18.015715 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:18 crc kubenswrapper[4975]: I0318 12:13:18.015738 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:18 crc kubenswrapper[4975]: I0318 12:13:18.015740 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:13:18 crc kubenswrapper[4975]: E0318 12:13:18.015997 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:18 crc kubenswrapper[4975]: E0318 12:13:18.015838 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:18 crc kubenswrapper[4975]: E0318 12:13:18.016119 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:13:18 crc kubenswrapper[4975]: I0318 12:13:18.016363 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:18 crc kubenswrapper[4975]: E0318 12:13:18.016581 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:20 crc kubenswrapper[4975]: I0318 12:13:20.015345 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:20 crc kubenswrapper[4975]: E0318 12:13:20.015465 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:20 crc kubenswrapper[4975]: I0318 12:13:20.015535 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:20 crc kubenswrapper[4975]: I0318 12:13:20.015669 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:13:20 crc kubenswrapper[4975]: E0318 12:13:20.015699 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:20 crc kubenswrapper[4975]: E0318 12:13:20.015888 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:13:20 crc kubenswrapper[4975]: I0318 12:13:20.016051 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:20 crc kubenswrapper[4975]: E0318 12:13:20.016216 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:20 crc kubenswrapper[4975]: E0318 12:13:20.135880 4975 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:13:22 crc kubenswrapper[4975]: I0318 12:13:22.015395 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:13:22 crc kubenswrapper[4975]: I0318 12:13:22.015466 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:22 crc kubenswrapper[4975]: I0318 12:13:22.015492 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:22 crc kubenswrapper[4975]: E0318 12:13:22.015541 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:13:22 crc kubenswrapper[4975]: I0318 12:13:22.015428 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:22 crc kubenswrapper[4975]: E0318 12:13:22.015758 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:22 crc kubenswrapper[4975]: E0318 12:13:22.015804 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:22 crc kubenswrapper[4975]: E0318 12:13:22.015932 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:24 crc kubenswrapper[4975]: I0318 12:13:24.015939 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:13:24 crc kubenswrapper[4975]: I0318 12:13:24.015964 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:24 crc kubenswrapper[4975]: I0318 12:13:24.015995 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:24 crc kubenswrapper[4975]: I0318 12:13:24.015939 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:24 crc kubenswrapper[4975]: E0318 12:13:24.016085 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:13:24 crc kubenswrapper[4975]: E0318 12:13:24.016186 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:24 crc kubenswrapper[4975]: E0318 12:13:24.016252 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:24 crc kubenswrapper[4975]: E0318 12:13:24.016289 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:25 crc kubenswrapper[4975]: E0318 12:13:25.136441 4975 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:13:26 crc kubenswrapper[4975]: I0318 12:13:26.016126 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:26 crc kubenswrapper[4975]: E0318 12:13:26.016265 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:26 crc kubenswrapper[4975]: I0318 12:13:26.016467 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:26 crc kubenswrapper[4975]: E0318 12:13:26.016527 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:26 crc kubenswrapper[4975]: I0318 12:13:26.016654 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:13:26 crc kubenswrapper[4975]: E0318 12:13:26.016719 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:13:26 crc kubenswrapper[4975]: I0318 12:13:26.017297 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:26 crc kubenswrapper[4975]: E0318 12:13:26.017357 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:28 crc kubenswrapper[4975]: I0318 12:13:28.015898 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:13:28 crc kubenswrapper[4975]: I0318 12:13:28.015929 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:28 crc kubenswrapper[4975]: E0318 12:13:28.016052 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:13:28 crc kubenswrapper[4975]: I0318 12:13:28.016108 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:28 crc kubenswrapper[4975]: I0318 12:13:28.016199 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:28 crc kubenswrapper[4975]: E0318 12:13:28.016295 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:28 crc kubenswrapper[4975]: E0318 12:13:28.016424 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:28 crc kubenswrapper[4975]: E0318 12:13:28.016514 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:30 crc kubenswrapper[4975]: I0318 12:13:30.016385 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:30 crc kubenswrapper[4975]: I0318 12:13:30.016391 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:13:30 crc kubenswrapper[4975]: I0318 12:13:30.016388 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:30 crc kubenswrapper[4975]: I0318 12:13:30.016413 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:30 crc kubenswrapper[4975]: E0318 12:13:30.016537 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:30 crc kubenswrapper[4975]: E0318 12:13:30.016768 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:13:30 crc kubenswrapper[4975]: E0318 12:13:30.016917 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:30 crc kubenswrapper[4975]: E0318 12:13:30.017018 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:30 crc kubenswrapper[4975]: E0318 12:13:30.137624 4975 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:13:30 crc kubenswrapper[4975]: I0318 12:13:30.850545 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n9j7f_add6c8de-77cd-42e7-bf06-d2333b9392ea/kube-multus/1.log" Mar 18 12:13:30 crc kubenswrapper[4975]: I0318 12:13:30.851231 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n9j7f_add6c8de-77cd-42e7-bf06-d2333b9392ea/kube-multus/0.log" Mar 18 12:13:30 crc kubenswrapper[4975]: I0318 12:13:30.851270 4975 generic.go:334] "Generic (PLEG): container finished" podID="add6c8de-77cd-42e7-bf06-d2333b9392ea" containerID="d611e36e7ad2ddcf4a3f09210f1890afd7f8254f6b0f716ab6236e7c786d485d" exitCode=1 Mar 18 12:13:30 crc kubenswrapper[4975]: I0318 12:13:30.851296 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n9j7f" event={"ID":"add6c8de-77cd-42e7-bf06-d2333b9392ea","Type":"ContainerDied","Data":"d611e36e7ad2ddcf4a3f09210f1890afd7f8254f6b0f716ab6236e7c786d485d"} Mar 18 12:13:30 crc kubenswrapper[4975]: I0318 12:13:30.851327 4975 scope.go:117] "RemoveContainer" 
containerID="484bc9964d515708fd04ca6a99ef020018c7fe0b31b716c51915cd48406fd543" Mar 18 12:13:30 crc kubenswrapper[4975]: I0318 12:13:30.852089 4975 scope.go:117] "RemoveContainer" containerID="d611e36e7ad2ddcf4a3f09210f1890afd7f8254f6b0f716ab6236e7c786d485d" Mar 18 12:13:30 crc kubenswrapper[4975]: E0318 12:13:30.852295 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-n9j7f_openshift-multus(add6c8de-77cd-42e7-bf06-d2333b9392ea)\"" pod="openshift-multus/multus-n9j7f" podUID="add6c8de-77cd-42e7-bf06-d2333b9392ea" Mar 18 12:13:30 crc kubenswrapper[4975]: I0318 12:13:30.875194 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6nhw5" podStartSLOduration=147.875156485 podStartE2EDuration="2m27.875156485s" podCreationTimestamp="2026-03-18 12:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:12:57.772980607 +0000 UTC m=+163.487381186" watchObservedRunningTime="2026-03-18 12:13:30.875156485 +0000 UTC m=+196.589557064" Mar 18 12:13:31 crc kubenswrapper[4975]: I0318 12:13:31.855891 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n9j7f_add6c8de-77cd-42e7-bf06-d2333b9392ea/kube-multus/1.log" Mar 18 12:13:32 crc kubenswrapper[4975]: I0318 12:13:32.016029 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:32 crc kubenswrapper[4975]: I0318 12:13:32.016110 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:32 crc kubenswrapper[4975]: E0318 12:13:32.016214 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:32 crc kubenswrapper[4975]: I0318 12:13:32.016367 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:32 crc kubenswrapper[4975]: E0318 12:13:32.016457 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:32 crc kubenswrapper[4975]: E0318 12:13:32.016586 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:32 crc kubenswrapper[4975]: I0318 12:13:32.017036 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:13:32 crc kubenswrapper[4975]: E0318 12:13:32.017149 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:13:32 crc kubenswrapper[4975]: I0318 12:13:32.017440 4975 scope.go:117] "RemoveContainer" containerID="d7d45fb4a14c0d0d6fa25693a5c2b13c06796bd1711cb01eb9e5617914f2f6a1" Mar 18 12:13:32 crc kubenswrapper[4975]: I0318 12:13:32.862330 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8v6h_b0d0be67-e739-4dd7-abe4-3986a330a037/ovnkube-controller/3.log" Mar 18 12:13:32 crc kubenswrapper[4975]: I0318 12:13:32.865613 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerStarted","Data":"0a61a1711ca92e59c183fdce16eac480765167b6069a15103cdb61de7f8812cd"} Mar 18 12:13:32 crc kubenswrapper[4975]: I0318 12:13:32.866379 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:13:32 crc kubenswrapper[4975]: I0318 12:13:32.899706 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" podStartSLOduration=148.899686828 podStartE2EDuration="2m28.899686828s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:32.898329173 +0000 UTC m=+198.612729772" watchObservedRunningTime="2026-03-18 
12:13:32.899686828 +0000 UTC m=+198.614087407" Mar 18 12:13:32 crc kubenswrapper[4975]: I0318 12:13:32.971405 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-587nk"] Mar 18 12:13:32 crc kubenswrapper[4975]: I0318 12:13:32.971502 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:13:32 crc kubenswrapper[4975]: E0318 12:13:32.971937 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:13:34 crc kubenswrapper[4975]: I0318 12:13:34.015961 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:34 crc kubenswrapper[4975]: I0318 12:13:34.016004 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:34 crc kubenswrapper[4975]: I0318 12:13:34.015977 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:34 crc kubenswrapper[4975]: E0318 12:13:34.016245 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:34 crc kubenswrapper[4975]: E0318 12:13:34.016334 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:34 crc kubenswrapper[4975]: E0318 12:13:34.016479 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:35 crc kubenswrapper[4975]: I0318 12:13:35.016200 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:13:35 crc kubenswrapper[4975]: E0318 12:13:35.017427 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:13:35 crc kubenswrapper[4975]: E0318 12:13:35.139151 4975 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:13:36 crc kubenswrapper[4975]: I0318 12:13:36.016249 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:36 crc kubenswrapper[4975]: I0318 12:13:36.016306 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:36 crc kubenswrapper[4975]: E0318 12:13:36.016389 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:36 crc kubenswrapper[4975]: I0318 12:13:36.016623 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:36 crc kubenswrapper[4975]: E0318 12:13:36.016713 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:36 crc kubenswrapper[4975]: E0318 12:13:36.016925 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:37 crc kubenswrapper[4975]: I0318 12:13:37.016472 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:13:37 crc kubenswrapper[4975]: E0318 12:13:37.016655 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:13:37 crc kubenswrapper[4975]: I0318 12:13:37.909680 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:37 crc kubenswrapper[4975]: I0318 12:13:37.909813 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:37 crc kubenswrapper[4975]: I0318 12:13:37.909841 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:37 crc kubenswrapper[4975]: E0318 12:13:37.910003 4975 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:13:37 crc kubenswrapper[4975]: E0318 12:13:37.910047 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-18 12:15:39.910034536 +0000 UTC m=+325.624435115 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 12:13:37 crc kubenswrapper[4975]: E0318 12:13:37.910189 4975 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:13:37 crc kubenswrapper[4975]: E0318 12:13:37.910232 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 12:15:39.910220761 +0000 UTC m=+325.624621340 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 12:13:37 crc kubenswrapper[4975]: E0318 12:13:37.910374 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:15:39.910340644 +0000 UTC m=+325.624741223 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:38 crc kubenswrapper[4975]: I0318 12:13:38.010611 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:38 crc kubenswrapper[4975]: I0318 12:13:38.010664 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:38 crc kubenswrapper[4975]: E0318 12:13:38.010830 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:13:38 crc kubenswrapper[4975]: E0318 12:13:38.010848 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:13:38 crc kubenswrapper[4975]: E0318 12:13:38.010890 4975 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:13:38 crc kubenswrapper[4975]: E0318 12:13:38.010936 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 12:15:40.010920984 +0000 UTC m=+325.725321573 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:13:38 crc kubenswrapper[4975]: E0318 12:13:38.011147 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 12:13:38 crc kubenswrapper[4975]: E0318 12:13:38.011166 4975 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 12:13:38 crc kubenswrapper[4975]: E0318 12:13:38.011175 4975 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:13:38 crc kubenswrapper[4975]: E0318 12:13:38.011201 4975 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 12:15:40.011192891 +0000 UTC m=+325.725593470 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 12:13:38 crc kubenswrapper[4975]: I0318 12:13:38.015537 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:38 crc kubenswrapper[4975]: I0318 12:13:38.015556 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:38 crc kubenswrapper[4975]: I0318 12:13:38.015537 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:38 crc kubenswrapper[4975]: E0318 12:13:38.015650 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:38 crc kubenswrapper[4975]: E0318 12:13:38.015722 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:38 crc kubenswrapper[4975]: E0318 12:13:38.016043 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:39 crc kubenswrapper[4975]: I0318 12:13:39.015840 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:13:39 crc kubenswrapper[4975]: E0318 12:13:39.016453 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:13:40 crc kubenswrapper[4975]: I0318 12:13:40.015901 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:40 crc kubenswrapper[4975]: I0318 12:13:40.015985 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:40 crc kubenswrapper[4975]: E0318 12:13:40.016014 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:40 crc kubenswrapper[4975]: I0318 12:13:40.016087 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:40 crc kubenswrapper[4975]: E0318 12:13:40.016135 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:40 crc kubenswrapper[4975]: E0318 12:13:40.016159 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:40 crc kubenswrapper[4975]: E0318 12:13:40.140678 4975 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 18 12:13:41 crc kubenswrapper[4975]: I0318 12:13:41.016116 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:13:41 crc kubenswrapper[4975]: E0318 12:13:41.016244 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:13:42 crc kubenswrapper[4975]: I0318 12:13:42.016154 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:42 crc kubenswrapper[4975]: I0318 12:13:42.016205 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:42 crc kubenswrapper[4975]: I0318 12:13:42.016206 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:42 crc kubenswrapper[4975]: E0318 12:13:42.016296 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:42 crc kubenswrapper[4975]: E0318 12:13:42.016400 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:42 crc kubenswrapper[4975]: E0318 12:13:42.016492 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:42 crc kubenswrapper[4975]: I0318 12:13:42.016829 4975 scope.go:117] "RemoveContainer" containerID="d611e36e7ad2ddcf4a3f09210f1890afd7f8254f6b0f716ab6236e7c786d485d" Mar 18 12:13:42 crc kubenswrapper[4975]: I0318 12:13:42.896160 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n9j7f_add6c8de-77cd-42e7-bf06-d2333b9392ea/kube-multus/1.log" Mar 18 12:13:42 crc kubenswrapper[4975]: I0318 12:13:42.896460 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n9j7f" event={"ID":"add6c8de-77cd-42e7-bf06-d2333b9392ea","Type":"ContainerStarted","Data":"eec94e160170d4702c17967a65a0f9bb6acd952d34ba3dcb551e0afebc06d098"} Mar 18 12:13:43 crc kubenswrapper[4975]: I0318 12:13:43.015941 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:13:43 crc kubenswrapper[4975]: E0318 12:13:43.016154 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:13:44 crc kubenswrapper[4975]: I0318 12:13:44.015434 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:44 crc kubenswrapper[4975]: I0318 12:13:44.015481 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:44 crc kubenswrapper[4975]: I0318 12:13:44.015439 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:44 crc kubenswrapper[4975]: E0318 12:13:44.015581 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 12:13:44 crc kubenswrapper[4975]: E0318 12:13:44.015633 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 12:13:44 crc kubenswrapper[4975]: E0318 12:13:44.015740 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:13:45 crc kubenswrapper[4975]: I0318 12:13:45.016169 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:13:45 crc kubenswrapper[4975]: E0318 12:13:45.017122 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-587nk" podUID="a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b" Mar 18 12:13:46 crc kubenswrapper[4975]: I0318 12:13:46.015408 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:13:46 crc kubenswrapper[4975]: I0318 12:13:46.015414 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:13:46 crc kubenswrapper[4975]: I0318 12:13:46.015451 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:13:46 crc kubenswrapper[4975]: I0318 12:13:46.019198 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 18 12:13:46 crc kubenswrapper[4975]: I0318 12:13:46.019448 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 18 12:13:46 crc kubenswrapper[4975]: I0318 12:13:46.019601 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 18 12:13:46 crc kubenswrapper[4975]: I0318 12:13:46.020587 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 18 12:13:47 crc kubenswrapper[4975]: I0318 12:13:47.015973 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:13:47 crc kubenswrapper[4975]: I0318 12:13:47.018331 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 18 12:13:47 crc kubenswrapper[4975]: I0318 12:13:47.021282 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 18 12:13:47 crc kubenswrapper[4975]: I0318 12:13:47.950932 4975 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 18 12:13:47 crc kubenswrapper[4975]: I0318 12:13:47.990041 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-m27h4"] Mar 18 12:13:47 crc kubenswrapper[4975]: I0318 12:13:47.990999 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:47 crc kubenswrapper[4975]: I0318 12:13:47.991480 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ftkrg"] Mar 18 12:13:47 crc kubenswrapper[4975]: I0318 12:13:47.991991 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5"] Mar 18 12:13:47 crc kubenswrapper[4975]: I0318 12:13:47.992320 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5" Mar 18 12:13:47 crc kubenswrapper[4975]: I0318 12:13:47.992006 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" Mar 18 12:13:47 crc kubenswrapper[4975]: I0318 12:13:47.994010 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-6hg9j"] Mar 18 12:13:47 crc kubenswrapper[4975]: I0318 12:13:47.994342 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-6hg9j" Mar 18 12:13:47 crc kubenswrapper[4975]: I0318 12:13:47.994650 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m2m4r"] Mar 18 12:13:47 crc kubenswrapper[4975]: I0318 12:13:47.995044 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m2m4r" Mar 18 12:13:47 crc kubenswrapper[4975]: I0318 12:13:47.995901 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc"] Mar 18 12:13:47 crc kubenswrapper[4975]: I0318 12:13:47.996289 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:47.997605 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5bf2w"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:47.998058 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-5bf2w" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.016587 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.016795 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.017033 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.017460 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.017600 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.017916 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.018119 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.018300 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.018458 4975 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"config" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.018891 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.019018 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.019188 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.019325 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.019502 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.019760 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.019886 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.020047 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.020184 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.020334 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.020470 4975 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.020942 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.021057 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.021338 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.021539 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.021698 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.022002 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.022185 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.022351 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.022464 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.022507 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 
12:13:48.022605 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-z69nv"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.022722 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.022982 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.023136 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.023147 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.023334 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mg2g2"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.023355 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.023399 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.023438 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.023740 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.023479 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.023582 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.023685 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.030545 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.030776 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.030964 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.031053 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.031143 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.031223 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.031301 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 18 12:13:48 crc 
kubenswrapper[4975]: I0318 12:13:48.031466 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.034992 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lt2wh"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.035677 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lt2wh" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.035701 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.036221 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.036573 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.036674 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.036742 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.037107 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.037302 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.037470 4975 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-serving-cert" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.037696 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.037940 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.037992 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.038091 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.038135 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.042058 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-xrd7l"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.042661 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qhjb2"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.042904 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.043198 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-d4zht"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.043410 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xrd7l" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.044513 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-d4zht" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.054145 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.055144 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-qhjb2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.057546 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-l2jhn"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.086014 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l2jhn" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.086574 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.088800 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.088934 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.089174 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.089335 4975 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.089538 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.089745 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.090062 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.090133 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.091988 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8nsht"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.092528 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.092669 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6p52g"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.092913 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.093205 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6p52g" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.095014 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.095088 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.095226 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.095481 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.095612 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.095661 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.095736 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.095623 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.095857 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.095962 4975 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.096092 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jbhj2"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.096628 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jbhj2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.096633 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.097146 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-knn48"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.097776 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-knn48" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.098424 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-78ldn"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.098855 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-78ldn" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.101054 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-clfc4"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.101773 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-clfc4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.102205 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dn59g"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.102758 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-dn59g" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.106060 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.107433 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7cqtt"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.109108 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.110556 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c822ed4-6611-4cef-8002-972e9782d403-config\") pod \"route-controller-manager-6576b87f9c-j56l5\" (UID: \"5c822ed4-6611-4cef-8002-972e9782d403\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.110608 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/311fa18b-fde1-4390-9682-75c836813f88-console-oauth-config\") pod \"console-f9d7485db-z69nv\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 
12:13:48.110639 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/311fa18b-fde1-4390-9682-75c836813f88-console-config\") pod \"console-f9d7485db-z69nv\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.110670 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnp6z\" (UniqueName: \"kubernetes.io/projected/1d1be24c-eb6f-4df4-812d-491ea940ee60-kube-api-access-gnp6z\") pod \"openshift-config-operator-7777fb866f-l2jhn\" (UID: \"1d1be24c-eb6f-4df4-812d-491ea940ee60\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l2jhn" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.110695 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-config\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.110711 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-etcd-client\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.110733 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eabde416-404d-4874-b690-53897068b5cd-audit-dir\") pod \"apiserver-7bbb656c7d-9bsnc\" (UID: \"eabde416-404d-4874-b690-53897068b5cd\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.110779 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.110804 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14b26921-b7c6-4f20-86af-abb1e8eb339e-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-m2m4r\" (UID: \"14b26921-b7c6-4f20-86af-abb1e8eb339e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m2m4r" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.110829 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlg74\" (UniqueName: \"kubernetes.io/projected/ac556a39-897a-44fe-b537-bdaf85c3f437-kube-api-access-nlg74\") pod \"controller-manager-879f6c89f-ftkrg\" (UID: \"ac556a39-897a-44fe-b537-bdaf85c3f437\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.110888 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkd84\" (UniqueName: \"kubernetes.io/projected/22d232b9-7867-4587-9c0b-d6adba1cd8bd-kube-api-access-wkd84\") pod \"machine-api-operator-5694c8668f-6hg9j\" (UID: \"22d232b9-7867-4587-9c0b-d6adba1cd8bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6hg9j" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.110915 4975 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eabde416-404d-4874-b690-53897068b5cd-audit-policies\") pod \"apiserver-7bbb656c7d-9bsnc\" (UID: \"eabde416-404d-4874-b690-53897068b5cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.110938 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/865e4345-8c11-4402-b673-93658fe66ced-trusted-ca\") pod \"console-operator-58897d9998-5bf2w\" (UID: \"865e4345-8c11-4402-b673-93658fe66ced\") " pod="openshift-console-operator/console-operator-58897d9998-5bf2w" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.110976 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac556a39-897a-44fe-b537-bdaf85c3f437-config\") pod \"controller-manager-879f6c89f-ftkrg\" (UID: \"ac556a39-897a-44fe-b537-bdaf85c3f437\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.111023 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zpxt\" (UniqueName: \"kubernetes.io/projected/14b26921-b7c6-4f20-86af-abb1e8eb339e-kube-api-access-8zpxt\") pod \"openshift-apiserver-operator-796bbdcf4f-m2m4r\" (UID: \"14b26921-b7c6-4f20-86af-abb1e8eb339e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m2m4r" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.111060 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/318ad1a7-abe7-4e8d-bf62-cec22711b081-config\") pod \"machine-approver-56656f9798-xrd7l\" (UID: 
\"318ad1a7-abe7-4e8d-bf62-cec22711b081\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xrd7l" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.111085 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nw92\" (UniqueName: \"kubernetes.io/projected/318ad1a7-abe7-4e8d-bf62-cec22711b081-kube-api-access-6nw92\") pod \"machine-approver-56656f9798-xrd7l\" (UID: \"318ad1a7-abe7-4e8d-bf62-cec22711b081\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xrd7l" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.111108 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/eabde416-404d-4874-b690-53897068b5cd-encryption-config\") pod \"apiserver-7bbb656c7d-9bsnc\" (UID: \"eabde416-404d-4874-b690-53897068b5cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.111134 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.111160 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a468175-a610-4d65-8ca9-a22f91d8d3fc-metrics-tls\") pod \"dns-operator-744455d44c-qhjb2\" (UID: \"1a468175-a610-4d65-8ca9-a22f91d8d3fc\") " pod="openshift-dns-operator/dns-operator-744455d44c-qhjb2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.111190 4975 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eabde416-404d-4874-b690-53897068b5cd-serving-cert\") pod \"apiserver-7bbb656c7d-9bsnc\" (UID: \"eabde416-404d-4874-b690-53897068b5cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.111262 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/22d232b9-7867-4587-9c0b-d6adba1cd8bd-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-6hg9j\" (UID: \"22d232b9-7867-4587-9c0b-d6adba1cd8bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6hg9j" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.111310 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/865e4345-8c11-4402-b673-93658fe66ced-serving-cert\") pod \"console-operator-58897d9998-5bf2w\" (UID: \"865e4345-8c11-4402-b673-93658fe66ced\") " pod="openshift-console-operator/console-operator-58897d9998-5bf2w" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.111448 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m72zt\" (UniqueName: \"kubernetes.io/projected/eabde416-404d-4874-b690-53897068b5cd-kube-api-access-m72zt\") pod \"apiserver-7bbb656c7d-9bsnc\" (UID: \"eabde416-404d-4874-b690-53897068b5cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.111651 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.111649 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/311fa18b-fde1-4390-9682-75c836813f88-trusted-ca-bundle\") pod \"console-f9d7485db-z69nv\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.111772 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgcgf\" (UniqueName: \"kubernetes.io/projected/1a468175-a610-4d65-8ca9-a22f91d8d3fc-kube-api-access-kgcgf\") pod \"dns-operator-744455d44c-qhjb2\" (UID: \"1a468175-a610-4d65-8ca9-a22f91d8d3fc\") " pod="openshift-dns-operator/dns-operator-744455d44c-qhjb2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.111832 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d7pl\" (UniqueName: \"kubernetes.io/projected/865e4345-8c11-4402-b673-93658fe66ced-kube-api-access-2d7pl\") pod \"console-operator-58897d9998-5bf2w\" (UID: \"865e4345-8c11-4402-b673-93658fe66ced\") " pod="openshift-console-operator/console-operator-58897d9998-5bf2w" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.111858 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac556a39-897a-44fe-b537-bdaf85c3f437-serving-cert\") pod \"controller-manager-879f6c89f-ftkrg\" (UID: \"ac556a39-897a-44fe-b537-bdaf85c3f437\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.111896 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22d232b9-7867-4587-9c0b-d6adba1cd8bd-config\") pod \"machine-api-operator-5694c8668f-6hg9j\" (UID: \"22d232b9-7867-4587-9c0b-d6adba1cd8bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6hg9j" Mar 18 12:13:48 crc 
kubenswrapper[4975]: I0318 12:13:48.111920 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/311fa18b-fde1-4390-9682-75c836813f88-oauth-serving-cert\") pod \"console-f9d7485db-z69nv\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.111945 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh2c2\" (UniqueName: \"kubernetes.io/projected/b2b5a18f-c7da-47d4-b3a3-2a3917e63c89-kube-api-access-dh2c2\") pod \"openshift-controller-manager-operator-756b6f6bc6-lt2wh\" (UID: \"b2b5a18f-c7da-47d4-b3a3-2a3917e63c89\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lt2wh" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.111972 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c822ed4-6611-4cef-8002-972e9782d403-serving-cert\") pod \"route-controller-manager-6576b87f9c-j56l5\" (UID: \"5c822ed4-6611-4cef-8002-972e9782d403\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.111994 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112015 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/318ad1a7-abe7-4e8d-bf62-cec22711b081-auth-proxy-config\") pod \"machine-approver-56656f9798-xrd7l\" (UID: \"318ad1a7-abe7-4e8d-bf62-cec22711b081\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xrd7l" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112034 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/eabde416-404d-4874-b690-53897068b5cd-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9bsnc\" (UID: \"eabde416-404d-4874-b690-53897068b5cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112052 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/865e4345-8c11-4402-b673-93658fe66ced-config\") pod \"console-operator-58897d9998-5bf2w\" (UID: \"865e4345-8c11-4402-b673-93658fe66ced\") " pod="openshift-console-operator/console-operator-58897d9998-5bf2w" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112096 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112121 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2b5a18f-c7da-47d4-b3a3-2a3917e63c89-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lt2wh\" (UID: \"b2b5a18f-c7da-47d4-b3a3-2a3917e63c89\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lt2wh" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112144 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-node-pullsecrets\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112169 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/311fa18b-fde1-4390-9682-75c836813f88-console-serving-cert\") pod \"console-f9d7485db-z69nv\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112190 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs9ss\" (UniqueName: \"kubernetes.io/projected/311fa18b-fde1-4390-9682-75c836813f88-kube-api-access-cs9ss\") pod \"console-f9d7485db-z69nv\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112233 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grm7t\" (UniqueName: \"kubernetes.io/projected/5c822ed4-6611-4cef-8002-972e9782d403-kube-api-access-grm7t\") pod \"route-controller-manager-6576b87f9c-j56l5\" (UID: \"5c822ed4-6611-4cef-8002-972e9782d403\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112254 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac556a39-897a-44fe-b537-bdaf85c3f437-client-ca\") pod \"controller-manager-879f6c89f-ftkrg\" (UID: \"ac556a39-897a-44fe-b537-bdaf85c3f437\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112273 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/14d667ef-8c80-42f5-b119-1bae87e39be7-audit-dir\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112299 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/eabde416-404d-4874-b690-53897068b5cd-etcd-client\") pod \"apiserver-7bbb656c7d-9bsnc\" (UID: \"eabde416-404d-4874-b690-53897068b5cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112320 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-etcd-serving-ca\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112386 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d1be24c-eb6f-4df4-812d-491ea940ee60-serving-cert\") pod \"openshift-config-operator-7777fb866f-l2jhn\" (UID: \"1d1be24c-eb6f-4df4-812d-491ea940ee60\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l2jhn" Mar 18 12:13:48 crc kubenswrapper[4975]: 
I0318 12:13:48.112408 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/318ad1a7-abe7-4e8d-bf62-cec22711b081-machine-approver-tls\") pod \"machine-approver-56656f9798-xrd7l\" (UID: \"318ad1a7-abe7-4e8d-bf62-cec22711b081\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xrd7l" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112446 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-audit-dir\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112488 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112509 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/311fa18b-fde1-4390-9682-75c836813f88-service-ca\") pod \"console-f9d7485db-z69nv\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112532 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1d1be24c-eb6f-4df4-812d-491ea940ee60-available-featuregates\") pod 
\"openshift-config-operator-7777fb866f-l2jhn\" (UID: \"1d1be24c-eb6f-4df4-812d-491ea940ee60\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l2jhn" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112553 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-audit\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112594 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c822ed4-6611-4cef-8002-972e9782d403-client-ca\") pod \"route-controller-manager-6576b87f9c-j56l5\" (UID: \"5c822ed4-6611-4cef-8002-972e9782d403\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112617 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112638 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112661 4975 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-serving-cert\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112683 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac556a39-897a-44fe-b537-bdaf85c3f437-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ftkrg\" (UID: \"ac556a39-897a-44fe-b537-bdaf85c3f437\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112703 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112726 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112747 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbnhx\" (UniqueName: \"kubernetes.io/projected/14d667ef-8c80-42f5-b119-1bae87e39be7-kube-api-access-vbnhx\") pod \"oauth-openshift-558db77b4-mg2g2\" 
(UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112788 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/14d667ef-8c80-42f5-b119-1bae87e39be7-audit-policies\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112811 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-image-import-ca\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112837 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2b5a18f-c7da-47d4-b3a3-2a3917e63c89-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-lt2wh\" (UID: \"b2b5a18f-c7da-47d4-b3a3-2a3917e63c89\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lt2wh" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112882 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eabde416-404d-4874-b690-53897068b5cd-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9bsnc\" (UID: \"eabde416-404d-4874-b690-53897068b5cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112908 4975 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsp9l\" (UniqueName: \"kubernetes.io/projected/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-kube-api-access-jsp9l\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112913 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112933 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/22d232b9-7867-4587-9c0b-d6adba1cd8bd-images\") pod \"machine-api-operator-5694c8668f-6hg9j\" (UID: \"22d232b9-7867-4587-9c0b-d6adba1cd8bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6hg9j" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112966 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14b26921-b7c6-4f20-86af-abb1e8eb339e-config\") pod \"openshift-apiserver-operator-796bbdcf4f-m2m4r\" (UID: \"14b26921-b7c6-4f20-86af-abb1e8eb339e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m2m4r" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.112988 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.113011 4975 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgnzb\" (UniqueName: \"kubernetes.io/projected/36e42dcf-4953-46b4-8459-e2e72e03895c-kube-api-access-bgnzb\") pod \"downloads-7954f5f757-d4zht\" (UID: \"36e42dcf-4953-46b4-8459-e2e72e03895c\") " pod="openshift-console/downloads-7954f5f757-d4zht" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.113037 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-encryption-config\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.113061 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.113088 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.113673 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.114577 4975 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.116107 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.116219 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.117728 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.117769 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9cnqk"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.118674 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9cnqk" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.119040 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n2zjt"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.119459 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n2zjt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.119875 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7cqtt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.121327 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.122478 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.122787 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkb4h"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.123504 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkb4h" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.144369 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.144638 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.145819 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.146730 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.147010 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 
12:13:48.149598 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.150688 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.152949 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.168587 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k27ll"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.181227 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.181701 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.181964 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.182080 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ftkrg"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.182198 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k27ll" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.182422 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q2bgt"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.183160 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.183471 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ln9d6"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.184240 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.184498 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ln9d6" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.185336 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-b4zf9"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.185974 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4zf9" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.186414 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563932-9m82r"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.187272 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563932-9m82r" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.187551 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dtv2h"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.188368 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dtv2h" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.189041 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wrn8m"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.189620 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wrn8m" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.191829 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ws7cz"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.194167 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.194822 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6sclc"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.194981 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ws7cz" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.196118 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.196141 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563920-p22bn"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.196210 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6sclc" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.196629 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m2m4r"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.196706 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-p22bn" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.198169 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r26rl"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.198610 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r26rl" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.199834 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xth7t"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.213495 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214290 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c822ed4-6611-4cef-8002-972e9782d403-config\") pod \"route-controller-manager-6576b87f9c-j56l5\" (UID: \"5c822ed4-6611-4cef-8002-972e9782d403\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214314 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/311fa18b-fde1-4390-9682-75c836813f88-console-oauth-config\") pod 
\"console-f9d7485db-z69nv\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214333 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/311fa18b-fde1-4390-9682-75c836813f88-console-config\") pod \"console-f9d7485db-z69nv\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214356 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v5rz\" (UniqueName: \"kubernetes.io/projected/907a3641-9861-4891-a145-a0d36cb413b3-kube-api-access-4v5rz\") pod \"service-ca-operator-777779d784-ln9d6\" (UID: \"907a3641-9861-4891-a145-a0d36cb413b3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ln9d6" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214374 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03aa4bc1-7712-418f-b56d-9686e85ba1d2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-clfc4\" (UID: \"03aa4bc1-7712-418f-b56d-9686e85ba1d2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-clfc4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214391 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnp6z\" (UniqueName: \"kubernetes.io/projected/1d1be24c-eb6f-4df4-812d-491ea940ee60-kube-api-access-gnp6z\") pod \"openshift-config-operator-7777fb866f-l2jhn\" (UID: \"1d1be24c-eb6f-4df4-812d-491ea940ee60\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l2jhn" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214407 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-config\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214421 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-etcd-client\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214435 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eabde416-404d-4874-b690-53897068b5cd-audit-dir\") pod \"apiserver-7bbb656c7d-9bsnc\" (UID: \"eabde416-404d-4874-b690-53897068b5cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214459 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/907a3641-9861-4891-a145-a0d36cb413b3-config\") pod \"service-ca-operator-777779d784-ln9d6\" (UID: \"907a3641-9861-4891-a145-a0d36cb413b3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ln9d6" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214476 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wrn8m\" (UID: \"bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wrn8m" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 
12:13:48.214493 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214517 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14b26921-b7c6-4f20-86af-abb1e8eb339e-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-m2m4r\" (UID: \"14b26921-b7c6-4f20-86af-abb1e8eb339e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m2m4r" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214533 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlg74\" (UniqueName: \"kubernetes.io/projected/ac556a39-897a-44fe-b537-bdaf85c3f437-kube-api-access-nlg74\") pod \"controller-manager-879f6c89f-ftkrg\" (UID: \"ac556a39-897a-44fe-b537-bdaf85c3f437\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214549 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3ffcf02-5ead-4e06-b402-9b48c21f2d36-proxy-tls\") pod \"machine-config-controller-84d6567774-9cnqk\" (UID: \"b3ffcf02-5ead-4e06-b402-9b48c21f2d36\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9cnqk" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214565 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkd84\" (UniqueName: \"kubernetes.io/projected/22d232b9-7867-4587-9c0b-d6adba1cd8bd-kube-api-access-wkd84\") pod 
\"machine-api-operator-5694c8668f-6hg9j\" (UID: \"22d232b9-7867-4587-9c0b-d6adba1cd8bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6hg9j" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214581 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eabde416-404d-4874-b690-53897068b5cd-audit-policies\") pod \"apiserver-7bbb656c7d-9bsnc\" (UID: \"eabde416-404d-4874-b690-53897068b5cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214596 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/865e4345-8c11-4402-b673-93658fe66ced-trusted-ca\") pod \"console-operator-58897d9998-5bf2w\" (UID: \"865e4345-8c11-4402-b673-93658fe66ced\") " pod="openshift-console-operator/console-operator-58897d9998-5bf2w" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214614 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zpxt\" (UniqueName: \"kubernetes.io/projected/14b26921-b7c6-4f20-86af-abb1e8eb339e-kube-api-access-8zpxt\") pod \"openshift-apiserver-operator-796bbdcf4f-m2m4r\" (UID: \"14b26921-b7c6-4f20-86af-abb1e8eb339e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m2m4r" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214638 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac556a39-897a-44fe-b537-bdaf85c3f437-config\") pod \"controller-manager-879f6c89f-ftkrg\" (UID: \"ac556a39-897a-44fe-b537-bdaf85c3f437\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214660 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/318ad1a7-abe7-4e8d-bf62-cec22711b081-config\") pod \"machine-approver-56656f9798-xrd7l\" (UID: \"318ad1a7-abe7-4e8d-bf62-cec22711b081\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xrd7l" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214686 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nw92\" (UniqueName: \"kubernetes.io/projected/318ad1a7-abe7-4e8d-bf62-cec22711b081-kube-api-access-6nw92\") pod \"machine-approver-56656f9798-xrd7l\" (UID: \"318ad1a7-abe7-4e8d-bf62-cec22711b081\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xrd7l" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214702 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/eabde416-404d-4874-b690-53897068b5cd-encryption-config\") pod \"apiserver-7bbb656c7d-9bsnc\" (UID: \"eabde416-404d-4874-b690-53897068b5cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214726 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/22d232b9-7867-4587-9c0b-d6adba1cd8bd-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-6hg9j\" (UID: \"22d232b9-7867-4587-9c0b-d6adba1cd8bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6hg9j" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214748 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: 
I0318 12:13:48.214769 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a468175-a610-4d65-8ca9-a22f91d8d3fc-metrics-tls\") pod \"dns-operator-744455d44c-qhjb2\" (UID: \"1a468175-a610-4d65-8ca9-a22f91d8d3fc\") " pod="openshift-dns-operator/dns-operator-744455d44c-qhjb2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214788 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eabde416-404d-4874-b690-53897068b5cd-serving-cert\") pod \"apiserver-7bbb656c7d-9bsnc\" (UID: \"eabde416-404d-4874-b690-53897068b5cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214812 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5mww\" (UniqueName: \"kubernetes.io/projected/bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc-kube-api-access-b5mww\") pod \"cluster-image-registry-operator-dc59b4c8b-wrn8m\" (UID: \"bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wrn8m" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214837 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m72zt\" (UniqueName: \"kubernetes.io/projected/eabde416-404d-4874-b690-53897068b5cd-kube-api-access-m72zt\") pod \"apiserver-7bbb656c7d-9bsnc\" (UID: \"eabde416-404d-4874-b690-53897068b5cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214859 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/865e4345-8c11-4402-b673-93658fe66ced-serving-cert\") pod \"console-operator-58897d9998-5bf2w\" (UID: \"865e4345-8c11-4402-b673-93658fe66ced\") " 
pod="openshift-console-operator/console-operator-58897d9998-5bf2w" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214913 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/311fa18b-fde1-4390-9682-75c836813f88-trusted-ca-bundle\") pod \"console-f9d7485db-z69nv\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214935 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgcgf\" (UniqueName: \"kubernetes.io/projected/1a468175-a610-4d65-8ca9-a22f91d8d3fc-kube-api-access-kgcgf\") pod \"dns-operator-744455d44c-qhjb2\" (UID: \"1a468175-a610-4d65-8ca9-a22f91d8d3fc\") " pod="openshift-dns-operator/dns-operator-744455d44c-qhjb2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214959 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d7pl\" (UniqueName: \"kubernetes.io/projected/865e4345-8c11-4402-b673-93658fe66ced-kube-api-access-2d7pl\") pod \"console-operator-58897d9998-5bf2w\" (UID: \"865e4345-8c11-4402-b673-93658fe66ced\") " pod="openshift-console-operator/console-operator-58897d9998-5bf2w" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.214988 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c822ed4-6611-4cef-8002-972e9782d403-serving-cert\") pod \"route-controller-manager-6576b87f9c-j56l5\" (UID: \"5c822ed4-6611-4cef-8002-972e9782d403\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215003 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac556a39-897a-44fe-b537-bdaf85c3f437-serving-cert\") pod 
\"controller-manager-879f6c89f-ftkrg\" (UID: \"ac556a39-897a-44fe-b537-bdaf85c3f437\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215020 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22d232b9-7867-4587-9c0b-d6adba1cd8bd-config\") pod \"machine-api-operator-5694c8668f-6hg9j\" (UID: \"22d232b9-7867-4587-9c0b-d6adba1cd8bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6hg9j" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215034 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/311fa18b-fde1-4390-9682-75c836813f88-oauth-serving-cert\") pod \"console-f9d7485db-z69nv\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215061 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh2c2\" (UniqueName: \"kubernetes.io/projected/b2b5a18f-c7da-47d4-b3a3-2a3917e63c89-kube-api-access-dh2c2\") pod \"openshift-controller-manager-operator-756b6f6bc6-lt2wh\" (UID: \"b2b5a18f-c7da-47d4-b3a3-2a3917e63c89\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lt2wh" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215077 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215091 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/318ad1a7-abe7-4e8d-bf62-cec22711b081-auth-proxy-config\") pod \"machine-approver-56656f9798-xrd7l\" (UID: \"318ad1a7-abe7-4e8d-bf62-cec22711b081\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xrd7l" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215112 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/eabde416-404d-4874-b690-53897068b5cd-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9bsnc\" (UID: \"eabde416-404d-4874-b690-53897068b5cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215127 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/865e4345-8c11-4402-b673-93658fe66ced-config\") pod \"console-operator-58897d9998-5bf2w\" (UID: \"865e4345-8c11-4402-b673-93658fe66ced\") " pod="openshift-console-operator/console-operator-58897d9998-5bf2w" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215144 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wrn8m\" (UID: \"bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wrn8m" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215162 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: 
\"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215178 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2b5a18f-c7da-47d4-b3a3-2a3917e63c89-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lt2wh\" (UID: \"b2b5a18f-c7da-47d4-b3a3-2a3917e63c89\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lt2wh" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215193 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-node-pullsecrets\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215208 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wrn8m\" (UID: \"bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wrn8m" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215225 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/311fa18b-fde1-4390-9682-75c836813f88-console-serving-cert\") pod \"console-f9d7485db-z69nv\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215242 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs9ss\" (UniqueName: 
\"kubernetes.io/projected/311fa18b-fde1-4390-9682-75c836813f88-kube-api-access-cs9ss\") pod \"console-f9d7485db-z69nv\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215265 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grm7t\" (UniqueName: \"kubernetes.io/projected/5c822ed4-6611-4cef-8002-972e9782d403-kube-api-access-grm7t\") pod \"route-controller-manager-6576b87f9c-j56l5\" (UID: \"5c822ed4-6611-4cef-8002-972e9782d403\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215282 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac556a39-897a-44fe-b537-bdaf85c3f437-client-ca\") pod \"controller-manager-879f6c89f-ftkrg\" (UID: \"ac556a39-897a-44fe-b537-bdaf85c3f437\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215302 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/14d667ef-8c80-42f5-b119-1bae87e39be7-audit-dir\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215324 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/eabde416-404d-4874-b690-53897068b5cd-etcd-client\") pod \"apiserver-7bbb656c7d-9bsnc\" (UID: \"eabde416-404d-4874-b690-53897068b5cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215348 4975 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03aa4bc1-7712-418f-b56d-9686e85ba1d2-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-clfc4\" (UID: \"03aa4bc1-7712-418f-b56d-9686e85ba1d2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-clfc4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215371 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-etcd-serving-ca\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215394 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215414 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d1be24c-eb6f-4df4-812d-491ea940ee60-serving-cert\") pod \"openshift-config-operator-7777fb866f-l2jhn\" (UID: \"1d1be24c-eb6f-4df4-812d-491ea940ee60\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l2jhn" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215433 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/318ad1a7-abe7-4e8d-bf62-cec22711b081-machine-approver-tls\") pod \"machine-approver-56656f9798-xrd7l\" (UID: \"318ad1a7-abe7-4e8d-bf62-cec22711b081\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xrd7l" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215453 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-audit-dir\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215477 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/311fa18b-fde1-4390-9682-75c836813f88-service-ca\") pod \"console-f9d7485db-z69nv\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215501 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1d1be24c-eb6f-4df4-812d-491ea940ee60-available-featuregates\") pod \"openshift-config-operator-7777fb866f-l2jhn\" (UID: \"1d1be24c-eb6f-4df4-812d-491ea940ee60\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l2jhn" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215515 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-audit\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215534 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c822ed4-6611-4cef-8002-972e9782d403-client-ca\") pod \"route-controller-manager-6576b87f9c-j56l5\" (UID: 
\"5c822ed4-6611-4cef-8002-972e9782d403\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215555 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215575 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215596 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215617 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-serving-cert\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215641 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/ac556a39-897a-44fe-b537-bdaf85c3f437-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ftkrg\" (UID: \"ac556a39-897a-44fe-b537-bdaf85c3f437\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215665 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215680 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbnhx\" (UniqueName: \"kubernetes.io/projected/14d667ef-8c80-42f5-b119-1bae87e39be7-kube-api-access-vbnhx\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215702 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b3ffcf02-5ead-4e06-b402-9b48c21f2d36-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9cnqk\" (UID: \"b3ffcf02-5ead-4e06-b402-9b48c21f2d36\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9cnqk" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215738 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-image-import-ca\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 
crc kubenswrapper[4975]: I0318 12:13:48.215763 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/14d667ef-8c80-42f5-b119-1bae87e39be7-audit-policies\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215783 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2b5a18f-c7da-47d4-b3a3-2a3917e63c89-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-lt2wh\" (UID: \"b2b5a18f-c7da-47d4-b3a3-2a3917e63c89\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lt2wh" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215803 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/907a3641-9861-4891-a145-a0d36cb413b3-serving-cert\") pod \"service-ca-operator-777779d784-ln9d6\" (UID: \"907a3641-9861-4891-a145-a0d36cb413b3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ln9d6" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215825 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03aa4bc1-7712-418f-b56d-9686e85ba1d2-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-clfc4\" (UID: \"03aa4bc1-7712-418f-b56d-9686e85ba1d2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-clfc4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215850 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsp9l\" (UniqueName: 
\"kubernetes.io/projected/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-kube-api-access-jsp9l\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215892 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eabde416-404d-4874-b690-53897068b5cd-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9bsnc\" (UID: \"eabde416-404d-4874-b690-53897068b5cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215918 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/22d232b9-7867-4587-9c0b-d6adba1cd8bd-images\") pod \"machine-api-operator-5694c8668f-6hg9j\" (UID: \"22d232b9-7867-4587-9c0b-d6adba1cd8bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6hg9j" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215940 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjdp7\" (UniqueName: \"kubernetes.io/projected/b3ffcf02-5ead-4e06-b402-9b48c21f2d36-kube-api-access-rjdp7\") pod \"machine-config-controller-84d6567774-9cnqk\" (UID: \"b3ffcf02-5ead-4e06-b402-9b48c21f2d36\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9cnqk" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215966 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14b26921-b7c6-4f20-86af-abb1e8eb339e-config\") pod \"openshift-apiserver-operator-796bbdcf4f-m2m4r\" (UID: \"14b26921-b7c6-4f20-86af-abb1e8eb339e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m2m4r" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.215989 
4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.216014 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgnzb\" (UniqueName: \"kubernetes.io/projected/36e42dcf-4953-46b4-8459-e2e72e03895c-kube-api-access-bgnzb\") pod \"downloads-7954f5f757-d4zht\" (UID: \"36e42dcf-4953-46b4-8459-e2e72e03895c\") " pod="openshift-console/downloads-7954f5f757-d4zht" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.216039 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-encryption-config\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.216062 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.216085 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mg2g2\" 
(UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.216120 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-cxbbs"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.216966 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-6hg9j"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.216994 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.217014 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lxtks"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.217635 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-m27h4"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.217659 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-v7w9z"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.218337 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c822ed4-6611-4cef-8002-972e9782d403-config\") pod \"route-controller-manager-6576b87f9c-j56l5\" (UID: \"5c822ed4-6611-4cef-8002-972e9782d403\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.219013 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-z69nv"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.219067 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-l2jhn"] Mar 18 12:13:48 crc 
kubenswrapper[4975]: I0318 12:13:48.219082 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mg2g2"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.219095 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9cnqk"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.219032 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-xth7t" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.219386 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k27ll"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.219662 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-cxbbs" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.219801 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lxtks" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.219892 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2b5a18f-c7da-47d4-b3a3-2a3917e63c89-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lt2wh\" (UID: \"b2b5a18f-c7da-47d4-b3a3-2a3917e63c89\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lt2wh" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.219898 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-v7w9z" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.220037 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-node-pullsecrets\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.220057 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.220586 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/311fa18b-fde1-4390-9682-75c836813f88-console-config\") pod \"console-f9d7485db-z69nv\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.220800 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-config\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.220924 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eabde416-404d-4874-b690-53897068b5cd-audit-dir\") pod \"apiserver-7bbb656c7d-9bsnc\" (UID: \"eabde416-404d-4874-b690-53897068b5cd\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.221007 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-d4zht"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.221008 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac556a39-897a-44fe-b537-bdaf85c3f437-client-ca\") pod \"controller-manager-879f6c89f-ftkrg\" (UID: \"ac556a39-897a-44fe-b537-bdaf85c3f437\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.221046 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/14d667ef-8c80-42f5-b119-1bae87e39be7-audit-dir\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.221563 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-etcd-serving-ca\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.221751 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eabde416-404d-4874-b690-53897068b5cd-audit-policies\") pod \"apiserver-7bbb656c7d-9bsnc\" (UID: \"eabde416-404d-4874-b690-53897068b5cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.221977 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.222986 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/865e4345-8c11-4402-b673-93658fe66ced-trusted-ca\") pod \"console-operator-58897d9998-5bf2w\" (UID: \"865e4345-8c11-4402-b673-93658fe66ced\") " pod="openshift-console-operator/console-operator-58897d9998-5bf2w" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.224101 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac556a39-897a-44fe-b537-bdaf85c3f437-config\") pod \"controller-manager-879f6c89f-ftkrg\" (UID: \"ac556a39-897a-44fe-b537-bdaf85c3f437\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.224610 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6sclc"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.224632 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5bf2w"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.224941 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/318ad1a7-abe7-4e8d-bf62-cec22711b081-config\") pod \"machine-approver-56656f9798-xrd7l\" (UID: \"318ad1a7-abe7-4e8d-bf62-cec22711b081\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xrd7l" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.230595 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkb4h"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.230648 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lt2wh"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.230663 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-knn48"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.231315 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/311fa18b-fde1-4390-9682-75c836813f88-console-oauth-config\") pod \"console-f9d7485db-z69nv\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.232353 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1d1be24c-eb6f-4df4-812d-491ea940ee60-available-featuregates\") pod \"openshift-config-operator-7777fb866f-l2jhn\" (UID: \"1d1be24c-eb6f-4df4-812d-491ea940ee60\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l2jhn" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.232638 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.232709 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-audit-dir\") pod 
\"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.235891 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-audit\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.240034 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/eabde416-404d-4874-b690-53897068b5cd-etcd-client\") pod \"apiserver-7bbb656c7d-9bsnc\" (UID: \"eabde416-404d-4874-b690-53897068b5cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.240326 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.240502 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac556a39-897a-44fe-b537-bdaf85c3f437-serving-cert\") pod \"controller-manager-879f6c89f-ftkrg\" (UID: \"ac556a39-897a-44fe-b537-bdaf85c3f437\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.240793 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac556a39-897a-44fe-b537-bdaf85c3f437-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ftkrg\" (UID: 
\"ac556a39-897a-44fe-b537-bdaf85c3f437\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.240917 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c822ed4-6611-4cef-8002-972e9782d403-client-ca\") pod \"route-controller-manager-6576b87f9c-j56l5\" (UID: \"5c822ed4-6611-4cef-8002-972e9782d403\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.241388 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dtv2h"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.241899 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.242236 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-image-import-ca\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.242818 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/14d667ef-8c80-42f5-b119-1bae87e39be7-audit-policies\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.243496 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eabde416-404d-4874-b690-53897068b5cd-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9bsnc\" (UID: 
\"eabde416-404d-4874-b690-53897068b5cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.243993 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/311fa18b-fde1-4390-9682-75c836813f88-service-ca\") pod \"console-f9d7485db-z69nv\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.244572 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/22d232b9-7867-4587-9c0b-d6adba1cd8bd-images\") pod \"machine-api-operator-5694c8668f-6hg9j\" (UID: \"22d232b9-7867-4587-9c0b-d6adba1cd8bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6hg9j" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.244636 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.245279 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14b26921-b7c6-4f20-86af-abb1e8eb339e-config\") pod \"openshift-apiserver-operator-796bbdcf4f-m2m4r\" (UID: \"14b26921-b7c6-4f20-86af-abb1e8eb339e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m2m4r" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.244958 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/311fa18b-fde1-4390-9682-75c836813f88-console-serving-cert\") pod 
\"console-f9d7485db-z69nv\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.246221 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-etcd-client\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.246401 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/318ad1a7-abe7-4e8d-bf62-cec22711b081-machine-approver-tls\") pod \"machine-approver-56656f9798-xrd7l\" (UID: \"318ad1a7-abe7-4e8d-bf62-cec22711b081\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xrd7l" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.246761 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/311fa18b-fde1-4390-9682-75c836813f88-trusted-ca-bundle\") pod \"console-f9d7485db-z69nv\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.247393 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/311fa18b-fde1-4390-9682-75c836813f88-oauth-serving-cert\") pod \"console-f9d7485db-z69nv\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.248063 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22d232b9-7867-4587-9c0b-d6adba1cd8bd-config\") pod \"machine-api-operator-5694c8668f-6hg9j\" (UID: 
\"22d232b9-7867-4587-9c0b-d6adba1cd8bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6hg9j" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.249761 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/865e4345-8c11-4402-b673-93658fe66ced-config\") pod \"console-operator-58897d9998-5bf2w\" (UID: \"865e4345-8c11-4402-b673-93658fe66ced\") " pod="openshift-console-operator/console-operator-58897d9998-5bf2w" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.249793 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.250399 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-serving-cert\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.251019 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-encryption-config\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.251286 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/eabde416-404d-4874-b690-53897068b5cd-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9bsnc\" (UID: 
\"eabde416-404d-4874-b690-53897068b5cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.251567 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6p52g"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.251802 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.252559 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.252792 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2b5a18f-c7da-47d4-b3a3-2a3917e63c89-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-lt2wh\" (UID: \"b2b5a18f-c7da-47d4-b3a3-2a3917e63c89\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lt2wh" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.252799 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14b26921-b7c6-4f20-86af-abb1e8eb339e-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-m2m4r\" (UID: 
\"14b26921-b7c6-4f20-86af-abb1e8eb339e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m2m4r" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.253431 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.254165 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/865e4345-8c11-4402-b673-93658fe66ced-serving-cert\") pod \"console-operator-58897d9998-5bf2w\" (UID: \"865e4345-8c11-4402-b673-93658fe66ced\") " pod="openshift-console-operator/console-operator-58897d9998-5bf2w" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.254567 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eabde416-404d-4874-b690-53897068b5cd-serving-cert\") pod \"apiserver-7bbb656c7d-9bsnc\" (UID: \"eabde416-404d-4874-b690-53897068b5cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.255546 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.255891 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/318ad1a7-abe7-4e8d-bf62-cec22711b081-auth-proxy-config\") pod \"machine-approver-56656f9798-xrd7l\" (UID: \"318ad1a7-abe7-4e8d-bf62-cec22711b081\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xrd7l" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.258473 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.259135 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c822ed4-6611-4cef-8002-972e9782d403-serving-cert\") pod \"route-controller-manager-6576b87f9c-j56l5\" (UID: \"5c822ed4-6611-4cef-8002-972e9782d403\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.264580 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d1be24c-eb6f-4df4-812d-491ea940ee60-serving-cert\") pod \"openshift-config-operator-7777fb866f-l2jhn\" (UID: \"1d1be24c-eb6f-4df4-812d-491ea940ee60\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l2jhn" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.265042 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qhjb2"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.265066 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a468175-a610-4d65-8ca9-a22f91d8d3fc-metrics-tls\") pod \"dns-operator-744455d44c-qhjb2\" (UID: 
\"1a468175-a610-4d65-8ca9-a22f91d8d3fc\") " pod="openshift-dns-operator/dns-operator-744455d44c-qhjb2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.265484 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/22d232b9-7867-4587-9c0b-d6adba1cd8bd-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-6hg9j\" (UID: \"22d232b9-7867-4587-9c0b-d6adba1cd8bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6hg9j" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.266028 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/eabde416-404d-4874-b690-53897068b5cd-encryption-config\") pod \"apiserver-7bbb656c7d-9bsnc\" (UID: \"eabde416-404d-4874-b690-53897068b5cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.266541 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.268326 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8nsht"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.268692 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.270480 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jbhj2"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.272077 4975 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q2bgt"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.273340 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dn59g"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.274129 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.274651 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-b4zf9"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.275908 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xth7t"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.277317 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-nlw8k"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.279243 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ln9d6"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.279402 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nlw8k" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.279674 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-lzvfs"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.280766 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lzvfs" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.281294 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563932-9m82r"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.282443 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-clfc4"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.283619 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ws7cz"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.284779 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lxtks"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.285950 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wrn8m"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.288010 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r26rl"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.291394 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7cqtt"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.293593 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.294453 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-v7w9z"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.297061 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563920-p22bn"] Mar 18 12:13:48 
crc kubenswrapper[4975]: I0318 12:13:48.299167 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nlw8k"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.312517 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n2zjt"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.314522 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lzvfs"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.314620 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.316751 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-cxbbs"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.317119 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b3ffcf02-5ead-4e06-b402-9b48c21f2d36-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9cnqk\" (UID: \"b3ffcf02-5ead-4e06-b402-9b48c21f2d36\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9cnqk" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.317160 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03aa4bc1-7712-418f-b56d-9686e85ba1d2-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-clfc4\" (UID: \"03aa4bc1-7712-418f-b56d-9686e85ba1d2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-clfc4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.317191 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/907a3641-9861-4891-a145-a0d36cb413b3-serving-cert\") pod \"service-ca-operator-777779d784-ln9d6\" (UID: \"907a3641-9861-4891-a145-a0d36cb413b3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ln9d6" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.317225 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjdp7\" (UniqueName: \"kubernetes.io/projected/b3ffcf02-5ead-4e06-b402-9b48c21f2d36-kube-api-access-rjdp7\") pod \"machine-config-controller-84d6567774-9cnqk\" (UID: \"b3ffcf02-5ead-4e06-b402-9b48c21f2d36\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9cnqk" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.317265 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v5rz\" (UniqueName: \"kubernetes.io/projected/907a3641-9861-4891-a145-a0d36cb413b3-kube-api-access-4v5rz\") pod \"service-ca-operator-777779d784-ln9d6\" (UID: \"907a3641-9861-4891-a145-a0d36cb413b3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ln9d6" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.317286 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03aa4bc1-7712-418f-b56d-9686e85ba1d2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-clfc4\" (UID: \"03aa4bc1-7712-418f-b56d-9686e85ba1d2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-clfc4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.317310 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/907a3641-9861-4891-a145-a0d36cb413b3-config\") pod \"service-ca-operator-777779d784-ln9d6\" (UID: \"907a3641-9861-4891-a145-a0d36cb413b3\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-ln9d6" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.317340 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wrn8m\" (UID: \"bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wrn8m" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.317391 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3ffcf02-5ead-4e06-b402-9b48c21f2d36-proxy-tls\") pod \"machine-config-controller-84d6567774-9cnqk\" (UID: \"b3ffcf02-5ead-4e06-b402-9b48c21f2d36\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9cnqk" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.317440 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5mww\" (UniqueName: \"kubernetes.io/projected/bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc-kube-api-access-b5mww\") pod \"cluster-image-registry-operator-dc59b4c8b-wrn8m\" (UID: \"bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wrn8m" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.317526 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wrn8m\" (UID: \"bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wrn8m" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.317558 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wrn8m\" (UID: \"bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wrn8m" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.317608 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03aa4bc1-7712-418f-b56d-9686e85ba1d2-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-clfc4\" (UID: \"03aa4bc1-7712-418f-b56d-9686e85ba1d2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-clfc4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.318240 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b3ffcf02-5ead-4e06-b402-9b48c21f2d36-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9cnqk\" (UID: \"b3ffcf02-5ead-4e06-b402-9b48c21f2d36\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9cnqk" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.318671 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wrn8m\" (UID: \"bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wrn8m" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.320187 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-ww7rx"] Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.321129 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-ww7rx" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.333575 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.353698 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.373675 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.393482 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.413714 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.433305 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.453366 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.461924 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03aa4bc1-7712-418f-b56d-9686e85ba1d2-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-clfc4\" (UID: \"03aa4bc1-7712-418f-b56d-9686e85ba1d2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-clfc4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.474335 4975 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.493641 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.497920 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03aa4bc1-7712-418f-b56d-9686e85ba1d2-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-clfc4\" (UID: \"03aa4bc1-7712-418f-b56d-9686e85ba1d2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-clfc4" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.513978 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.536309 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.553435 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.574233 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.593830 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.613717 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.634360 4975 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.653129 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.698643 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.712097 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3ffcf02-5ead-4e06-b402-9b48c21f2d36-proxy-tls\") pod \"machine-config-controller-84d6567774-9cnqk\" (UID: \"b3ffcf02-5ead-4e06-b402-9b48c21f2d36\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9cnqk" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.713783 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.733494 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.753476 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.774024 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.793643 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.813700 4975 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.833148 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.854129 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.874970 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.893697 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.913440 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.933442 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.958691 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.973175 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 12:13:48 crc kubenswrapper[4975]: I0318 12:13:48.993482 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.013556 4975 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.034230 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.054221 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.073792 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.093915 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.113664 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.139742 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.153899 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.174725 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.191733 4975 request.go:700] Waited for 1.006939803s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca-operator/secrets?fieldSelector=metadata.name%3Dserving-cert&limit=500&resourceVersion=0 Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 
12:13:49.193456 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.202224 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/907a3641-9861-4891-a145-a0d36cb413b3-serving-cert\") pod \"service-ca-operator-777779d784-ln9d6\" (UID: \"907a3641-9861-4891-a145-a0d36cb413b3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ln9d6" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.214621 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.234044 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.253956 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.258618 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/907a3641-9861-4891-a145-a0d36cb413b3-config\") pod \"service-ca-operator-777779d784-ln9d6\" (UID: \"907a3641-9861-4891-a145-a0d36cb413b3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ln9d6" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.274412 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.293924 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.314242 4975 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 18 12:13:49 crc kubenswrapper[4975]: E0318 12:13:49.318528 4975 secret.go:188] Couldn't get secret openshift-image-registry/image-registry-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 18 12:13:49 crc kubenswrapper[4975]: E0318 12:13:49.318620 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc-image-registry-operator-tls podName:bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc nodeName:}" failed. No retries permitted until 2026-03-18 12:13:49.818598353 +0000 UTC m=+215.532998932 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc-image-registry-operator-tls") pod "cluster-image-registry-operator-dc59b4c8b-wrn8m" (UID: "bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc") : failed to sync secret cache: timed out waiting for the condition Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.333648 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.353985 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.374021 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.394060 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.413488 4975 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.433531 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.454465 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.474896 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.514499 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.534028 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.555116 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.573438 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.594384 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.614810 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.634275 4975 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.653746 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.674209 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.695268 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.714052 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.734315 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.767804 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnp6z\" (UniqueName: \"kubernetes.io/projected/1d1be24c-eb6f-4df4-812d-491ea940ee60-kube-api-access-gnp6z\") pod \"openshift-config-operator-7777fb866f-l2jhn\" (UID: \"1d1be24c-eb6f-4df4-812d-491ea940ee60\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-l2jhn" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.788502 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs9ss\" (UniqueName: \"kubernetes.io/projected/311fa18b-fde1-4390-9682-75c836813f88-kube-api-access-cs9ss\") pod \"console-f9d7485db-z69nv\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.807225 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-grm7t\" (UniqueName: \"kubernetes.io/projected/5c822ed4-6611-4cef-8002-972e9782d403-kube-api-access-grm7t\") pod \"route-controller-manager-6576b87f9c-j56l5\" (UID: \"5c822ed4-6611-4cef-8002-972e9782d403\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.821701 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.831140 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlg74\" (UniqueName: \"kubernetes.io/projected/ac556a39-897a-44fe-b537-bdaf85c3f437-kube-api-access-nlg74\") pod \"controller-manager-879f6c89f-ftkrg\" (UID: \"ac556a39-897a-44fe-b537-bdaf85c3f437\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.835480 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wrn8m\" (UID: \"bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wrn8m" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.840114 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wrn8m\" (UID: \"bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wrn8m" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.842983 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.843415 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l2jhn" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.848314 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkd84\" (UniqueName: \"kubernetes.io/projected/22d232b9-7867-4587-9c0b-d6adba1cd8bd-kube-api-access-wkd84\") pod \"machine-api-operator-5694c8668f-6hg9j\" (UID: \"22d232b9-7867-4587-9c0b-d6adba1cd8bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6hg9j" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.869424 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zpxt\" (UniqueName: \"kubernetes.io/projected/14b26921-b7c6-4f20-86af-abb1e8eb339e-kube-api-access-8zpxt\") pod \"openshift-apiserver-operator-796bbdcf4f-m2m4r\" (UID: \"14b26921-b7c6-4f20-86af-abb1e8eb339e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m2m4r" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.890330 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nw92\" (UniqueName: \"kubernetes.io/projected/318ad1a7-abe7-4e8d-bf62-cec22711b081-kube-api-access-6nw92\") pod \"machine-approver-56656f9798-xrd7l\" (UID: \"318ad1a7-abe7-4e8d-bf62-cec22711b081\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xrd7l" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.893662 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.910165 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-6hg9j" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.913530 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.934708 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.954358 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.987397 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m2m4r" Mar 18 12:13:49 crc kubenswrapper[4975]: I0318 12:13:49.990733 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbnhx\" (UniqueName: \"kubernetes.io/projected/14d667ef-8c80-42f5-b119-1bae87e39be7-kube-api-access-vbnhx\") pod \"oauth-openshift-558db77b4-mg2g2\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.014355 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsp9l\" (UniqueName: \"kubernetes.io/projected/bbf4b50a-de39-4b86-b0ca-883ba11d6e4b-kube-api-access-jsp9l\") pod \"apiserver-76f77b778f-m27h4\" (UID: \"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b\") " pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.027761 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.029475 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m72zt\" (UniqueName: \"kubernetes.io/projected/eabde416-404d-4874-b690-53897068b5cd-kube-api-access-m72zt\") pod \"apiserver-7bbb656c7d-9bsnc\" (UID: \"eabde416-404d-4874-b690-53897068b5cd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.049926 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-l2jhn"] Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.051143 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgnzb\" (UniqueName: \"kubernetes.io/projected/36e42dcf-4953-46b4-8459-e2e72e03895c-kube-api-access-bgnzb\") pod \"downloads-7954f5f757-d4zht\" (UID: \"36e42dcf-4953-46b4-8459-e2e72e03895c\") " pod="openshift-console/downloads-7954f5f757-d4zht" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.069491 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgcgf\" (UniqueName: \"kubernetes.io/projected/1a468175-a610-4d65-8ca9-a22f91d8d3fc-kube-api-access-kgcgf\") pod \"dns-operator-744455d44c-qhjb2\" (UID: \"1a468175-a610-4d65-8ca9-a22f91d8d3fc\") " pod="openshift-dns-operator/dns-operator-744455d44c-qhjb2" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.074637 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.094788 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.108628 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.110308 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xrd7l" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.110477 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d7pl\" (UniqueName: \"kubernetes.io/projected/865e4345-8c11-4402-b673-93658fe66ced-kube-api-access-2d7pl\") pod \"console-operator-58897d9998-5bf2w\" (UID: \"865e4345-8c11-4402-b673-93658fe66ced\") " pod="openshift-console-operator/console-operator-58897d9998-5bf2w" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.126193 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-d4zht" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.132187 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh2c2\" (UniqueName: \"kubernetes.io/projected/b2b5a18f-c7da-47d4-b3a3-2a3917e63c89-kube-api-access-dh2c2\") pod \"openshift-controller-manager-operator-756b6f6bc6-lt2wh\" (UID: \"b2b5a18f-c7da-47d4-b3a3-2a3917e63c89\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lt2wh" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.134640 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-qhjb2" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.137853 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.153694 4975 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.153749 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-6hg9j"] Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.173382 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 18 12:13:50 crc kubenswrapper[4975]: W0318 12:13:50.177677 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22d232b9_7867_4587_9c0b_d6adba1cd8bd.slice/crio-b60f3646916cf6359eb56edcabcc4544bc07c961f643b3805ee44b959bbed43d WatchSource:0}: Error finding container b60f3646916cf6359eb56edcabcc4544bc07c961f643b3805ee44b959bbed43d: Status 404 returned error can't find the container with id b60f3646916cf6359eb56edcabcc4544bc07c961f643b3805ee44b959bbed43d Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.193238 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.212175 4975 request.go:700] Waited for 1.93228749s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-default-metrics-tls&limit=500&resourceVersion=0 Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.213449 4975 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"dns-default-metrics-tls" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.235551 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.251486 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m2m4r"] Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.254044 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.274751 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 18 12:13:50 crc kubenswrapper[4975]: W0318 12:13:50.292075 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14b26921_b7c6_4f20_86af_abb1e8eb339e.slice/crio-737cd7913eefdd7fafebfd1e8bc4880b98400d99519bfbb95a322aabdab04520 WatchSource:0}: Error finding container 737cd7913eefdd7fafebfd1e8bc4880b98400d99519bfbb95a322aabdab04520: Status 404 returned error can't find the container with id 737cd7913eefdd7fafebfd1e8bc4880b98400d99519bfbb95a322aabdab04520 Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.293633 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.303456 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-z69nv"] Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.303785 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.308468 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5"] Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.308645 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-5bf2w" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.309685 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ftkrg"] Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.313686 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 18 12:13:50 crc kubenswrapper[4975]: W0318 12:13:50.327402 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod311fa18b_fde1_4390_9682_75c836813f88.slice/crio-6e9368fcebd49c4ca14fd16f23260321c87a809486ce39310f1efa5b61c5e119 WatchSource:0}: Error finding container 6e9368fcebd49c4ca14fd16f23260321c87a809486ce39310f1efa5b61c5e119: Status 404 returned error can't find the container with id 6e9368fcebd49c4ca14fd16f23260321c87a809486ce39310f1efa5b61c5e119 Mar 18 12:13:50 crc kubenswrapper[4975]: W0318 12:13:50.329152 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac556a39_897a_44fe_b537_bdaf85c3f437.slice/crio-72110e8a1ee3e19ff58e3dbe5fd242a608f2627369956ac78a56d3d523cd5155 WatchSource:0}: Error finding container 72110e8a1ee3e19ff58e3dbe5fd242a608f2627369956ac78a56d3d523cd5155: Status 404 returned error can't find the container with id 72110e8a1ee3e19ff58e3dbe5fd242a608f2627369956ac78a56d3d523cd5155 Mar 18 12:13:50 crc kubenswrapper[4975]: W0318 
12:13:50.331295 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c822ed4_6611_4cef_8002_972e9782d403.slice/crio-2f915afce75dd29deb2e2fedd8150b04418ab12947377011bb0a7e319947b426 WatchSource:0}: Error finding container 2f915afce75dd29deb2e2fedd8150b04418ab12947377011bb0a7e319947b426: Status 404 returned error can't find the container with id 2f915afce75dd29deb2e2fedd8150b04418ab12947377011bb0a7e319947b426 Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.352209 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjdp7\" (UniqueName: \"kubernetes.io/projected/b3ffcf02-5ead-4e06-b402-9b48c21f2d36-kube-api-access-rjdp7\") pod \"machine-config-controller-84d6567774-9cnqk\" (UID: \"b3ffcf02-5ead-4e06-b402-9b48c21f2d36\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9cnqk" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.371490 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mg2g2"] Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.374329 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v5rz\" (UniqueName: \"kubernetes.io/projected/907a3641-9861-4891-a145-a0d36cb413b3-kube-api-access-4v5rz\") pod \"service-ca-operator-777779d784-ln9d6\" (UID: \"907a3641-9861-4891-a145-a0d36cb413b3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ln9d6" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.389777 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03aa4bc1-7712-418f-b56d-9686e85ba1d2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-clfc4\" (UID: \"03aa4bc1-7712-418f-b56d-9686e85ba1d2\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-clfc4" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.401099 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lt2wh" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.413611 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5mww\" (UniqueName: \"kubernetes.io/projected/bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc-kube-api-access-b5mww\") pod \"cluster-image-registry-operator-dc59b4c8b-wrn8m\" (UID: \"bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wrn8m" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.432368 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wrn8m\" (UID: \"bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wrn8m" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.435609 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.454337 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.475069 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.482636 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-d4zht"] Mar 18 12:13:50 crc 
kubenswrapper[4975]: I0318 12:13:50.497427 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-clfc4" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.521026 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9cnqk" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.545523 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pll29\" (UniqueName: \"kubernetes.io/projected/f4e1534c-f64b-448a-9b81-7c4192c089f3-kube-api-access-pll29\") pod \"router-default-5444994796-78ldn\" (UID: \"f4e1534c-f64b-448a-9b81-7c4192c089f3\") " pod="openshift-ingress/router-default-5444994796-78ldn" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.545563 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b28d367-94a7-4f45-ba2f-86110b4e6a2e-config\") pod \"etcd-operator-b45778765-dn59g\" (UID: \"5b28d367-94a7-4f45-ba2f-86110b4e6a2e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dn59g" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.545587 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s9cf\" (UniqueName: \"kubernetes.io/projected/84b8a568-0fdf-42f7-ba14-f917320d7505-kube-api-access-9s9cf\") pod \"cluster-samples-operator-665b6dd947-jbhj2\" (UID: \"84b8a568-0fdf-42f7-ba14-f917320d7505\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jbhj2" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.545612 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/3e64e7c1-1812-46e6-bb95-f4d54d0d98f2-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7cqtt\" (UID: \"3e64e7c1-1812-46e6-bb95-f4d54d0d98f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7cqtt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.545638 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqvgv\" (UniqueName: \"kubernetes.io/projected/3e64e7c1-1812-46e6-bb95-f4d54d0d98f2-kube-api-access-lqvgv\") pod \"authentication-operator-69f744f599-7cqtt\" (UID: \"3e64e7c1-1812-46e6-bb95-f4d54d0d98f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7cqtt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.545735 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/191ec755-6e3c-4fba-8b70-b81d3e414b17-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ws7cz\" (UID: \"191ec755-6e3c-4fba-8b70-b81d3e414b17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ws7cz" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.545760 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44ckr\" (UniqueName: \"kubernetes.io/projected/63db2c71-f81f-4ae8-90b0-0d45e0593119-kube-api-access-44ckr\") pod \"migrator-59844c95c7-dtv2h\" (UID: \"63db2c71-f81f-4ae8-90b0-0d45e0593119\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dtv2h" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.545786 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c406d38a-d0b9-4f55-9883-78805191b8b3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-k27ll\" (UID: 
\"c406d38a-d0b9-4f55-9883-78805191b8b3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k27ll" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.546053 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhhvq\" (UniqueName: \"kubernetes.io/projected/7c19d2ab-a061-40f1-99dd-d200cc62ba4d-kube-api-access-qhhvq\") pod \"kube-storage-version-migrator-operator-b67b599dd-n2zjt\" (UID: \"7c19d2ab-a061-40f1-99dd-d200cc62ba4d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n2zjt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.546474 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c19d2ab-a061-40f1-99dd-d200cc62ba4d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-n2zjt\" (UID: \"7c19d2ab-a061-40f1-99dd-d200cc62ba4d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n2zjt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.546597 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/470110ba-97b8-4d8f-a8da-0df16cd7abed-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.546635 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/470110ba-97b8-4d8f-a8da-0df16cd7abed-registry-certificates\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.546660 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f4e1534c-f64b-448a-9b81-7c4192c089f3-default-certificate\") pod \"router-default-5444994796-78ldn\" (UID: \"f4e1534c-f64b-448a-9b81-7c4192c089f3\") " pod="openshift-ingress/router-default-5444994796-78ldn" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.547997 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/470110ba-97b8-4d8f-a8da-0df16cd7abed-trusted-ca\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.548038 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/191ec755-6e3c-4fba-8b70-b81d3e414b17-srv-cert\") pod \"olm-operator-6b444d44fb-ws7cz\" (UID: \"191ec755-6e3c-4fba-8b70-b81d3e414b17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ws7cz" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.548210 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b28d367-94a7-4f45-ba2f-86110b4e6a2e-serving-cert\") pod \"etcd-operator-b45778765-dn59g\" (UID: \"5b28d367-94a7-4f45-ba2f-86110b4e6a2e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dn59g" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.548259 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/25214ad5-3dea-44fe-8dfc-75b877582f7e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-knn48\" (UID: \"25214ad5-3dea-44fe-8dfc-75b877582f7e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-knn48" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.548320 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5b28d367-94a7-4f45-ba2f-86110b4e6a2e-etcd-ca\") pod \"etcd-operator-b45778765-dn59g\" (UID: \"5b28d367-94a7-4f45-ba2f-86110b4e6a2e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dn59g" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.548348 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/84b8a568-0fdf-42f7-ba14-f917320d7505-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jbhj2\" (UID: \"84b8a568-0fdf-42f7-ba14-f917320d7505\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jbhj2" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.548446 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/470110ba-97b8-4d8f-a8da-0df16cd7abed-registry-tls\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.548479 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pj2z\" (UniqueName: \"kubernetes.io/projected/c35d61c3-defe-489a-96f3-c649240e6f9f-kube-api-access-7pj2z\") pod \"machine-config-operator-74547568cd-b4zf9\" (UID: \"c35d61c3-defe-489a-96f3-c649240e6f9f\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4zf9" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.548529 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/25214ad5-3dea-44fe-8dfc-75b877582f7e-trusted-ca\") pod \"ingress-operator-5b745b69d9-knn48\" (UID: \"25214ad5-3dea-44fe-8dfc-75b877582f7e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-knn48" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.548571 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4e1534c-f64b-448a-9b81-7c4192c089f3-metrics-certs\") pod \"router-default-5444994796-78ldn\" (UID: \"f4e1534c-f64b-448a-9b81-7c4192c089f3\") " pod="openshift-ingress/router-default-5444994796-78ldn" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.548607 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/470110ba-97b8-4d8f-a8da-0df16cd7abed-bound-sa-token\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.548629 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt5ng\" (UniqueName: \"kubernetes.io/projected/470110ba-97b8-4d8f-a8da-0df16cd7abed-kube-api-access-vt5ng\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.548651 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a7793c57-4ff7-48a0-8913-46537e9ec353-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6p52g\" (UID: \"a7793c57-4ff7-48a0-8913-46537e9ec353\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6p52g" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.548676 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d989095d-7ce2-4dd7-ac9e-5c747e900a61-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q2bgt\" (UID: \"d989095d-7ce2-4dd7-ac9e-5c747e900a61\") " pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.548910 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c35d61c3-defe-489a-96f3-c649240e6f9f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-b4zf9\" (UID: \"c35d61c3-defe-489a-96f3-c649240e6f9f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4zf9" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.549006 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25214ad5-3dea-44fe-8dfc-75b877582f7e-metrics-tls\") pod \"ingress-operator-5b745b69d9-knn48\" (UID: \"25214ad5-3dea-44fe-8dfc-75b877582f7e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-knn48" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.549068 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb6wr\" (UniqueName: \"kubernetes.io/projected/25214ad5-3dea-44fe-8dfc-75b877582f7e-kube-api-access-xb6wr\") pod \"ingress-operator-5b745b69d9-knn48\" (UID: 
\"25214ad5-3dea-44fe-8dfc-75b877582f7e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-knn48" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.549100 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s8mv\" (UniqueName: \"kubernetes.io/projected/191ec755-6e3c-4fba-8b70-b81d3e414b17-kube-api-access-6s8mv\") pod \"olm-operator-6b444d44fb-ws7cz\" (UID: \"191ec755-6e3c-4fba-8b70-b81d3e414b17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ws7cz" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.549488 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c19d2ab-a061-40f1-99dd-d200cc62ba4d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-n2zjt\" (UID: \"7c19d2ab-a061-40f1-99dd-d200cc62ba4d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n2zjt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.549516 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e64e7c1-1812-46e6-bb95-f4d54d0d98f2-service-ca-bundle\") pod \"authentication-operator-69f744f599-7cqtt\" (UID: \"3e64e7c1-1812-46e6-bb95-f4d54d0d98f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7cqtt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.549558 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/470110ba-97b8-4d8f-a8da-0df16cd7abed-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:50 crc 
kubenswrapper[4975]: I0318 12:13:50.549577 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7793c57-4ff7-48a0-8913-46537e9ec353-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-6p52g\" (UID: \"a7793c57-4ff7-48a0-8913-46537e9ec353\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6p52g" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.549627 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a54deeb5-bea0-4f51-aa4e-07df30bbf228-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bkb4h\" (UID: \"a54deeb5-bea0-4f51-aa4e-07df30bbf228\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkb4h" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.549707 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.549747 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c406d38a-d0b9-4f55-9883-78805191b8b3-config\") pod \"kube-apiserver-operator-766d6c64bb-k27ll\" (UID: \"c406d38a-d0b9-4f55-9883-78805191b8b3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k27ll" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.549771 4975 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e64e7c1-1812-46e6-bb95-f4d54d0d98f2-config\") pod \"authentication-operator-69f744f599-7cqtt\" (UID: \"3e64e7c1-1812-46e6-bb95-f4d54d0d98f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7cqtt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.549887 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c406d38a-d0b9-4f55-9883-78805191b8b3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-k27ll\" (UID: \"c406d38a-d0b9-4f55-9883-78805191b8b3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k27ll" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.549913 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7793c57-4ff7-48a0-8913-46537e9ec353-config\") pod \"kube-controller-manager-operator-78b949d7b-6p52g\" (UID: \"a7793c57-4ff7-48a0-8913-46537e9ec353\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6p52g" Mar 18 12:13:50 crc kubenswrapper[4975]: E0318 12:13:50.550150 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:51.050134404 +0000 UTC m=+216.764534983 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.550339 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c35d61c3-defe-489a-96f3-c649240e6f9f-images\") pod \"machine-config-operator-74547568cd-b4zf9\" (UID: \"c35d61c3-defe-489a-96f3-c649240e6f9f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4zf9" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.550450 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f4e1534c-f64b-448a-9b81-7c4192c089f3-stats-auth\") pod \"router-default-5444994796-78ldn\" (UID: \"f4e1534c-f64b-448a-9b81-7c4192c089f3\") " pod="openshift-ingress/router-default-5444994796-78ldn" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.550521 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e64e7c1-1812-46e6-bb95-f4d54d0d98f2-serving-cert\") pod \"authentication-operator-69f744f599-7cqtt\" (UID: \"3e64e7c1-1812-46e6-bb95-f4d54d0d98f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7cqtt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.550853 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7hq4\" (UniqueName: 
\"kubernetes.io/projected/a54deeb5-bea0-4f51-aa4e-07df30bbf228-kube-api-access-p7hq4\") pod \"control-plane-machine-set-operator-78cbb6b69f-bkb4h\" (UID: \"a54deeb5-bea0-4f51-aa4e-07df30bbf228\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkb4h" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.550914 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skl4k\" (UniqueName: \"kubernetes.io/projected/5b28d367-94a7-4f45-ba2f-86110b4e6a2e-kube-api-access-skl4k\") pod \"etcd-operator-b45778765-dn59g\" (UID: \"5b28d367-94a7-4f45-ba2f-86110b4e6a2e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dn59g" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.550952 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffcng\" (UniqueName: \"kubernetes.io/projected/d989095d-7ce2-4dd7-ac9e-5c747e900a61-kube-api-access-ffcng\") pod \"marketplace-operator-79b997595-q2bgt\" (UID: \"d989095d-7ce2-4dd7-ac9e-5c747e900a61\") " pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.551336 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d989095d-7ce2-4dd7-ac9e-5c747e900a61-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q2bgt\" (UID: \"d989095d-7ce2-4dd7-ac9e-5c747e900a61\") " pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.551413 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5b28d367-94a7-4f45-ba2f-86110b4e6a2e-etcd-client\") pod \"etcd-operator-b45778765-dn59g\" (UID: \"5b28d367-94a7-4f45-ba2f-86110b4e6a2e\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-dn59g" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.551440 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5b28d367-94a7-4f45-ba2f-86110b4e6a2e-etcd-service-ca\") pod \"etcd-operator-b45778765-dn59g\" (UID: \"5b28d367-94a7-4f45-ba2f-86110b4e6a2e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dn59g" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.551460 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9hzv\" (UniqueName: \"kubernetes.io/projected/bb5204e4-c110-485b-8627-807fdb7f4c27-kube-api-access-r9hzv\") pod \"auto-csr-approver-29563932-9m82r\" (UID: \"bb5204e4-c110-485b-8627-807fdb7f4c27\") " pod="openshift-infra/auto-csr-approver-29563932-9m82r" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.551482 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c35d61c3-defe-489a-96f3-c649240e6f9f-proxy-tls\") pod \"machine-config-operator-74547568cd-b4zf9\" (UID: \"c35d61c3-defe-489a-96f3-c649240e6f9f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4zf9" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.551521 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4e1534c-f64b-448a-9b81-7c4192c089f3-service-ca-bundle\") pod \"router-default-5444994796-78ldn\" (UID: \"f4e1534c-f64b-448a-9b81-7c4192c089f3\") " pod="openshift-ingress/router-default-5444994796-78ldn" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.567973 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qhjb2"] Mar 18 12:13:50 crc 
kubenswrapper[4975]: I0318 12:13:50.591527 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc"] Mar 18 12:13:50 crc kubenswrapper[4975]: W0318 12:13:50.621140 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a468175_a610_4d65_8ca9_a22f91d8d3fc.slice/crio-f8de183795588b63678ccb086c0d92d8f5d9a635c6fc386528e2659ebc9d1b34 WatchSource:0}: Error finding container f8de183795588b63678ccb086c0d92d8f5d9a635c6fc386528e2659ebc9d1b34: Status 404 returned error can't find the container with id f8de183795588b63678ccb086c0d92d8f5d9a635c6fc386528e2659ebc9d1b34 Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.643183 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-m27h4"] Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.643207 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ln9d6" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.654190 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wrn8m" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.654391 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.654517 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/470110ba-97b8-4d8f-a8da-0df16cd7abed-bound-sa-token\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.654539 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt5ng\" (UniqueName: \"kubernetes.io/projected/470110ba-97b8-4d8f-a8da-0df16cd7abed-kube-api-access-vt5ng\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:50 crc kubenswrapper[4975]: E0318 12:13:50.654562 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:51.154540313 +0000 UTC m=+216.868940892 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.654596 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7793c57-4ff7-48a0-8913-46537e9ec353-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6p52g\" (UID: \"a7793c57-4ff7-48a0-8913-46537e9ec353\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6p52g" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.654651 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d989095d-7ce2-4dd7-ac9e-5c747e900a61-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q2bgt\" (UID: \"d989095d-7ce2-4dd7-ac9e-5c747e900a61\") " pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.654769 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6fea1263-b938-4c34-a279-6e5391b768bf-srv-cert\") pod \"catalog-operator-68c6474976-lxtks\" (UID: \"6fea1263-b938-4c34-a279-6e5391b768bf\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lxtks" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.654847 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/c35d61c3-defe-489a-96f3-c649240e6f9f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-b4zf9\" (UID: \"c35d61c3-defe-489a-96f3-c649240e6f9f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4zf9" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.654930 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f59058bc-4678-4c59-b93b-d9af75ff6a7a-apiservice-cert\") pod \"packageserver-d55dfcdfc-r26rl\" (UID: \"f59058bc-4678-4c59-b93b-d9af75ff6a7a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r26rl" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.654956 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bb035988-397f-4da3-bc49-d38089014453-registration-dir\") pod \"csi-hostpathplugin-v7w9z\" (UID: \"bb035988-397f-4da3-bc49-d38089014453\") " pod="hostpath-provisioner/csi-hostpathplugin-v7w9z" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655013 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25214ad5-3dea-44fe-8dfc-75b877582f7e-metrics-tls\") pod \"ingress-operator-5b745b69d9-knn48\" (UID: \"25214ad5-3dea-44fe-8dfc-75b877582f7e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-knn48" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655034 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb6wr\" (UniqueName: \"kubernetes.io/projected/25214ad5-3dea-44fe-8dfc-75b877582f7e-kube-api-access-xb6wr\") pod \"ingress-operator-5b745b69d9-knn48\" (UID: \"25214ad5-3dea-44fe-8dfc-75b877582f7e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-knn48" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 
12:13:50.655077 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s8mv\" (UniqueName: \"kubernetes.io/projected/191ec755-6e3c-4fba-8b70-b81d3e414b17-kube-api-access-6s8mv\") pod \"olm-operator-6b444d44fb-ws7cz\" (UID: \"191ec755-6e3c-4fba-8b70-b81d3e414b17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ws7cz" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655109 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f59058bc-4678-4c59-b93b-d9af75ff6a7a-tmpfs\") pod \"packageserver-d55dfcdfc-r26rl\" (UID: \"f59058bc-4678-4c59-b93b-d9af75ff6a7a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r26rl" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655151 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmfts\" (UniqueName: \"kubernetes.io/projected/f59058bc-4678-4c59-b93b-d9af75ff6a7a-kube-api-access-vmfts\") pod \"packageserver-d55dfcdfc-r26rl\" (UID: \"f59058bc-4678-4c59-b93b-d9af75ff6a7a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r26rl" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655183 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsblf\" (UniqueName: \"kubernetes.io/projected/4d06a6f7-77a4-437d-8e9e-f1e9b7252a42-kube-api-access-vsblf\") pod \"ingress-canary-lzvfs\" (UID: \"4d06a6f7-77a4-437d-8e9e-f1e9b7252a42\") " pod="openshift-ingress-canary/ingress-canary-lzvfs" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655205 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c19d2ab-a061-40f1-99dd-d200cc62ba4d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-n2zjt\" (UID: 
\"7c19d2ab-a061-40f1-99dd-d200cc62ba4d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n2zjt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655243 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e64e7c1-1812-46e6-bb95-f4d54d0d98f2-service-ca-bundle\") pod \"authentication-operator-69f744f599-7cqtt\" (UID: \"3e64e7c1-1812-46e6-bb95-f4d54d0d98f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7cqtt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655265 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6fz9\" (UniqueName: \"kubernetes.io/projected/6fea1263-b938-4c34-a279-6e5391b768bf-kube-api-access-r6fz9\") pod \"catalog-operator-68c6474976-lxtks\" (UID: \"6fea1263-b938-4c34-a279-6e5391b768bf\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lxtks" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655290 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/470110ba-97b8-4d8f-a8da-0df16cd7abed-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655314 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-627kp\" (UniqueName: \"kubernetes.io/projected/bb035988-397f-4da3-bc49-d38089014453-kube-api-access-627kp\") pod \"csi-hostpathplugin-v7w9z\" (UID: \"bb035988-397f-4da3-bc49-d38089014453\") " pod="hostpath-provisioner/csi-hostpathplugin-v7w9z" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655338 4975 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/363092a3-1f52-4b92-8369-8aecab622c7e-signing-key\") pod \"service-ca-9c57cc56f-xth7t\" (UID: \"363092a3-1f52-4b92-8369-8aecab622c7e\") " pod="openshift-service-ca/service-ca-9c57cc56f-xth7t" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655391 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7793c57-4ff7-48a0-8913-46537e9ec353-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-6p52g\" (UID: \"a7793c57-4ff7-48a0-8913-46537e9ec353\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6p52g" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655420 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a54deeb5-bea0-4f51-aa4e-07df30bbf228-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bkb4h\" (UID: \"a54deeb5-bea0-4f51-aa4e-07df30bbf228\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkb4h" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655444 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5ac220a8-d5aa-4352-b2ce-e2d911aee948-node-bootstrap-token\") pod \"machine-config-server-ww7rx\" (UID: \"5ac220a8-d5aa-4352-b2ce-e2d911aee948\") " pod="openshift-machine-config-operator/machine-config-server-ww7rx" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655472 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655497 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c406d38a-d0b9-4f55-9883-78805191b8b3-config\") pod \"kube-apiserver-operator-766d6c64bb-k27ll\" (UID: \"c406d38a-d0b9-4f55-9883-78805191b8b3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k27ll" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655518 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e64e7c1-1812-46e6-bb95-f4d54d0d98f2-config\") pod \"authentication-operator-69f744f599-7cqtt\" (UID: \"3e64e7c1-1812-46e6-bb95-f4d54d0d98f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7cqtt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655557 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c406d38a-d0b9-4f55-9883-78805191b8b3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-k27ll\" (UID: \"c406d38a-d0b9-4f55-9883-78805191b8b3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k27ll" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655578 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7793c57-4ff7-48a0-8913-46537e9ec353-config\") pod \"kube-controller-manager-operator-78b949d7b-6p52g\" (UID: \"a7793c57-4ff7-48a0-8913-46537e9ec353\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6p52g" Mar 18 12:13:50 crc 
kubenswrapper[4975]: I0318 12:13:50.655721 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f59058bc-4678-4c59-b93b-d9af75ff6a7a-webhook-cert\") pod \"packageserver-d55dfcdfc-r26rl\" (UID: \"f59058bc-4678-4c59-b93b-d9af75ff6a7a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r26rl" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655751 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c35d61c3-defe-489a-96f3-c649240e6f9f-images\") pod \"machine-config-operator-74547568cd-b4zf9\" (UID: \"c35d61c3-defe-489a-96f3-c649240e6f9f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4zf9" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655771 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f4e1534c-f64b-448a-9b81-7c4192c089f3-stats-auth\") pod \"router-default-5444994796-78ldn\" (UID: \"f4e1534c-f64b-448a-9b81-7c4192c089f3\") " pod="openshift-ingress/router-default-5444994796-78ldn" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655792 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bb035988-397f-4da3-bc49-d38089014453-socket-dir\") pod \"csi-hostpathplugin-v7w9z\" (UID: \"bb035988-397f-4da3-bc49-d38089014453\") " pod="hostpath-provisioner/csi-hostpathplugin-v7w9z" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655826 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e64e7c1-1812-46e6-bb95-f4d54d0d98f2-serving-cert\") pod \"authentication-operator-69f744f599-7cqtt\" (UID: \"3e64e7c1-1812-46e6-bb95-f4d54d0d98f2\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-7cqtt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655892 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7hq4\" (UniqueName: \"kubernetes.io/projected/a54deeb5-bea0-4f51-aa4e-07df30bbf228-kube-api-access-p7hq4\") pod \"control-plane-machine-set-operator-78cbb6b69f-bkb4h\" (UID: \"a54deeb5-bea0-4f51-aa4e-07df30bbf228\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkb4h" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655915 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skl4k\" (UniqueName: \"kubernetes.io/projected/5b28d367-94a7-4f45-ba2f-86110b4e6a2e-kube-api-access-skl4k\") pod \"etcd-operator-b45778765-dn59g\" (UID: \"5b28d367-94a7-4f45-ba2f-86110b4e6a2e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dn59g" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655939 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffcng\" (UniqueName: \"kubernetes.io/projected/d989095d-7ce2-4dd7-ac9e-5c747e900a61-kube-api-access-ffcng\") pod \"marketplace-operator-79b997595-q2bgt\" (UID: \"d989095d-7ce2-4dd7-ac9e-5c747e900a61\") " pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.655987 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d989095d-7ce2-4dd7-ac9e-5c747e900a61-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q2bgt\" (UID: \"d989095d-7ce2-4dd7-ac9e-5c747e900a61\") " pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.656013 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/79bb26ca-fc97-4b59-ab56-12637c684208-secret-volume\") pod \"collect-profiles-29563920-p22bn\" (UID: \"79bb26ca-fc97-4b59-ab56-12637c684208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-p22bn" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.656104 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5b28d367-94a7-4f45-ba2f-86110b4e6a2e-etcd-client\") pod \"etcd-operator-b45778765-dn59g\" (UID: \"5b28d367-94a7-4f45-ba2f-86110b4e6a2e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dn59g" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.656136 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5b28d367-94a7-4f45-ba2f-86110b4e6a2e-etcd-service-ca\") pod \"etcd-operator-b45778765-dn59g\" (UID: \"5b28d367-94a7-4f45-ba2f-86110b4e6a2e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dn59g" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.656159 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9hzv\" (UniqueName: \"kubernetes.io/projected/bb5204e4-c110-485b-8627-807fdb7f4c27-kube-api-access-r9hzv\") pod \"auto-csr-approver-29563932-9m82r\" (UID: \"bb5204e4-c110-485b-8627-807fdb7f4c27\") " pod="openshift-infra/auto-csr-approver-29563932-9m82r" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.656182 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c35d61c3-defe-489a-96f3-c649240e6f9f-proxy-tls\") pod \"machine-config-operator-74547568cd-b4zf9\" (UID: \"c35d61c3-defe-489a-96f3-c649240e6f9f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4zf9" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.656219 4975 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4e1534c-f64b-448a-9b81-7c4192c089f3-service-ca-bundle\") pod \"router-default-5444994796-78ldn\" (UID: \"f4e1534c-f64b-448a-9b81-7c4192c089f3\") " pod="openshift-ingress/router-default-5444994796-78ldn" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.656250 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/363092a3-1f52-4b92-8369-8aecab622c7e-signing-cabundle\") pod \"service-ca-9c57cc56f-xth7t\" (UID: \"363092a3-1f52-4b92-8369-8aecab622c7e\") " pod="openshift-service-ca/service-ca-9c57cc56f-xth7t" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.656291 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pll29\" (UniqueName: \"kubernetes.io/projected/f4e1534c-f64b-448a-9b81-7c4192c089f3-kube-api-access-pll29\") pod \"router-default-5444994796-78ldn\" (UID: \"f4e1534c-f64b-448a-9b81-7c4192c089f3\") " pod="openshift-ingress/router-default-5444994796-78ldn" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.656331 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b28d367-94a7-4f45-ba2f-86110b4e6a2e-config\") pod \"etcd-operator-b45778765-dn59g\" (UID: \"5b28d367-94a7-4f45-ba2f-86110b4e6a2e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dn59g" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.656354 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s9cf\" (UniqueName: \"kubernetes.io/projected/84b8a568-0fdf-42f7-ba14-f917320d7505-kube-api-access-9s9cf\") pod \"cluster-samples-operator-665b6dd947-jbhj2\" (UID: \"84b8a568-0fdf-42f7-ba14-f917320d7505\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jbhj2" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.656380 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d06a6f7-77a4-437d-8e9e-f1e9b7252a42-cert\") pod \"ingress-canary-lzvfs\" (UID: \"4d06a6f7-77a4-437d-8e9e-f1e9b7252a42\") " pod="openshift-ingress-canary/ingress-canary-lzvfs" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.656404 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e64e7c1-1812-46e6-bb95-f4d54d0d98f2-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7cqtt\" (UID: \"3e64e7c1-1812-46e6-bb95-f4d54d0d98f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7cqtt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.656429 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqvgv\" (UniqueName: \"kubernetes.io/projected/3e64e7c1-1812-46e6-bb95-f4d54d0d98f2-kube-api-access-lqvgv\") pod \"authentication-operator-69f744f599-7cqtt\" (UID: \"3e64e7c1-1812-46e6-bb95-f4d54d0d98f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7cqtt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.656453 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa1ca458-2bd9-4722-9895-08f1744e3cfd-config-volume\") pod \"dns-default-nlw8k\" (UID: \"fa1ca458-2bd9-4722-9895-08f1744e3cfd\") " pod="openshift-dns/dns-default-nlw8k" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.656478 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/fa1ca458-2bd9-4722-9895-08f1744e3cfd-metrics-tls\") pod \"dns-default-nlw8k\" (UID: \"fa1ca458-2bd9-4722-9895-08f1744e3cfd\") " pod="openshift-dns/dns-default-nlw8k" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.656502 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5ac220a8-d5aa-4352-b2ce-e2d911aee948-certs\") pod \"machine-config-server-ww7rx\" (UID: \"5ac220a8-d5aa-4352-b2ce-e2d911aee948\") " pod="openshift-machine-config-operator/machine-config-server-ww7rx" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.656540 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/191ec755-6e3c-4fba-8b70-b81d3e414b17-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ws7cz\" (UID: \"191ec755-6e3c-4fba-8b70-b81d3e414b17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ws7cz" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.656564 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bb035988-397f-4da3-bc49-d38089014453-mountpoint-dir\") pod \"csi-hostpathplugin-v7w9z\" (UID: \"bb035988-397f-4da3-bc49-d38089014453\") " pod="hostpath-provisioner/csi-hostpathplugin-v7w9z" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.656615 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c406d38a-d0b9-4f55-9883-78805191b8b3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-k27ll\" (UID: \"c406d38a-d0b9-4f55-9883-78805191b8b3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k27ll" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.656643 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-44ckr\" (UniqueName: \"kubernetes.io/projected/63db2c71-f81f-4ae8-90b0-0d45e0593119-kube-api-access-44ckr\") pod \"migrator-59844c95c7-dtv2h\" (UID: \"63db2c71-f81f-4ae8-90b0-0d45e0593119\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dtv2h" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.656668 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gttns\" (UniqueName: \"kubernetes.io/projected/fa1ca458-2bd9-4722-9895-08f1744e3cfd-kube-api-access-gttns\") pod \"dns-default-nlw8k\" (UID: \"fa1ca458-2bd9-4722-9895-08f1744e3cfd\") " pod="openshift-dns/dns-default-nlw8k" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.656854 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhhvq\" (UniqueName: \"kubernetes.io/projected/7c19d2ab-a061-40f1-99dd-d200cc62ba4d-kube-api-access-qhhvq\") pod \"kube-storage-version-migrator-operator-b67b599dd-n2zjt\" (UID: \"7c19d2ab-a061-40f1-99dd-d200cc62ba4d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n2zjt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.656914 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdmsb\" (UniqueName: \"kubernetes.io/projected/5669ff9b-1b24-44d8-a86d-963170a76dee-kube-api-access-vdmsb\") pod \"multus-admission-controller-857f4d67dd-cxbbs\" (UID: \"5669ff9b-1b24-44d8-a86d-963170a76dee\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cxbbs" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.656940 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bb035988-397f-4da3-bc49-d38089014453-plugins-dir\") pod \"csi-hostpathplugin-v7w9z\" 
(UID: \"bb035988-397f-4da3-bc49-d38089014453\") " pod="hostpath-provisioner/csi-hostpathplugin-v7w9z" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.656982 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdpcd\" (UniqueName: \"kubernetes.io/projected/e7df357a-6928-4618-b6fb-6a99bb306668-kube-api-access-mdpcd\") pod \"package-server-manager-789f6589d5-6sclc\" (UID: \"e7df357a-6928-4618-b6fb-6a99bb306668\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6sclc" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.657026 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c19d2ab-a061-40f1-99dd-d200cc62ba4d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-n2zjt\" (UID: \"7c19d2ab-a061-40f1-99dd-d200cc62ba4d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n2zjt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.657052 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/470110ba-97b8-4d8f-a8da-0df16cd7abed-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.657079 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6fea1263-b938-4c34-a279-6e5391b768bf-profile-collector-cert\") pod \"catalog-operator-68c6474976-lxtks\" (UID: \"6fea1263-b938-4c34-a279-6e5391b768bf\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lxtks" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.657120 4975 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/470110ba-97b8-4d8f-a8da-0df16cd7abed-registry-certificates\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.657147 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f4e1534c-f64b-448a-9b81-7c4192c089f3-default-certificate\") pod \"router-default-5444994796-78ldn\" (UID: \"f4e1534c-f64b-448a-9b81-7c4192c089f3\") " pod="openshift-ingress/router-default-5444994796-78ldn" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.657190 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5669ff9b-1b24-44d8-a86d-963170a76dee-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-cxbbs\" (UID: \"5669ff9b-1b24-44d8-a86d-963170a76dee\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cxbbs" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.657219 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/470110ba-97b8-4d8f-a8da-0df16cd7abed-trusted-ca\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.657241 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79bb26ca-fc97-4b59-ab56-12637c684208-config-volume\") pod \"collect-profiles-29563920-p22bn\" (UID: \"79bb26ca-fc97-4b59-ab56-12637c684208\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-p22bn" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.657276 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/191ec755-6e3c-4fba-8b70-b81d3e414b17-srv-cert\") pod \"olm-operator-6b444d44fb-ws7cz\" (UID: \"191ec755-6e3c-4fba-8b70-b81d3e414b17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ws7cz" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.657297 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bb035988-397f-4da3-bc49-d38089014453-csi-data-dir\") pod \"csi-hostpathplugin-v7w9z\" (UID: \"bb035988-397f-4da3-bc49-d38089014453\") " pod="hostpath-provisioner/csi-hostpathplugin-v7w9z" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.657362 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25214ad5-3dea-44fe-8dfc-75b877582f7e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-knn48\" (UID: \"25214ad5-3dea-44fe-8dfc-75b877582f7e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-knn48" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.657387 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b28d367-94a7-4f45-ba2f-86110b4e6a2e-serving-cert\") pod \"etcd-operator-b45778765-dn59g\" (UID: \"5b28d367-94a7-4f45-ba2f-86110b4e6a2e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dn59g" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.657410 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5b28d367-94a7-4f45-ba2f-86110b4e6a2e-etcd-ca\") pod \"etcd-operator-b45778765-dn59g\" (UID: 
\"5b28d367-94a7-4f45-ba2f-86110b4e6a2e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dn59g" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.657435 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/84b8a568-0fdf-42f7-ba14-f917320d7505-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jbhj2\" (UID: \"84b8a568-0fdf-42f7-ba14-f917320d7505\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jbhj2" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.657493 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzlnf\" (UniqueName: \"kubernetes.io/projected/79bb26ca-fc97-4b59-ab56-12637c684208-kube-api-access-rzlnf\") pod \"collect-profiles-29563920-p22bn\" (UID: \"79bb26ca-fc97-4b59-ab56-12637c684208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-p22bn" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.657519 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7df357a-6928-4618-b6fb-6a99bb306668-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6sclc\" (UID: \"e7df357a-6928-4618-b6fb-6a99bb306668\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6sclc" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.657545 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/470110ba-97b8-4d8f-a8da-0df16cd7abed-registry-tls\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.657569 4975 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pj2z\" (UniqueName: \"kubernetes.io/projected/c35d61c3-defe-489a-96f3-c649240e6f9f-kube-api-access-7pj2z\") pod \"machine-config-operator-74547568cd-b4zf9\" (UID: \"c35d61c3-defe-489a-96f3-c649240e6f9f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4zf9" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.657595 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7mzx\" (UniqueName: \"kubernetes.io/projected/5ac220a8-d5aa-4352-b2ce-e2d911aee948-kube-api-access-j7mzx\") pod \"machine-config-server-ww7rx\" (UID: \"5ac220a8-d5aa-4352-b2ce-e2d911aee948\") " pod="openshift-machine-config-operator/machine-config-server-ww7rx" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.657622 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/25214ad5-3dea-44fe-8dfc-75b877582f7e-trusted-ca\") pod \"ingress-operator-5b745b69d9-knn48\" (UID: \"25214ad5-3dea-44fe-8dfc-75b877582f7e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-knn48" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.657650 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djxdj\" (UniqueName: \"kubernetes.io/projected/363092a3-1f52-4b92-8369-8aecab622c7e-kube-api-access-djxdj\") pod \"service-ca-9c57cc56f-xth7t\" (UID: \"363092a3-1f52-4b92-8369-8aecab622c7e\") " pod="openshift-service-ca/service-ca-9c57cc56f-xth7t" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.657678 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4e1534c-f64b-448a-9b81-7c4192c089f3-metrics-certs\") pod \"router-default-5444994796-78ldn\" (UID: 
\"f4e1534c-f64b-448a-9b81-7c4192c089f3\") " pod="openshift-ingress/router-default-5444994796-78ldn" Mar 18 12:13:50 crc kubenswrapper[4975]: E0318 12:13:50.658595 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:51.158581798 +0000 UTC m=+216.872982377 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.659356 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c406d38a-d0b9-4f55-9883-78805191b8b3-config\") pod \"kube-apiserver-operator-766d6c64bb-k27ll\" (UID: \"c406d38a-d0b9-4f55-9883-78805191b8b3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k27ll" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.659505 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c35d61c3-defe-489a-96f3-c649240e6f9f-images\") pod \"machine-config-operator-74547568cd-b4zf9\" (UID: \"c35d61c3-defe-489a-96f3-c649240e6f9f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4zf9" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.656541 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/d989095d-7ce2-4dd7-ac9e-5c747e900a61-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q2bgt\" (UID: \"d989095d-7ce2-4dd7-ac9e-5c747e900a61\") " pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.674170 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25214ad5-3dea-44fe-8dfc-75b877582f7e-metrics-tls\") pod \"ingress-operator-5b745b69d9-knn48\" (UID: \"25214ad5-3dea-44fe-8dfc-75b877582f7e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-knn48" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.674228 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d989095d-7ce2-4dd7-ac9e-5c747e900a61-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q2bgt\" (UID: \"d989095d-7ce2-4dd7-ac9e-5c747e900a61\") " pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.674572 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7793c57-4ff7-48a0-8913-46537e9ec353-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6p52g\" (UID: \"a7793c57-4ff7-48a0-8913-46537e9ec353\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6p52g" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.675073 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5b28d367-94a7-4f45-ba2f-86110b4e6a2e-etcd-client\") pod \"etcd-operator-b45778765-dn59g\" (UID: \"5b28d367-94a7-4f45-ba2f-86110b4e6a2e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dn59g" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.675085 4975 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a54deeb5-bea0-4f51-aa4e-07df30bbf228-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bkb4h\" (UID: \"a54deeb5-bea0-4f51-aa4e-07df30bbf228\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkb4h" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.675107 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4e1534c-f64b-448a-9b81-7c4192c089f3-metrics-certs\") pod \"router-default-5444994796-78ldn\" (UID: \"f4e1534c-f64b-448a-9b81-7c4192c089f3\") " pod="openshift-ingress/router-default-5444994796-78ldn" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.676509 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/470110ba-97b8-4d8f-a8da-0df16cd7abed-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.678766 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5b28d367-94a7-4f45-ba2f-86110b4e6a2e-etcd-ca\") pod \"etcd-operator-b45778765-dn59g\" (UID: \"5b28d367-94a7-4f45-ba2f-86110b4e6a2e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dn59g" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.679406 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5b28d367-94a7-4f45-ba2f-86110b4e6a2e-etcd-service-ca\") pod \"etcd-operator-b45778765-dn59g\" (UID: \"5b28d367-94a7-4f45-ba2f-86110b4e6a2e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dn59g" 
Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.680416 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c19d2ab-a061-40f1-99dd-d200cc62ba4d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-n2zjt\" (UID: \"7c19d2ab-a061-40f1-99dd-d200cc62ba4d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n2zjt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.682834 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c35d61c3-defe-489a-96f3-c649240e6f9f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-b4zf9\" (UID: \"c35d61c3-defe-489a-96f3-c649240e6f9f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4zf9" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.683608 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/84b8a568-0fdf-42f7-ba14-f917320d7505-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jbhj2\" (UID: \"84b8a568-0fdf-42f7-ba14-f917320d7505\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jbhj2" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.683996 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/470110ba-97b8-4d8f-a8da-0df16cd7abed-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:50 crc kubenswrapper[4975]: W0318 12:13:50.688029 4975 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbf4b50a_de39_4b86_b0ca_883ba11d6e4b.slice/crio-ee4a693231b40cc71ac679eb53a60f261761d3ca479b94619bfd9fd90f465350 WatchSource:0}: Error finding container ee4a693231b40cc71ac679eb53a60f261761d3ca479b94619bfd9fd90f465350: Status 404 returned error can't find the container with id ee4a693231b40cc71ac679eb53a60f261761d3ca479b94619bfd9fd90f465350 Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.688080 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/470110ba-97b8-4d8f-a8da-0df16cd7abed-registry-certificates\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.715232 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/191ec755-6e3c-4fba-8b70-b81d3e414b17-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ws7cz\" (UID: \"191ec755-6e3c-4fba-8b70-b81d3e414b17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ws7cz" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.715529 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f4e1534c-f64b-448a-9b81-7c4192c089f3-default-certificate\") pod \"router-default-5444994796-78ldn\" (UID: \"f4e1534c-f64b-448a-9b81-7c4192c089f3\") " pod="openshift-ingress/router-default-5444994796-78ldn" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.715987 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/191ec755-6e3c-4fba-8b70-b81d3e414b17-srv-cert\") pod \"olm-operator-6b444d44fb-ws7cz\" (UID: \"191ec755-6e3c-4fba-8b70-b81d3e414b17\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ws7cz" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.716018 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/470110ba-97b8-4d8f-a8da-0df16cd7abed-trusted-ca\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.716156 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f4e1534c-f64b-448a-9b81-7c4192c089f3-stats-auth\") pod \"router-default-5444994796-78ldn\" (UID: \"f4e1534c-f64b-448a-9b81-7c4192c089f3\") " pod="openshift-ingress/router-default-5444994796-78ldn" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.716236 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/25214ad5-3dea-44fe-8dfc-75b877582f7e-trusted-ca\") pod \"ingress-operator-5b745b69d9-knn48\" (UID: \"25214ad5-3dea-44fe-8dfc-75b877582f7e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-knn48" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.716313 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b28d367-94a7-4f45-ba2f-86110b4e6a2e-serving-cert\") pod \"etcd-operator-b45778765-dn59g\" (UID: \"5b28d367-94a7-4f45-ba2f-86110b4e6a2e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dn59g" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.688608 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/470110ba-97b8-4d8f-a8da-0df16cd7abed-registry-tls\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.716443 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e64e7c1-1812-46e6-bb95-f4d54d0d98f2-serving-cert\") pod \"authentication-operator-69f744f599-7cqtt\" (UID: \"3e64e7c1-1812-46e6-bb95-f4d54d0d98f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7cqtt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.716509 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e64e7c1-1812-46e6-bb95-f4d54d0d98f2-service-ca-bundle\") pod \"authentication-operator-69f744f599-7cqtt\" (UID: \"3e64e7c1-1812-46e6-bb95-f4d54d0d98f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7cqtt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.716856 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c19d2ab-a061-40f1-99dd-d200cc62ba4d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-n2zjt\" (UID: \"7c19d2ab-a061-40f1-99dd-d200cc62ba4d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n2zjt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.717580 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b28d367-94a7-4f45-ba2f-86110b4e6a2e-config\") pod \"etcd-operator-b45778765-dn59g\" (UID: \"5b28d367-94a7-4f45-ba2f-86110b4e6a2e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dn59g" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.717849 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e64e7c1-1812-46e6-bb95-f4d54d0d98f2-config\") 
pod \"authentication-operator-69f744f599-7cqtt\" (UID: \"3e64e7c1-1812-46e6-bb95-f4d54d0d98f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7cqtt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.718505 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e64e7c1-1812-46e6-bb95-f4d54d0d98f2-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7cqtt\" (UID: \"3e64e7c1-1812-46e6-bb95-f4d54d0d98f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7cqtt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.718478 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lt2wh"] Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.722825 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7793c57-4ff7-48a0-8913-46537e9ec353-config\") pod \"kube-controller-manager-operator-78b949d7b-6p52g\" (UID: \"a7793c57-4ff7-48a0-8913-46537e9ec353\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6p52g" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.724141 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4e1534c-f64b-448a-9b81-7c4192c089f3-service-ca-bundle\") pod \"router-default-5444994796-78ldn\" (UID: \"f4e1534c-f64b-448a-9b81-7c4192c089f3\") " pod="openshift-ingress/router-default-5444994796-78ldn" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.760781 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c35d61c3-defe-489a-96f3-c649240e6f9f-proxy-tls\") pod \"machine-config-operator-74547568cd-b4zf9\" (UID: 
\"c35d61c3-defe-489a-96f3-c649240e6f9f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4zf9" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.761799 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt5ng\" (UniqueName: \"kubernetes.io/projected/470110ba-97b8-4d8f-a8da-0df16cd7abed-kube-api-access-vt5ng\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.763050 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.763319 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bb035988-397f-4da3-bc49-d38089014453-mountpoint-dir\") pod \"csi-hostpathplugin-v7w9z\" (UID: \"bb035988-397f-4da3-bc49-d38089014453\") " pod="hostpath-provisioner/csi-hostpathplugin-v7w9z" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.763383 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gttns\" (UniqueName: \"kubernetes.io/projected/fa1ca458-2bd9-4722-9895-08f1744e3cfd-kube-api-access-gttns\") pod \"dns-default-nlw8k\" (UID: \"fa1ca458-2bd9-4722-9895-08f1744e3cfd\") " pod="openshift-dns/dns-default-nlw8k" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.763416 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bb035988-397f-4da3-bc49-d38089014453-plugins-dir\") pod \"csi-hostpathplugin-v7w9z\" 
(UID: \"bb035988-397f-4da3-bc49-d38089014453\") " pod="hostpath-provisioner/csi-hostpathplugin-v7w9z" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.763437 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdpcd\" (UniqueName: \"kubernetes.io/projected/e7df357a-6928-4618-b6fb-6a99bb306668-kube-api-access-mdpcd\") pod \"package-server-manager-789f6589d5-6sclc\" (UID: \"e7df357a-6928-4618-b6fb-6a99bb306668\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6sclc" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.763470 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdmsb\" (UniqueName: \"kubernetes.io/projected/5669ff9b-1b24-44d8-a86d-963170a76dee-kube-api-access-vdmsb\") pod \"multus-admission-controller-857f4d67dd-cxbbs\" (UID: \"5669ff9b-1b24-44d8-a86d-963170a76dee\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cxbbs" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.763494 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6fea1263-b938-4c34-a279-6e5391b768bf-profile-collector-cert\") pod \"catalog-operator-68c6474976-lxtks\" (UID: \"6fea1263-b938-4c34-a279-6e5391b768bf\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lxtks" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.763520 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5669ff9b-1b24-44d8-a86d-963170a76dee-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-cxbbs\" (UID: \"5669ff9b-1b24-44d8-a86d-963170a76dee\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cxbbs" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.763540 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79bb26ca-fc97-4b59-ab56-12637c684208-config-volume\") pod \"collect-profiles-29563920-p22bn\" (UID: \"79bb26ca-fc97-4b59-ab56-12637c684208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-p22bn" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.763571 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bb035988-397f-4da3-bc49-d38089014453-csi-data-dir\") pod \"csi-hostpathplugin-v7w9z\" (UID: \"bb035988-397f-4da3-bc49-d38089014453\") " pod="hostpath-provisioner/csi-hostpathplugin-v7w9z" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.763646 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzlnf\" (UniqueName: \"kubernetes.io/projected/79bb26ca-fc97-4b59-ab56-12637c684208-kube-api-access-rzlnf\") pod \"collect-profiles-29563920-p22bn\" (UID: \"79bb26ca-fc97-4b59-ab56-12637c684208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-p22bn" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.763668 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7df357a-6928-4618-b6fb-6a99bb306668-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6sclc\" (UID: \"e7df357a-6928-4618-b6fb-6a99bb306668\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6sclc" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.763703 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7mzx\" (UniqueName: \"kubernetes.io/projected/5ac220a8-d5aa-4352-b2ce-e2d911aee948-kube-api-access-j7mzx\") pod \"machine-config-server-ww7rx\" (UID: \"5ac220a8-d5aa-4352-b2ce-e2d911aee948\") " 
pod="openshift-machine-config-operator/machine-config-server-ww7rx" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.763726 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djxdj\" (UniqueName: \"kubernetes.io/projected/363092a3-1f52-4b92-8369-8aecab622c7e-kube-api-access-djxdj\") pod \"service-ca-9c57cc56f-xth7t\" (UID: \"363092a3-1f52-4b92-8369-8aecab622c7e\") " pod="openshift-service-ca/service-ca-9c57cc56f-xth7t" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.763752 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6fea1263-b938-4c34-a279-6e5391b768bf-srv-cert\") pod \"catalog-operator-68c6474976-lxtks\" (UID: \"6fea1263-b938-4c34-a279-6e5391b768bf\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lxtks" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.764853 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bb035988-397f-4da3-bc49-d38089014453-plugins-dir\") pod \"csi-hostpathplugin-v7w9z\" (UID: \"bb035988-397f-4da3-bc49-d38089014453\") " pod="hostpath-provisioner/csi-hostpathplugin-v7w9z" Mar 18 12:13:50 crc kubenswrapper[4975]: E0318 12:13:50.765096 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:51.265077352 +0000 UTC m=+216.979477931 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.765140 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bb035988-397f-4da3-bc49-d38089014453-mountpoint-dir\") pod \"csi-hostpathplugin-v7w9z\" (UID: \"bb035988-397f-4da3-bc49-d38089014453\") " pod="hostpath-provisioner/csi-hostpathplugin-v7w9z" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.766173 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/470110ba-97b8-4d8f-a8da-0df16cd7abed-bound-sa-token\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.768810 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6fea1263-b938-4c34-a279-6e5391b768bf-profile-collector-cert\") pod \"catalog-operator-68c6474976-lxtks\" (UID: \"6fea1263-b938-4c34-a279-6e5391b768bf\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lxtks" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.768775 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bb035988-397f-4da3-bc49-d38089014453-csi-data-dir\") pod \"csi-hostpathplugin-v7w9z\" (UID: \"bb035988-397f-4da3-bc49-d38089014453\") " 
pod="hostpath-provisioner/csi-hostpathplugin-v7w9z" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.769190 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f59058bc-4678-4c59-b93b-d9af75ff6a7a-apiservice-cert\") pod \"packageserver-d55dfcdfc-r26rl\" (UID: \"f59058bc-4678-4c59-b93b-d9af75ff6a7a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r26rl" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.769225 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bb035988-397f-4da3-bc49-d38089014453-registration-dir\") pod \"csi-hostpathplugin-v7w9z\" (UID: \"bb035988-397f-4da3-bc49-d38089014453\") " pod="hostpath-provisioner/csi-hostpathplugin-v7w9z" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.769272 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f59058bc-4678-4c59-b93b-d9af75ff6a7a-tmpfs\") pod \"packageserver-d55dfcdfc-r26rl\" (UID: \"f59058bc-4678-4c59-b93b-d9af75ff6a7a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r26rl" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.769303 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmfts\" (UniqueName: \"kubernetes.io/projected/f59058bc-4678-4c59-b93b-d9af75ff6a7a-kube-api-access-vmfts\") pod \"packageserver-d55dfcdfc-r26rl\" (UID: \"f59058bc-4678-4c59-b93b-d9af75ff6a7a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r26rl" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.769330 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsblf\" (UniqueName: \"kubernetes.io/projected/4d06a6f7-77a4-437d-8e9e-f1e9b7252a42-kube-api-access-vsblf\") pod \"ingress-canary-lzvfs\" 
(UID: \"4d06a6f7-77a4-437d-8e9e-f1e9b7252a42\") " pod="openshift-ingress-canary/ingress-canary-lzvfs" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.769360 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6fz9\" (UniqueName: \"kubernetes.io/projected/6fea1263-b938-4c34-a279-6e5391b768bf-kube-api-access-r6fz9\") pod \"catalog-operator-68c6474976-lxtks\" (UID: \"6fea1263-b938-4c34-a279-6e5391b768bf\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lxtks" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.769387 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-627kp\" (UniqueName: \"kubernetes.io/projected/bb035988-397f-4da3-bc49-d38089014453-kube-api-access-627kp\") pod \"csi-hostpathplugin-v7w9z\" (UID: \"bb035988-397f-4da3-bc49-d38089014453\") " pod="hostpath-provisioner/csi-hostpathplugin-v7w9z" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.769408 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/363092a3-1f52-4b92-8369-8aecab622c7e-signing-key\") pod \"service-ca-9c57cc56f-xth7t\" (UID: \"363092a3-1f52-4b92-8369-8aecab622c7e\") " pod="openshift-service-ca/service-ca-9c57cc56f-xth7t" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.769454 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.769480 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/5ac220a8-d5aa-4352-b2ce-e2d911aee948-node-bootstrap-token\") pod \"machine-config-server-ww7rx\" (UID: \"5ac220a8-d5aa-4352-b2ce-e2d911aee948\") " pod="openshift-machine-config-operator/machine-config-server-ww7rx" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.769521 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5669ff9b-1b24-44d8-a86d-963170a76dee-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-cxbbs\" (UID: \"5669ff9b-1b24-44d8-a86d-963170a76dee\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cxbbs" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.769526 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f59058bc-4678-4c59-b93b-d9af75ff6a7a-webhook-cert\") pod \"packageserver-d55dfcdfc-r26rl\" (UID: \"f59058bc-4678-4c59-b93b-d9af75ff6a7a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r26rl" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.769585 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bb035988-397f-4da3-bc49-d38089014453-socket-dir\") pod \"csi-hostpathplugin-v7w9z\" (UID: \"bb035988-397f-4da3-bc49-d38089014453\") " pod="hostpath-provisioner/csi-hostpathplugin-v7w9z" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.769646 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/79bb26ca-fc97-4b59-ab56-12637c684208-secret-volume\") pod \"collect-profiles-29563920-p22bn\" (UID: \"79bb26ca-fc97-4b59-ab56-12637c684208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-p22bn" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.769694 4975 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/363092a3-1f52-4b92-8369-8aecab622c7e-signing-cabundle\") pod \"service-ca-9c57cc56f-xth7t\" (UID: \"363092a3-1f52-4b92-8369-8aecab622c7e\") " pod="openshift-service-ca/service-ca-9c57cc56f-xth7t" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.769748 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d06a6f7-77a4-437d-8e9e-f1e9b7252a42-cert\") pod \"ingress-canary-lzvfs\" (UID: \"4d06a6f7-77a4-437d-8e9e-f1e9b7252a42\") " pod="openshift-ingress-canary/ingress-canary-lzvfs" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.769780 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79bb26ca-fc97-4b59-ab56-12637c684208-config-volume\") pod \"collect-profiles-29563920-p22bn\" (UID: \"79bb26ca-fc97-4b59-ab56-12637c684208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-p22bn" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.769788 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa1ca458-2bd9-4722-9895-08f1744e3cfd-config-volume\") pod \"dns-default-nlw8k\" (UID: \"fa1ca458-2bd9-4722-9895-08f1744e3cfd\") " pod="openshift-dns/dns-default-nlw8k" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.769813 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa1ca458-2bd9-4722-9895-08f1744e3cfd-metrics-tls\") pod \"dns-default-nlw8k\" (UID: \"fa1ca458-2bd9-4722-9895-08f1744e3cfd\") " pod="openshift-dns/dns-default-nlw8k" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.769841 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/5ac220a8-d5aa-4352-b2ce-e2d911aee948-certs\") pod \"machine-config-server-ww7rx\" (UID: \"5ac220a8-d5aa-4352-b2ce-e2d911aee948\") " pod="openshift-machine-config-operator/machine-config-server-ww7rx" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.770448 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c406d38a-d0b9-4f55-9883-78805191b8b3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-k27ll\" (UID: \"c406d38a-d0b9-4f55-9883-78805191b8b3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k27ll" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.770602 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/363092a3-1f52-4b92-8369-8aecab622c7e-signing-cabundle\") pod \"service-ca-9c57cc56f-xth7t\" (UID: \"363092a3-1f52-4b92-8369-8aecab622c7e\") " pod="openshift-service-ca/service-ca-9c57cc56f-xth7t" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.770680 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bb035988-397f-4da3-bc49-d38089014453-socket-dir\") pod \"csi-hostpathplugin-v7w9z\" (UID: \"bb035988-397f-4da3-bc49-d38089014453\") " pod="hostpath-provisioner/csi-hostpathplugin-v7w9z" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.773489 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7df357a-6928-4618-b6fb-6a99bb306668-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6sclc\" (UID: \"e7df357a-6928-4618-b6fb-6a99bb306668\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6sclc" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.773649 4975 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bb035988-397f-4da3-bc49-d38089014453-registration-dir\") pod \"csi-hostpathplugin-v7w9z\" (UID: \"bb035988-397f-4da3-bc49-d38089014453\") " pod="hostpath-provisioner/csi-hostpathplugin-v7w9z" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.774017 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f59058bc-4678-4c59-b93b-d9af75ff6a7a-tmpfs\") pod \"packageserver-d55dfcdfc-r26rl\" (UID: \"f59058bc-4678-4c59-b93b-d9af75ff6a7a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r26rl" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.775012 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/79bb26ca-fc97-4b59-ab56-12637c684208-secret-volume\") pod \"collect-profiles-29563920-p22bn\" (UID: \"79bb26ca-fc97-4b59-ab56-12637c684208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-p22bn" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.777050 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5ac220a8-d5aa-4352-b2ce-e2d911aee948-certs\") pod \"machine-config-server-ww7rx\" (UID: \"5ac220a8-d5aa-4352-b2ce-e2d911aee948\") " pod="openshift-machine-config-operator/machine-config-server-ww7rx" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.779914 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb6wr\" (UniqueName: \"kubernetes.io/projected/25214ad5-3dea-44fe-8dfc-75b877582f7e-kube-api-access-xb6wr\") pod \"ingress-operator-5b745b69d9-knn48\" (UID: \"25214ad5-3dea-44fe-8dfc-75b877582f7e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-knn48" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.780544 4975 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f59058bc-4678-4c59-b93b-d9af75ff6a7a-apiservice-cert\") pod \"packageserver-d55dfcdfc-r26rl\" (UID: \"f59058bc-4678-4c59-b93b-d9af75ff6a7a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r26rl" Mar 18 12:13:50 crc kubenswrapper[4975]: E0318 12:13:50.781088 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:51.281070288 +0000 UTC m=+216.995470927 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.782751 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f59058bc-4678-4c59-b93b-d9af75ff6a7a-webhook-cert\") pod \"packageserver-d55dfcdfc-r26rl\" (UID: \"f59058bc-4678-4c59-b93b-d9af75ff6a7a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r26rl" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.786082 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5ac220a8-d5aa-4352-b2ce-e2d911aee948-node-bootstrap-token\") pod \"machine-config-server-ww7rx\" (UID: \"5ac220a8-d5aa-4352-b2ce-e2d911aee948\") " pod="openshift-machine-config-operator/machine-config-server-ww7rx" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.786510 4975 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d06a6f7-77a4-437d-8e9e-f1e9b7252a42-cert\") pod \"ingress-canary-lzvfs\" (UID: \"4d06a6f7-77a4-437d-8e9e-f1e9b7252a42\") " pod="openshift-ingress-canary/ingress-canary-lzvfs" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.786725 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25214ad5-3dea-44fe-8dfc-75b877582f7e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-knn48\" (UID: \"25214ad5-3dea-44fe-8dfc-75b877582f7e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-knn48" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.786844 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa1ca458-2bd9-4722-9895-08f1744e3cfd-config-volume\") pod \"dns-default-nlw8k\" (UID: \"fa1ca458-2bd9-4722-9895-08f1744e3cfd\") " pod="openshift-dns/dns-default-nlw8k" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.787209 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/363092a3-1f52-4b92-8369-8aecab622c7e-signing-key\") pod \"service-ca-9c57cc56f-xth7t\" (UID: \"363092a3-1f52-4b92-8369-8aecab622c7e\") " pod="openshift-service-ca/service-ca-9c57cc56f-xth7t" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.788007 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa1ca458-2bd9-4722-9895-08f1744e3cfd-metrics-tls\") pod \"dns-default-nlw8k\" (UID: \"fa1ca458-2bd9-4722-9895-08f1744e3cfd\") " pod="openshift-dns/dns-default-nlw8k" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.789696 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/6fea1263-b938-4c34-a279-6e5391b768bf-srv-cert\") pod \"catalog-operator-68c6474976-lxtks\" (UID: \"6fea1263-b938-4c34-a279-6e5391b768bf\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lxtks" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.792700 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqvgv\" (UniqueName: \"kubernetes.io/projected/3e64e7c1-1812-46e6-bb95-f4d54d0d98f2-kube-api-access-lqvgv\") pod \"authentication-operator-69f744f599-7cqtt\" (UID: \"3e64e7c1-1812-46e6-bb95-f4d54d0d98f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7cqtt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.844321 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7cqtt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.862806 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s8mv\" (UniqueName: \"kubernetes.io/projected/191ec755-6e3c-4fba-8b70-b81d3e414b17-kube-api-access-6s8mv\") pod \"olm-operator-6b444d44fb-ws7cz\" (UID: \"191ec755-6e3c-4fba-8b70-b81d3e414b17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ws7cz" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.864029 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9hzv\" (UniqueName: \"kubernetes.io/projected/bb5204e4-c110-485b-8627-807fdb7f4c27-kube-api-access-r9hzv\") pod \"auto-csr-approver-29563932-9m82r\" (UID: \"bb5204e4-c110-485b-8627-807fdb7f4c27\") " pod="openshift-infra/auto-csr-approver-29563932-9m82r" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.865280 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pj2z\" (UniqueName: \"kubernetes.io/projected/c35d61c3-defe-489a-96f3-c649240e6f9f-kube-api-access-7pj2z\") 
pod \"machine-config-operator-74547568cd-b4zf9\" (UID: \"c35d61c3-defe-489a-96f3-c649240e6f9f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4zf9" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.870398 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:50 crc kubenswrapper[4975]: E0318 12:13:50.871044 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:51.371022881 +0000 UTC m=+217.085423470 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.876200 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563932-9m82r" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.878330 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5bf2w"] Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.881062 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffcng\" (UniqueName: \"kubernetes.io/projected/d989095d-7ce2-4dd7-ac9e-5c747e900a61-kube-api-access-ffcng\") pod \"marketplace-operator-79b997595-q2bgt\" (UID: \"d989095d-7ce2-4dd7-ac9e-5c747e900a61\") " pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.891101 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7hq4\" (UniqueName: \"kubernetes.io/projected/a54deeb5-bea0-4f51-aa4e-07df30bbf228-kube-api-access-p7hq4\") pod \"control-plane-machine-set-operator-78cbb6b69f-bkb4h\" (UID: \"a54deeb5-bea0-4f51-aa4e-07df30bbf228\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkb4h" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.891315 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.912593 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skl4k\" (UniqueName: \"kubernetes.io/projected/5b28d367-94a7-4f45-ba2f-86110b4e6a2e-kube-api-access-skl4k\") pod \"etcd-operator-b45778765-dn59g\" (UID: \"5b28d367-94a7-4f45-ba2f-86110b4e6a2e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dn59g" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.913920 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9cnqk"] Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.914385 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ws7cz" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.931123 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4zf9" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.936716 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7793c57-4ff7-48a0-8913-46537e9ec353-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-6p52g\" (UID: \"a7793c57-4ff7-48a0-8913-46537e9ec353\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6p52g" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.942198 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lt2wh" event={"ID":"b2b5a18f-c7da-47d4-b3a3-2a3917e63c89","Type":"ContainerStarted","Data":"4a68f78284259091aa51b81f0c8582fcc673144f4c12d24542ba7420651e3f96"} Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.947038 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-d4zht" event={"ID":"36e42dcf-4953-46b4-8459-e2e72e03895c","Type":"ContainerStarted","Data":"398546115ff367773c0f5d3553ce683ad2bb9966a9f2ff100021276bf6edf54e"} Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.947102 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-d4zht" event={"ID":"36e42dcf-4953-46b4-8459-e2e72e03895c","Type":"ContainerStarted","Data":"71c1f8c3b31e8c470ac6423db63124e0adc59b7bd6f0d08ba50115af8b4ca798"} Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.948763 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-d4zht" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.950612 4975 patch_prober.go:28] interesting pod/downloads-7954f5f757-d4zht container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.950659 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d4zht" podUID="36e42dcf-4953-46b4-8459-e2e72e03895c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.951038 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" event={"ID":"eabde416-404d-4874-b690-53897068b5cd","Type":"ContainerStarted","Data":"a1f8b4faca317f0b264c061da40da611e81098a03495ed20a3e03c006e3ae07b"} Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.965922 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pll29\" (UniqueName: \"kubernetes.io/projected/f4e1534c-f64b-448a-9b81-7c4192c089f3-kube-api-access-pll29\") pod \"router-default-5444994796-78ldn\" (UID: \"f4e1534c-f64b-448a-9b81-7c4192c089f3\") " pod="openshift-ingress/router-default-5444994796-78ldn" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.966702 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-m27h4" event={"ID":"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b","Type":"ContainerStarted","Data":"ee4a693231b40cc71ac679eb53a60f261761d3ca479b94619bfd9fd90f465350"} Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.969442 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c406d38a-d0b9-4f55-9883-78805191b8b3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-k27ll\" (UID: \"c406d38a-d0b9-4f55-9883-78805191b8b3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k27ll" Mar 18 12:13:50 crc 
kubenswrapper[4975]: I0318 12:13:50.969551 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m2m4r" event={"ID":"14b26921-b7c6-4f20-86af-abb1e8eb339e","Type":"ContainerStarted","Data":"0137744b8e1c08f1f5148c2fa28e342096b773da9186580a42615a5bb7104828"} Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.969587 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m2m4r" event={"ID":"14b26921-b7c6-4f20-86af-abb1e8eb339e","Type":"ContainerStarted","Data":"737cd7913eefdd7fafebfd1e8bc4880b98400d99519bfbb95a322aabdab04520"} Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.971761 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" event={"ID":"ac556a39-897a-44fe-b537-bdaf85c3f437","Type":"ContainerStarted","Data":"20f646e3a520e0cb658ea89ed1721d9929eacf1e06f492bd0a93429bddb11ef2"} Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.971797 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" event={"ID":"ac556a39-897a-44fe-b537-bdaf85c3f437","Type":"ContainerStarted","Data":"72110e8a1ee3e19ff58e3dbe5fd242a608f2627369956ac78a56d3d523cd5155"} Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.971970 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:50 crc kubenswrapper[4975]: E0318 12:13:50.972303 4975 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:51.472291608 +0000 UTC m=+217.186692187 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.972311 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.973249 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5" event={"ID":"5c822ed4-6611-4cef-8002-972e9782d403","Type":"ContainerStarted","Data":"9ec6ec8989ba1c47da0540b033117f1339a0d45fb29c489b13ef82e926556aa3"} Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.973280 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5" event={"ID":"5c822ed4-6611-4cef-8002-972e9782d403","Type":"ContainerStarted","Data":"2f915afce75dd29deb2e2fedd8150b04418ab12947377011bb0a7e319947b426"} Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.973696 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.977474 4975 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-ftkrg container/controller-manager 
namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.977517 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" podUID="ac556a39-897a-44fe-b537-bdaf85c3f437" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.977475 4975 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-j56l5 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.977570 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5" podUID="5c822ed4-6611-4cef-8002-972e9782d403" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.978196 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xrd7l" event={"ID":"318ad1a7-abe7-4e8d-bf62-cec22711b081","Type":"ContainerStarted","Data":"7b7de06605e868e0f185dc5382918356fb0a85b97cac301870e6431c11df276b"} Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.978234 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xrd7l" 
event={"ID":"318ad1a7-abe7-4e8d-bf62-cec22711b081","Type":"ContainerStarted","Data":"c2ac6c274b710e31bc663917c84dfb1ed88558d9ff8f2352108bd0b4588daed5"} Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.981190 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qhjb2" event={"ID":"1a468175-a610-4d65-8ca9-a22f91d8d3fc","Type":"ContainerStarted","Data":"f8de183795588b63678ccb086c0d92d8f5d9a635c6fc386528e2659ebc9d1b34"} Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.985177 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-6hg9j" event={"ID":"22d232b9-7867-4587-9c0b-d6adba1cd8bd","Type":"ContainerStarted","Data":"861d7ea4a1ec7c07a096cfbb4426a05937cc5f514b3db6713e79b294896c31b8"} Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.985221 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-6hg9j" event={"ID":"22d232b9-7867-4587-9c0b-d6adba1cd8bd","Type":"ContainerStarted","Data":"b60f3646916cf6359eb56edcabcc4544bc07c961f643b3805ee44b959bbed43d"} Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.986755 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" event={"ID":"14d667ef-8c80-42f5-b119-1bae87e39be7","Type":"ContainerStarted","Data":"23d23ca8beff4fc3bea616878d12c4925791e02ed9ab2c28e65b83c63e82c90c"} Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.991139 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44ckr\" (UniqueName: \"kubernetes.io/projected/63db2c71-f81f-4ae8-90b0-0d45e0593119-kube-api-access-44ckr\") pod \"migrator-59844c95c7-dtv2h\" (UID: \"63db2c71-f81f-4ae8-90b0-0d45e0593119\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dtv2h" Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.995000 4975 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-console/console-f9d7485db-z69nv" event={"ID":"311fa18b-fde1-4390-9682-75c836813f88","Type":"ContainerStarted","Data":"3012d584f83eb6aafb3fef46800471cc0de67f79de71722efff466d66f5b5c5c"} Mar 18 12:13:50 crc kubenswrapper[4975]: I0318 12:13:50.995053 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-z69nv" event={"ID":"311fa18b-fde1-4390-9682-75c836813f88","Type":"ContainerStarted","Data":"6e9368fcebd49c4ca14fd16f23260321c87a809486ce39310f1efa5b61c5e119"} Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.006083 4975 generic.go:334] "Generic (PLEG): container finished" podID="1d1be24c-eb6f-4df4-812d-491ea940ee60" containerID="51eb4948e004036a89e84f77d50648b92ba9e15533ff974d1d8ea62c27502b1d" exitCode=0 Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.006708 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l2jhn" event={"ID":"1d1be24c-eb6f-4df4-812d-491ea940ee60","Type":"ContainerDied","Data":"51eb4948e004036a89e84f77d50648b92ba9e15533ff974d1d8ea62c27502b1d"} Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.006738 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l2jhn" event={"ID":"1d1be24c-eb6f-4df4-812d-491ea940ee60","Type":"ContainerStarted","Data":"07bd8f53a68b45669e555f3516e0b3b2efc36b84add4b84c66714474f13e39ef"} Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.007383 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-clfc4"] Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.011292 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s9cf\" (UniqueName: \"kubernetes.io/projected/84b8a568-0fdf-42f7-ba14-f917320d7505-kube-api-access-9s9cf\") pod 
\"cluster-samples-operator-665b6dd947-jbhj2\" (UID: \"84b8a568-0fdf-42f7-ba14-f917320d7505\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jbhj2" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.011653 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-5bf2w" event={"ID":"865e4345-8c11-4402-b673-93658fe66ced","Type":"ContainerStarted","Data":"1b56d6c4ef3fcbaf46fb2959995e7350b6f8e8dac2347efa1e262bc7c75c4a47"} Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.030123 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhhvq\" (UniqueName: \"kubernetes.io/projected/7c19d2ab-a061-40f1-99dd-d200cc62ba4d-kube-api-access-qhhvq\") pod \"kube-storage-version-migrator-operator-b67b599dd-n2zjt\" (UID: \"7c19d2ab-a061-40f1-99dd-d200cc62ba4d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n2zjt" Mar 18 12:13:51 crc kubenswrapper[4975]: W0318 12:13:51.038304 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03aa4bc1_7712_418f_b56d_9686e85ba1d2.slice/crio-0f33a123a434458dd7029a7ae0ec48572946b16c4d88bd77d3b0fcbc5618d3ab WatchSource:0}: Error finding container 0f33a123a434458dd7029a7ae0ec48572946b16c4d88bd77d3b0fcbc5618d3ab: Status 404 returned error can't find the container with id 0f33a123a434458dd7029a7ae0ec48572946b16c4d88bd77d3b0fcbc5618d3ab Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.083439 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:51 crc kubenswrapper[4975]: E0318 12:13:51.084188 
4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:51.584173232 +0000 UTC m=+217.298573811 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.087409 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-knn48" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.087983 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6p52g" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.099427 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-78ldn" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.099896 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jbhj2" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.107665 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-dn59g" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.123260 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n2zjt" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.125598 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gttns\" (UniqueName: \"kubernetes.io/projected/fa1ca458-2bd9-4722-9895-08f1744e3cfd-kube-api-access-gttns\") pod \"dns-default-nlw8k\" (UID: \"fa1ca458-2bd9-4722-9895-08f1744e3cfd\") " pod="openshift-dns/dns-default-nlw8k" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.126074 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdpcd\" (UniqueName: \"kubernetes.io/projected/e7df357a-6928-4618-b6fb-6a99bb306668-kube-api-access-mdpcd\") pod \"package-server-manager-789f6589d5-6sclc\" (UID: \"e7df357a-6928-4618-b6fb-6a99bb306668\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6sclc" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.135126 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdmsb\" (UniqueName: \"kubernetes.io/projected/5669ff9b-1b24-44d8-a86d-963170a76dee-kube-api-access-vdmsb\") pod \"multus-admission-controller-857f4d67dd-cxbbs\" (UID: \"5669ff9b-1b24-44d8-a86d-963170a76dee\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cxbbs" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.135388 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wrn8m"] Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.140605 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djxdj\" (UniqueName: \"kubernetes.io/projected/363092a3-1f52-4b92-8369-8aecab622c7e-kube-api-access-djxdj\") pod \"service-ca-9c57cc56f-xth7t\" (UID: \"363092a3-1f52-4b92-8369-8aecab622c7e\") " pod="openshift-service-ca/service-ca-9c57cc56f-xth7t" Mar 18 
12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.143601 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkb4h" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.152300 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7mzx\" (UniqueName: \"kubernetes.io/projected/5ac220a8-d5aa-4352-b2ce-e2d911aee948-kube-api-access-j7mzx\") pod \"machine-config-server-ww7rx\" (UID: \"5ac220a8-d5aa-4352-b2ce-e2d911aee948\") " pod="openshift-machine-config-operator/machine-config-server-ww7rx" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.152558 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k27ll" Mar 18 12:13:51 crc kubenswrapper[4975]: W0318 12:13:51.177124 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb814c1a_efa2_4653_8adf_c4d9f5c2e7dc.slice/crio-b967e5883c1ec6c8adbcf48676e9285dee5ba639abf823a62c991e592b3a6c7f WatchSource:0}: Error finding container b967e5883c1ec6c8adbcf48676e9285dee5ba639abf823a62c991e592b3a6c7f: Status 404 returned error can't find the container with id b967e5883c1ec6c8adbcf48676e9285dee5ba639abf823a62c991e592b3a6c7f Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.185699 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dtv2h" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.186317 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ln9d6"] Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.186665 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:51 crc kubenswrapper[4975]: E0318 12:13:51.187088 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:51.687074481 +0000 UTC m=+217.401475060 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.201957 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6fz9\" (UniqueName: \"kubernetes.io/projected/6fea1263-b938-4c34-a279-6e5391b768bf-kube-api-access-r6fz9\") pod \"catalog-operator-68c6474976-lxtks\" (UID: \"6fea1263-b938-4c34-a279-6e5391b768bf\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lxtks" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.213179 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzlnf\" (UniqueName: \"kubernetes.io/projected/79bb26ca-fc97-4b59-ab56-12637c684208-kube-api-access-rzlnf\") pod \"collect-profiles-29563920-p22bn\" (UID: \"79bb26ca-fc97-4b59-ab56-12637c684208\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-p22bn" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.224164 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6sclc" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.229954 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmfts\" (UniqueName: \"kubernetes.io/projected/f59058bc-4678-4c59-b93b-d9af75ff6a7a-kube-api-access-vmfts\") pod \"packageserver-d55dfcdfc-r26rl\" (UID: \"f59058bc-4678-4c59-b93b-d9af75ff6a7a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r26rl" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.238427 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsblf\" (UniqueName: \"kubernetes.io/projected/4d06a6f7-77a4-437d-8e9e-f1e9b7252a42-kube-api-access-vsblf\") pod \"ingress-canary-lzvfs\" (UID: \"4d06a6f7-77a4-437d-8e9e-f1e9b7252a42\") " pod="openshift-ingress-canary/ingress-canary-lzvfs" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.254620 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-627kp\" (UniqueName: \"kubernetes.io/projected/bb035988-397f-4da3-bc49-d38089014453-kube-api-access-627kp\") pod \"csi-hostpathplugin-v7w9z\" (UID: \"bb035988-397f-4da3-bc49-d38089014453\") " pod="hostpath-provisioner/csi-hostpathplugin-v7w9z" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.266903 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-p22bn" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.280296 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-xth7t" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.289463 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-cxbbs" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.294438 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:51 crc kubenswrapper[4975]: E0318 12:13:51.294834 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:51.794814097 +0000 UTC m=+217.509214676 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.355206 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-v7w9z" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.355754 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lxtks" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.364127 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-nlw8k" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.374445 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lzvfs" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.390280 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-ww7rx" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.397412 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:51 crc kubenswrapper[4975]: E0318 12:13:51.400156 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:51.90012826 +0000 UTC m=+217.614528839 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.419353 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q2bgt"] Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.499196 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:51 crc kubenswrapper[4975]: E0318 12:13:51.500144 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:52.000126663 +0000 UTC m=+217.714527242 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.550798 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r26rl" Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.600013 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ws7cz"] Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.600608 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:51 crc kubenswrapper[4975]: E0318 12:13:51.600963 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:52.100948298 +0000 UTC m=+217.815348877 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.617574 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563932-9m82r"] Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.620153 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-knn48"] Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.710407 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:51 crc kubenswrapper[4975]: E0318 12:13:51.714143 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:52.214115046 +0000 UTC m=+217.928515625 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.714310 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:51 crc kubenswrapper[4975]: E0318 12:13:51.714699 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:52.214689891 +0000 UTC m=+217.929090470 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:51 crc kubenswrapper[4975]: W0318 12:13:51.756828 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25214ad5_3dea_44fe_8dfc_75b877582f7e.slice/crio-8da02e809dd6cbc8f71e822d37e021f54e13ec3e6fca1c7d294d3f9c97d3e47a WatchSource:0}: Error finding container 8da02e809dd6cbc8f71e822d37e021f54e13ec3e6fca1c7d294d3f9c97d3e47a: Status 404 returned error can't find the container with id 8da02e809dd6cbc8f71e822d37e021f54e13ec3e6fca1c7d294d3f9c97d3e47a Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.779627 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7cqtt"] Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.795042 4975 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.815078 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:51 crc kubenswrapper[4975]: E0318 12:13:51.815268 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:52.315215678 +0000 UTC m=+218.029616257 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.816308 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:51 crc kubenswrapper[4975]: W0318 12:13:51.816513 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e64e7c1_1812_46e6_bb95_f4d54d0d98f2.slice/crio-281a529f24dc2908d5501da32f5e1a2e0e2402841daeb43f2eaa0b397f0f707e WatchSource:0}: Error finding container 281a529f24dc2908d5501da32f5e1a2e0e2402841daeb43f2eaa0b397f0f707e: Status 404 returned error can't find the container with id 281a529f24dc2908d5501da32f5e1a2e0e2402841daeb43f2eaa0b397f0f707e Mar 18 12:13:51 crc kubenswrapper[4975]: E0318 12:13:51.816603 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:52.316595334 +0000 UTC m=+218.030995913 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.933343 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:51 crc kubenswrapper[4975]: E0318 12:13:51.933659 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:52.433628452 +0000 UTC m=+218.148029031 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:51 crc kubenswrapper[4975]: I0318 12:13:51.984115 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-b4zf9"] Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.009676 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jbhj2"] Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.039027 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:52 crc kubenswrapper[4975]: E0318 12:13:52.039489 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:52.539475809 +0000 UTC m=+218.253876388 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.048676 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n2zjt"] Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.055025 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkb4h"] Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.057498 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6p52g"] Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.086908 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5" podStartSLOduration=168.086884453 podStartE2EDuration="2m48.086884453s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:52.085187029 +0000 UTC m=+217.799587618" watchObservedRunningTime="2026-03-18 12:13:52.086884453 +0000 UTC m=+217.801285032" Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.092593 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9cnqk" 
event={"ID":"b3ffcf02-5ead-4e06-b402-9b48c21f2d36","Type":"ContainerStarted","Data":"666e7975bf8a73703bd008c1ab3213f0af4ba6bd602268c0101b05c4e316eff7"} Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.096288 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-78ldn" event={"ID":"f4e1534c-f64b-448a-9b81-7c4192c089f3","Type":"ContainerStarted","Data":"7e1f4a8ffaaa9fba60418329191daf3ea8e9842e7a620b0dd0dac802fda3cd24"} Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.107838 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ln9d6" event={"ID":"907a3641-9861-4891-a145-a0d36cb413b3","Type":"ContainerStarted","Data":"283e684799d0c4fdc699d437436df4f4fe5f9ff7a1c4c1feaad55a193564944e"} Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.109167 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-clfc4" event={"ID":"03aa4bc1-7712-418f-b56d-9686e85ba1d2","Type":"ContainerStarted","Data":"0f33a123a434458dd7029a7ae0ec48572946b16c4d88bd77d3b0fcbc5618d3ab"} Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.110960 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-5bf2w" event={"ID":"865e4345-8c11-4402-b673-93658fe66ced","Type":"ContainerStarted","Data":"1a046bd217f8e40e8770c300faac9e12e516f0c499780decd0e902244ff4a3f0"} Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.113270 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-knn48" event={"ID":"25214ad5-3dea-44fe-8dfc-75b877582f7e","Type":"ContainerStarted","Data":"8da02e809dd6cbc8f71e822d37e021f54e13ec3e6fca1c7d294d3f9c97d3e47a"} Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.114030 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wrn8m" event={"ID":"bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc","Type":"ContainerStarted","Data":"b967e5883c1ec6c8adbcf48676e9285dee5ba639abf823a62c991e592b3a6c7f"} Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.116529 4975 generic.go:334] "Generic (PLEG): container finished" podID="eabde416-404d-4874-b690-53897068b5cd" containerID="0eb8ddbac265704a2c3ed045631a0e71467fba42e45464c68b0b2f8955375af4" exitCode=0 Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.116964 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" event={"ID":"eabde416-404d-4874-b690-53897068b5cd","Type":"ContainerDied","Data":"0eb8ddbac265704a2c3ed045631a0e71467fba42e45464c68b0b2f8955375af4"} Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.120559 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7cqtt" event={"ID":"3e64e7c1-1812-46e6-bb95-f4d54d0d98f2","Type":"ContainerStarted","Data":"281a529f24dc2908d5501da32f5e1a2e0e2402841daeb43f2eaa0b397f0f707e"} Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.127431 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lt2wh" event={"ID":"b2b5a18f-c7da-47d4-b3a3-2a3917e63c89","Type":"ContainerStarted","Data":"4a7023e880bfc5fa7168346998a95721d684376820232056a5578e8eaaa70ff9"} Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.134956 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-6hg9j" event={"ID":"22d232b9-7867-4587-9c0b-d6adba1cd8bd","Type":"ContainerStarted","Data":"88058a14df5e317ecb7220554f8c6162da322e436ffe577b0f1941125dfd9b59"} Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.137412 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ws7cz" event={"ID":"191ec755-6e3c-4fba-8b70-b81d3e414b17","Type":"ContainerStarted","Data":"7beef1d453a5671ab8bf2df7a720806e22394742838bfc40deacaf0ca7c085a9"} Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.140447 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:52 crc kubenswrapper[4975]: E0318 12:13:52.141406 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:52.641376792 +0000 UTC m=+218.355777371 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.141561 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qhjb2" event={"ID":"1a468175-a610-4d65-8ca9-a22f91d8d3fc","Type":"ContainerStarted","Data":"f43d48934aa6dce753d8f02778cbf369a06cad86a81e3effc4823b748cdb9b54"} Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.142946 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563932-9m82r" event={"ID":"bb5204e4-c110-485b-8627-807fdb7f4c27","Type":"ContainerStarted","Data":"96fd7eedf629658d4a8b223eb781a8a5a7df058e8e45c9131a6056284fa577c6"} Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.154134 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" event={"ID":"14d667ef-8c80-42f5-b119-1bae87e39be7","Type":"ContainerStarted","Data":"5b7e49473cceaec36d25015f5dcfbf3a8cf47274103d7196a3e4b9321efb90e9"} Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.154752 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.195569 4975 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-mg2g2 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" start-of-body= Mar 18 12:13:52 
crc kubenswrapper[4975]: I0318 12:13:52.195619 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" podUID="14d667ef-8c80-42f5-b119-1bae87e39be7" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.200946 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" event={"ID":"d989095d-7ce2-4dd7-ac9e-5c747e900a61","Type":"ContainerStarted","Data":"4d91673f2376af26ae7a909db29444498626cda1fe6e2147e7659906725c9f9a"} Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.251044 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:52 crc kubenswrapper[4975]: E0318 12:13:52.252858 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:52.752845115 +0000 UTC m=+218.467245694 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.267003 4975 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-j56l5 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.267025 4975 patch_prober.go:28] interesting pod/downloads-7954f5f757-d4zht container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.267482 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d4zht" podUID="36e42dcf-4953-46b4-8459-e2e72e03895c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.267072 4975 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-ftkrg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.267541 4975 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" podUID="ac556a39-897a-44fe-b537-bdaf85c3f437" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.267402 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5" podUID="5c822ed4-6611-4cef-8002-972e9782d403" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.318705 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-cxbbs"] Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.327302 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dn59g"] Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.351008 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k27ll"] Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.351811 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:52 crc kubenswrapper[4975]: E0318 12:13:52.351997 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 12:13:52.851978357 +0000 UTC m=+218.566378956 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.352745 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:52 crc kubenswrapper[4975]: E0318 12:13:52.356535 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:52.856518105 +0000 UTC m=+218.570918714 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.433214 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lzvfs"] Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.434724 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6sclc"] Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.437052 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563920-p22bn"] Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.454510 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:52 crc kubenswrapper[4975]: E0318 12:13:52.457240 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:52.957218797 +0000 UTC m=+218.671619376 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:52 crc kubenswrapper[4975]: W0318 12:13:52.464015 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc406d38a_d0b9_4f55_9883_78805191b8b3.slice/crio-0fb00f33ad5775095c4fbb4b3a644b408a13c87aca145a84c210df4c0b6b9184 WatchSource:0}: Error finding container 0fb00f33ad5775095c4fbb4b3a644b408a13c87aca145a84c210df4c0b6b9184: Status 404 returned error can't find the container with id 0fb00f33ad5775095c4fbb4b3a644b408a13c87aca145a84c210df4c0b6b9184 Mar 18 12:13:52 crc kubenswrapper[4975]: W0318 12:13:52.486591 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79bb26ca_fc97_4b59_ab56_12637c684208.slice/crio-d25991203151fa711ede48fda6284c3fc920b7370dad7e942258d28cf002800a WatchSource:0}: Error finding container d25991203151fa711ede48fda6284c3fc920b7370dad7e942258d28cf002800a: Status 404 returned error can't find the container with id d25991203151fa711ede48fda6284c3fc920b7370dad7e942258d28cf002800a Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.495756 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" podStartSLOduration=168.49572987 podStartE2EDuration="2m48.49572987s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 
12:13:52.495399712 +0000 UTC m=+218.209800301" watchObservedRunningTime="2026-03-18 12:13:52.49572987 +0000 UTC m=+218.210130449" Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.551350 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xth7t"] Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.558407 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:52 crc kubenswrapper[4975]: E0318 12:13:52.559196 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:53.059182583 +0000 UTC m=+218.773583162 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.643553 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-d4zht" podStartSLOduration=169.643534269 podStartE2EDuration="2m49.643534269s" podCreationTimestamp="2026-03-18 12:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:52.609995366 +0000 UTC m=+218.324395955" watchObservedRunningTime="2026-03-18 12:13:52.643534269 +0000 UTC m=+218.357934848" Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.660412 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:52 crc kubenswrapper[4975]: E0318 12:13:52.661046 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:53.161025995 +0000 UTC m=+218.875426574 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.731099 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dtv2h"] Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.734523 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-v7w9z"] Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.761415 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:52 crc kubenswrapper[4975]: E0318 12:13:52.761757 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:53.261744438 +0000 UTC m=+218.976145017 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:52 crc kubenswrapper[4975]: W0318 12:13:52.796179 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63db2c71_f81f_4ae8_90b0_0d45e0593119.slice/crio-3e9fd4e2bd43c076df97b49fcf1e87986040f303aaf69ab4689843e49ffc6b23 WatchSource:0}: Error finding container 3e9fd4e2bd43c076df97b49fcf1e87986040f303aaf69ab4689843e49ffc6b23: Status 404 returned error can't find the container with id 3e9fd4e2bd43c076df97b49fcf1e87986040f303aaf69ab4689843e49ffc6b23 Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.809458 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-z69nv" podStartSLOduration=169.80943932 podStartE2EDuration="2m49.80943932s" podCreationTimestamp="2026-03-18 12:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:52.770301101 +0000 UTC m=+218.484701700" watchObservedRunningTime="2026-03-18 12:13:52.80943932 +0000 UTC m=+218.523839899" Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.819093 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r26rl"] Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.820978 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lxtks"] Mar 18 12:13:52 crc kubenswrapper[4975]: 
I0318 12:13:52.823665 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nlw8k"] Mar 18 12:13:52 crc kubenswrapper[4975]: W0318 12:13:52.858304 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf59058bc_4678_4c59_b93b_d9af75ff6a7a.slice/crio-33d7c92895179097a2dccd6ef943e92dadaaecd784fb8be95ae2cc3511651b4a WatchSource:0}: Error finding container 33d7c92895179097a2dccd6ef943e92dadaaecd784fb8be95ae2cc3511651b4a: Status 404 returned error can't find the container with id 33d7c92895179097a2dccd6ef943e92dadaaecd784fb8be95ae2cc3511651b4a Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.862507 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:52 crc kubenswrapper[4975]: E0318 12:13:52.862650 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:53.362630425 +0000 UTC m=+219.077031004 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.862993 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:52 crc kubenswrapper[4975]: E0318 12:13:52.863323 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:53.363312633 +0000 UTC m=+219.077713292 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:52 crc kubenswrapper[4975]: W0318 12:13:52.869724 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa1ca458_2bd9_4722_9895_08f1744e3cfd.slice/crio-eb166dcec7246781ee4a45ec7108661eddeda6f8280bc5d0f570f640c2e15a1f WatchSource:0}: Error finding container eb166dcec7246781ee4a45ec7108661eddeda6f8280bc5d0f570f640c2e15a1f: Status 404 returned error can't find the container with id eb166dcec7246781ee4a45ec7108661eddeda6f8280bc5d0f570f640c2e15a1f Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.885128 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m2m4r" podStartSLOduration=169.88510774 podStartE2EDuration="2m49.88510774s" podCreationTimestamp="2026-03-18 12:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:52.882937614 +0000 UTC m=+218.597338193" watchObservedRunningTime="2026-03-18 12:13:52.88510774 +0000 UTC m=+218.599508319" Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.966975 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Mar 18 12:13:52 crc kubenswrapper[4975]: E0318 12:13:52.967266 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:53.467233309 +0000 UTC m=+219.181633888 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:52 crc kubenswrapper[4975]: I0318 12:13:52.967453 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:52 crc kubenswrapper[4975]: E0318 12:13:52.967839 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:53.467830174 +0000 UTC m=+219.182230753 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.049065 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" podStartSLOduration=170.049033639 podStartE2EDuration="2m50.049033639s" podCreationTimestamp="2026-03-18 12:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:53.0471507 +0000 UTC m=+218.761551279" watchObservedRunningTime="2026-03-18 12:13:53.049033639 +0000 UTC m=+218.763434218" Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.049742 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-6hg9j" podStartSLOduration=169.049737557 podStartE2EDuration="2m49.049737557s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:53.005540167 +0000 UTC m=+218.719940746" watchObservedRunningTime="2026-03-18 12:13:53.049737557 +0000 UTC m=+218.764138136" Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.069147 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:53 crc kubenswrapper[4975]: E0318 12:13:53.069417 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:53.569399189 +0000 UTC m=+219.283799768 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.091580 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lt2wh" podStartSLOduration=169.091553356 podStartE2EDuration="2m49.091553356s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:53.090496319 +0000 UTC m=+218.804896918" watchObservedRunningTime="2026-03-18 12:13:53.091553356 +0000 UTC m=+218.805953935" Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.170540 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:53 crc 
kubenswrapper[4975]: E0318 12:13:53.170921 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:53.670904543 +0000 UTC m=+219.385305122 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.272734 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:53 crc kubenswrapper[4975]: E0318 12:13:53.272977 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:53.77294761 +0000 UTC m=+219.487348199 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.278825 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6sclc" event={"ID":"e7df357a-6928-4618-b6fb-6a99bb306668","Type":"ContainerStarted","Data":"43a442cb27f93655f0a04b861b1b959ea354fb649278cfe573c1662a43238367"} Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.290433 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-ww7rx" event={"ID":"5ac220a8-d5aa-4352-b2ce-e2d911aee948","Type":"ContainerStarted","Data":"d847c48479a734ac54effd694a5e342df303f18bbdd08a2a6761f1c579b9ccb5"} Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.292282 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ln9d6" event={"ID":"907a3641-9861-4891-a145-a0d36cb413b3","Type":"ContainerStarted","Data":"b49e4e4f5f71fdac56db4a4c5e5d58f104102b30a1cec750d377d8c4876eba15"} Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.293597 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lzvfs" event={"ID":"4d06a6f7-77a4-437d-8e9e-f1e9b7252a42","Type":"ContainerStarted","Data":"a16da5ad6de7085498ff0b427df9f36b4303b1bd1c92fd649c120a569e3e8828"} Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.294978 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r26rl" 
event={"ID":"f59058bc-4678-4c59-b93b-d9af75ff6a7a","Type":"ContainerStarted","Data":"33d7c92895179097a2dccd6ef943e92dadaaecd784fb8be95ae2cc3511651b4a"} Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.296716 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkb4h" event={"ID":"a54deeb5-bea0-4f51-aa4e-07df30bbf228","Type":"ContainerStarted","Data":"93ff3398ceaa2c36d23490aeaf83e5a0e98a90e2e94c822e47899533279c08ee"} Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.297646 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-dn59g" event={"ID":"5b28d367-94a7-4f45-ba2f-86110b4e6a2e","Type":"ContainerStarted","Data":"401a16969daf96068a77d0f9db473f0b4b68523c0aaf9d8a0b018a14a674fb3d"} Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.299512 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-v7w9z" event={"ID":"bb035988-397f-4da3-bc49-d38089014453","Type":"ContainerStarted","Data":"77f71407904a4a543502dc7d28fc67db37f6b14d19507ddbc1f135c50133d13e"} Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.301088 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4zf9" event={"ID":"c35d61c3-defe-489a-96f3-c649240e6f9f","Type":"ContainerStarted","Data":"d7d22ce11cfdd9e5d2b69b74e5d4fe80abafc496c010d8f4403508a2abeffdab"} Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.302025 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-cxbbs" event={"ID":"5669ff9b-1b24-44d8-a86d-963170a76dee","Type":"ContainerStarted","Data":"25a1b9f25df003523ecf1282c4eb5355d5e49374200f35f11e50e4be84b3ae19"} Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.303297 4975 generic.go:334] "Generic (PLEG): container finished" 
podID="bbf4b50a-de39-4b86-b0ca-883ba11d6e4b" containerID="30a5b2d5b24d2e11631768a43709a603c366727cc4437e1fafd063ec56915412" exitCode=0 Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.303342 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-m27h4" event={"ID":"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b","Type":"ContainerDied","Data":"30a5b2d5b24d2e11631768a43709a603c366727cc4437e1fafd063ec56915412"} Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.304399 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9cnqk" event={"ID":"b3ffcf02-5ead-4e06-b402-9b48c21f2d36","Type":"ContainerStarted","Data":"0eb0dbb6189f43acc6d15d082618de6bdb50ea53a3a18dc50068972c85cea988"} Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.305190 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lxtks" event={"ID":"6fea1263-b938-4c34-a279-6e5391b768bf","Type":"ContainerStarted","Data":"06fe0d8d4c489e39e297e9837e10c5b849e7e68dac4ca418b51dd0c1d503fc52"} Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.306073 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-xth7t" event={"ID":"363092a3-1f52-4b92-8369-8aecab622c7e","Type":"ContainerStarted","Data":"452ecfda58a6c98d77a75515b2359c71f2cbf906e619b1836582116c9b2fc371"} Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.306944 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k27ll" event={"ID":"c406d38a-d0b9-4f55-9883-78805191b8b3","Type":"ContainerStarted","Data":"0fb00f33ad5775095c4fbb4b3a644b408a13c87aca145a84c210df4c0b6b9184"} Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.307741 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-p22bn" event={"ID":"79bb26ca-fc97-4b59-ab56-12637c684208","Type":"ContainerStarted","Data":"d25991203151fa711ede48fda6284c3fc920b7370dad7e942258d28cf002800a"} Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.308628 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n2zjt" event={"ID":"7c19d2ab-a061-40f1-99dd-d200cc62ba4d","Type":"ContainerStarted","Data":"93b3b7a80ab35e5d639a00f3e25f9e003ed1745a58507bd4b2b86e6a83c38120"} Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.309507 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nlw8k" event={"ID":"fa1ca458-2bd9-4722-9895-08f1744e3cfd","Type":"ContainerStarted","Data":"eb166dcec7246781ee4a45ec7108661eddeda6f8280bc5d0f570f640c2e15a1f"} Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.317669 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dtv2h" event={"ID":"63db2c71-f81f-4ae8-90b0-0d45e0593119","Type":"ContainerStarted","Data":"3e9fd4e2bd43c076df97b49fcf1e87986040f303aaf69ab4689843e49ffc6b23"} Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.321619 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l2jhn" event={"ID":"1d1be24c-eb6f-4df4-812d-491ea940ee60","Type":"ContainerStarted","Data":"f32739c4e22ba15c3c1ff76eb941194165a93a1843cd559837fc7e81732a104e"} Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.321817 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l2jhn" Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.325037 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6p52g" event={"ID":"a7793c57-4ff7-48a0-8913-46537e9ec353","Type":"ContainerStarted","Data":"3b007a5480dbf751dda25a8c5911ac4a4ea37593e57a45077df3d57792984bb9"} Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.330242 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xrd7l" event={"ID":"318ad1a7-abe7-4e8d-bf62-cec22711b081","Type":"ContainerStarted","Data":"df72cb899458e8d32798348c1adaa657b34c39f6b45185436c3d50093fcef412"} Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.330333 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-5bf2w" Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.330466 4975 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-mg2g2 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" start-of-body= Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.330500 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" podUID="14d667ef-8c80-42f5-b119-1bae87e39be7" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.330940 4975 patch_prober.go:28] interesting pod/downloads-7954f5f757-d4zht container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.330997 4975 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-d4zht" podUID="36e42dcf-4953-46b4-8459-e2e72e03895c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.334003 4975 patch_prober.go:28] interesting pod/console-operator-58897d9998-5bf2w container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.334078 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-5bf2w" podUID="865e4345-8c11-4402-b673-93658fe66ced" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.384198 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.393649 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xrd7l" podStartSLOduration=170.393626703 podStartE2EDuration="2m50.393626703s" podCreationTimestamp="2026-03-18 12:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:53.354962086 +0000 UTC m=+219.069362665" 
watchObservedRunningTime="2026-03-18 12:13:53.393626703 +0000 UTC m=+219.108027282" Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.399230 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l2jhn" podStartSLOduration=170.399173327 podStartE2EDuration="2m50.399173327s" podCreationTimestamp="2026-03-18 12:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:53.341338441 +0000 UTC m=+219.055739030" watchObservedRunningTime="2026-03-18 12:13:53.399173327 +0000 UTC m=+219.113573906" Mar 18 12:13:53 crc kubenswrapper[4975]: E0318 12:13:53.399275 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:53.89925409 +0000 UTC m=+219.613654869 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.405801 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-5bf2w" podStartSLOduration=170.40543034 podStartE2EDuration="2m50.40543034s" podCreationTimestamp="2026-03-18 12:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:53.383858689 +0000 UTC m=+219.098259268" watchObservedRunningTime="2026-03-18 12:13:53.40543034 +0000 UTC m=+219.119830929" Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.485645 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:53 crc kubenswrapper[4975]: E0318 12:13:53.485930 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:53.985899606 +0000 UTC m=+219.700300185 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.486173 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:53 crc kubenswrapper[4975]: E0318 12:13:53.486596 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:53.986577444 +0000 UTC m=+219.700978023 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.591570 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:53 crc kubenswrapper[4975]: E0318 12:13:53.591811 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:54.091778963 +0000 UTC m=+219.806179552 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.592141 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:53 crc kubenswrapper[4975]: E0318 12:13:53.592626 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:54.092609775 +0000 UTC m=+219.807010354 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.693164 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:53 crc kubenswrapper[4975]: E0318 12:13:53.693634 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:54.193614925 +0000 UTC m=+219.908015504 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.794795 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:53 crc kubenswrapper[4975]: E0318 12:13:53.795256 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:54.295231422 +0000 UTC m=+220.009632041 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.895660 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:53 crc kubenswrapper[4975]: E0318 12:13:53.895903 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:54.395858642 +0000 UTC m=+220.110259221 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:53 crc kubenswrapper[4975]: I0318 12:13:53.996879 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:53 crc kubenswrapper[4975]: E0318 12:13:53.997275 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:54.497246942 +0000 UTC m=+220.211647521 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.097663 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:54 crc kubenswrapper[4975]: E0318 12:13:54.098191 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:54.598176281 +0000 UTC m=+220.312576860 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.226132 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:54 crc kubenswrapper[4975]: E0318 12:13:54.226620 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:54.726608666 +0000 UTC m=+220.441009245 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.327608 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:54 crc kubenswrapper[4975]: E0318 12:13:54.328058 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:54.828042747 +0000 UTC m=+220.542443316 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.334796 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-knn48" event={"ID":"25214ad5-3dea-44fe-8dfc-75b877582f7e","Type":"ContainerStarted","Data":"b9ed89dd3280d40a1442f126f6e111b8890e21aca96d4dfe12f3f94858bd3802"} Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.335852 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k27ll" event={"ID":"c406d38a-d0b9-4f55-9883-78805191b8b3","Type":"ContainerStarted","Data":"8e39017d6377c2f886bdc789529ca6385da0cfe7dcf3aa42583311069c764a04"} Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.339021 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wrn8m" event={"ID":"bb814c1a-efa2-4653-8adf-c4d9f5c2e7dc","Type":"ContainerStarted","Data":"808b3b387b0aa1fd57996efa7e8812ed78b74336d72cca450fbcbca284361e58"} Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.341131 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7cqtt" event={"ID":"3e64e7c1-1812-46e6-bb95-f4d54d0d98f2","Type":"ContainerStarted","Data":"519b204447f255f4a958f5657c7a4c6388af32ddd96a7255e3222db41f31262d"} Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.342283 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-cxbbs" event={"ID":"5669ff9b-1b24-44d8-a86d-963170a76dee","Type":"ContainerStarted","Data":"0fcf09cebb5b6510cd0d17a76ce300954aeafe4415356aeff2013b9dad72d910"} Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.343572 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-xth7t" event={"ID":"363092a3-1f52-4b92-8369-8aecab622c7e","Type":"ContainerStarted","Data":"8721bec93073e3457e1f557de024be28517c40fa00db0b15884ace8a00bc8c73"} Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.344823 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-ww7rx" event={"ID":"5ac220a8-d5aa-4352-b2ce-e2d911aee948","Type":"ContainerStarted","Data":"354a1f28abeb2a3d5247e8122cde56187dbf9fed7c8fda76187fce5c30738c39"} Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.345988 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-p22bn" event={"ID":"79bb26ca-fc97-4b59-ab56-12637c684208","Type":"ContainerStarted","Data":"c71aa7bef889bc62ef542514ceaf515d7181e991e89e92e6d7a87de6dde539d6"} Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.347663 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-clfc4" event={"ID":"03aa4bc1-7712-418f-b56d-9686e85ba1d2","Type":"ContainerStarted","Data":"613b20331c8508a2d3d2de93da87ad58d55aeac11930d7726dd9d95ce760ada4"} Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.348953 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r26rl" event={"ID":"f59058bc-4678-4c59-b93b-d9af75ff6a7a","Type":"ContainerStarted","Data":"e5d6136e777fcd822b472e7bb2eb702807bd0c916290c3c67c21635a5f41076e"} Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.350419 4975 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6p52g" event={"ID":"a7793c57-4ff7-48a0-8913-46537e9ec353","Type":"ContainerStarted","Data":"a7a6de1f35da465c9fd8bcc56f929a1d235a1f05282be8997fd15f01e34c6feb"} Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.351780 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-dn59g" event={"ID":"5b28d367-94a7-4f45-ba2f-86110b4e6a2e","Type":"ContainerStarted","Data":"d12f89cf7b05718d8dfbc868f9a4c349faa8e27da1a5261d5ff0149433b40f64"} Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.353066 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nlw8k" event={"ID":"fa1ca458-2bd9-4722-9895-08f1744e3cfd","Type":"ContainerStarted","Data":"6a9b85e0f9fa3466a9b2e3652f9d19a68e4572c514fb93dd48e8b13f222cdcdb"} Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.354203 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4zf9" event={"ID":"c35d61c3-defe-489a-96f3-c649240e6f9f","Type":"ContainerStarted","Data":"f93692a305f008a0bd04183e9ac1ca8d6bcb0ebff22b1bf598062c424ac38626"} Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.355391 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n2zjt" event={"ID":"7c19d2ab-a061-40f1-99dd-d200cc62ba4d","Type":"ContainerStarted","Data":"91e1ed82bf4d2cbbebbcecaff42233b53e5d3df59283eb752f011e54e0c56f1d"} Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.356371 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" 
event={"ID":"d989095d-7ce2-4dd7-ac9e-5c747e900a61","Type":"ContainerStarted","Data":"3f2f31a4621d02dcf84343d3adb32877b55252d630db0a176c7cdfad689d74a1"} Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.358611 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-78ldn" event={"ID":"f4e1534c-f64b-448a-9b81-7c4192c089f3","Type":"ContainerStarted","Data":"00c1afa4339674c9480be32a6a444075bc71a584774b37548ab446912a433f32"} Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.361407 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dtv2h" event={"ID":"63db2c71-f81f-4ae8-90b0-0d45e0593119","Type":"ContainerStarted","Data":"5031c1dd1ea695f377fd4655bc70c3f38869355929f374df97de17de66df3491"} Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.363658 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" event={"ID":"eabde416-404d-4874-b690-53897068b5cd","Type":"ContainerStarted","Data":"603e7ac897171f51380baa3a370e4cc30caa95bb2f449514149744ddd537b0d0"} Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.365000 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jbhj2" event={"ID":"84b8a568-0fdf-42f7-ba14-f917320d7505","Type":"ContainerStarted","Data":"1be92d7f5979d7c29683b9d5e2bfa31b6afe1e3b5a7389f60eb4eb695b882e00"} Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.366232 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6sclc" event={"ID":"e7df357a-6928-4618-b6fb-6a99bb306668","Type":"ContainerStarted","Data":"700c6c4ea1672a4c8ea4f4b02e852b5c00c5e3bebdc6a22a6fbad877fc1f5c51"} Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.367727 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns-operator/dns-operator-744455d44c-qhjb2" event={"ID":"1a468175-a610-4d65-8ca9-a22f91d8d3fc","Type":"ContainerStarted","Data":"8e276678e6b854d3b2fb36b3b1f4af013a4273a1b5b55dd8a54476111f81cf14"} Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.369006 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lzvfs" event={"ID":"4d06a6f7-77a4-437d-8e9e-f1e9b7252a42","Type":"ContainerStarted","Data":"eb52aa0708dc17cd4d3aeb4fa26c7a9304dbd2ec14924fab935378ddb06a3fe2"} Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.370223 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkb4h" event={"ID":"a54deeb5-bea0-4f51-aa4e-07df30bbf228","Type":"ContainerStarted","Data":"6ba564770428674e2e4d01bdda870c74303b8acc7dfe784a2ef1bf2cb8b3dd12"} Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.371262 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lxtks" event={"ID":"6fea1263-b938-4c34-a279-6e5391b768bf","Type":"ContainerStarted","Data":"f5751c3281c4a039b3bbc3bf7405a9e3df8a2eb2dd47106c6bddf3d61f7db3ae"} Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.372980 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ws7cz" event={"ID":"191ec755-6e3c-4fba-8b70-b81d3e414b17","Type":"ContainerStarted","Data":"b0b4cb9446e94a810fecccdf0eea5c4b19b98d9305151e47aeedfcc7a65893fa"} Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.375633 4975 patch_prober.go:28] interesting pod/console-operator-58897d9998-5bf2w container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.375667 4975 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-5bf2w" podUID="865e4345-8c11-4402-b673-93658fe66ced" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.376707 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-78ldn" podStartSLOduration=170.376692064 podStartE2EDuration="2m50.376692064s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:54.376318854 +0000 UTC m=+220.090719443" watchObservedRunningTime="2026-03-18 12:13:54.376692064 +0000 UTC m=+220.091092643" Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.387747 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ln9d6" podStartSLOduration=170.387728052 podStartE2EDuration="2m50.387728052s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:54.387087505 +0000 UTC m=+220.101488094" watchObservedRunningTime="2026-03-18 12:13:54.387728052 +0000 UTC m=+220.102128651" Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.431419 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:54 crc 
kubenswrapper[4975]: E0318 12:13:54.436067 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:54.936048 +0000 UTC m=+220.650448579 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.532466 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:54 crc kubenswrapper[4975]: E0318 12:13:54.532648 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:55.032622365 +0000 UTC m=+220.747022944 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.532757 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:54 crc kubenswrapper[4975]: E0318 12:13:54.533071 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:55.033056456 +0000 UTC m=+220.747457065 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.633718 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:54 crc kubenswrapper[4975]: E0318 12:13:54.633911 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:55.133894812 +0000 UTC m=+220.848295391 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.744136 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:54 crc kubenswrapper[4975]: E0318 12:13:54.744798 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:55.24478391 +0000 UTC m=+220.959184489 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.845556 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:54 crc kubenswrapper[4975]: E0318 12:13:54.846405 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:55.346368545 +0000 UTC m=+221.060769134 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.846537 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:54 crc kubenswrapper[4975]: E0318 12:13:54.847316 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:55.34730435 +0000 UTC m=+221.061704929 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:54 crc kubenswrapper[4975]: I0318 12:13:54.948072 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:54 crc kubenswrapper[4975]: E0318 12:13:54.948457 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:55.448441403 +0000 UTC m=+221.162841982 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.049184 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:55 crc kubenswrapper[4975]: E0318 12:13:55.049531 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:55.549518105 +0000 UTC m=+221.263918684 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.103120 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-78ldn" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.103554 4975 patch_prober.go:28] interesting pod/router-default-5444994796-78ldn container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.103590 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-78ldn" podUID="f4e1534c-f64b-448a-9b81-7c4192c089f3" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.150372 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:55 crc kubenswrapper[4975]: E0318 12:13:55.150579 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:55.650530865 +0000 UTC m=+221.364931454 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.150775 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:55 crc kubenswrapper[4975]: E0318 12:13:55.151135 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:55.651118931 +0000 UTC m=+221.365519510 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.251498 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:55 crc kubenswrapper[4975]: E0318 12:13:55.251620 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:55.751595887 +0000 UTC m=+221.465996476 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.252010 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:55 crc kubenswrapper[4975]: E0318 12:13:55.252335 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:55.752323976 +0000 UTC m=+221.466724555 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.353624 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:55 crc kubenswrapper[4975]: E0318 12:13:55.353843 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:55.853812239 +0000 UTC m=+221.568212828 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.353934 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:55 crc kubenswrapper[4975]: E0318 12:13:55.354561 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:55.854551868 +0000 UTC m=+221.568952447 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.377487 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jbhj2" event={"ID":"84b8a568-0fdf-42f7-ba14-f917320d7505","Type":"ContainerStarted","Data":"d0917dd91656f50fe9c88d7165a3242111c6ef07527bc6b9037be34ccc931bd3"} Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.382131 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9cnqk" event={"ID":"b3ffcf02-5ead-4e06-b402-9b48c21f2d36","Type":"ContainerStarted","Data":"45569fbe8b439a04bdcb6a11fe278f0fdd28446798ab91a64020f73e21b3afb8"} Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.382177 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r26rl" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.383912 4975 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-r26rl container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body= Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.383953 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r26rl" podUID="f59058bc-4678-4c59-b93b-d9af75ff6a7a" containerName="packageserver" 
probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.384023 4975 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-q2bgt container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.384508 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.384547 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lxtks" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.384704 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" podUID="d989095d-7ce2-4dd7-ac9e-5c747e900a61" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.390188 4975 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-lxtks container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.390243 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lxtks" podUID="6fea1263-b938-4c34-a279-6e5391b768bf" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": 
dial tcp 10.217.0.36:8443: connect: connection refused" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.400535 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-lzvfs" podStartSLOduration=7.400518016 podStartE2EDuration="7.400518016s" podCreationTimestamp="2026-03-18 12:13:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:55.398425841 +0000 UTC m=+221.112826420" watchObservedRunningTime="2026-03-18 12:13:55.400518016 +0000 UTC m=+221.114918595" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.421199 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k27ll" podStartSLOduration=171.421183914 podStartE2EDuration="2m51.421183914s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:55.419768007 +0000 UTC m=+221.134168596" watchObservedRunningTime="2026-03-18 12:13:55.421183914 +0000 UTC m=+221.135584493" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.466826 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:55 crc kubenswrapper[4975]: E0318 12:13:55.467005 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 12:13:55.966974376 +0000 UTC m=+221.681374975 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.469170 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.469721 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lxtks" podStartSLOduration=171.469706917 podStartE2EDuration="2m51.469706917s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:55.469131552 +0000 UTC m=+221.183532131" watchObservedRunningTime="2026-03-18 12:13:55.469706917 +0000 UTC m=+221.184107506" Mar 18 12:13:55 crc kubenswrapper[4975]: E0318 12:13:55.472236 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:55.972221323 +0000 UTC m=+221.686621962 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.493744 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-clfc4" podStartSLOduration=171.493727933 podStartE2EDuration="2m51.493727933s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:55.493258921 +0000 UTC m=+221.207659500" watchObservedRunningTime="2026-03-18 12:13:55.493727933 +0000 UTC m=+221.208128522" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.539308 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.539517 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.545451 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6p52g" podStartSLOduration=171.545431009 podStartE2EDuration="2m51.545431009s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:55.544209557 +0000 UTC m=+221.258610136" watchObservedRunningTime="2026-03-18 12:13:55.545431009 +0000 UTC m=+221.259831588" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.545575 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-7cqtt" podStartSLOduration=172.545570173 podStartE2EDuration="2m52.545570173s" podCreationTimestamp="2026-03-18 12:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:55.521353732 +0000 UTC m=+221.235754311" watchObservedRunningTime="2026-03-18 12:13:55.545570173 +0000 UTC m=+221.259970752" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.562680 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-ww7rx" podStartSLOduration=7.5626438369999995 podStartE2EDuration="7.562643837s" podCreationTimestamp="2026-03-18 12:13:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:55.561554029 +0000 UTC m=+221.275954618" watchObservedRunningTime="2026-03-18 12:13:55.562643837 +0000 UTC m=+221.277044416" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.580072 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:55 crc kubenswrapper[4975]: E0318 12:13:55.580385 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:56.080349979 +0000 UTC m=+221.794750558 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.580573 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:55 crc kubenswrapper[4975]: E0318 12:13:55.581214 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:56.081202491 +0000 UTC m=+221.795603140 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.591443 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" podStartSLOduration=171.591423547 podStartE2EDuration="2m51.591423547s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:55.590992516 +0000 UTC m=+221.305393115" watchObservedRunningTime="2026-03-18 12:13:55.591423547 +0000 UTC m=+221.305824126" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.636905 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" podStartSLOduration=171.636883781 podStartE2EDuration="2m51.636883781s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:55.635400112 +0000 UTC m=+221.349800701" watchObservedRunningTime="2026-03-18 12:13:55.636883781 +0000 UTC m=+221.351284360" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.637346 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-qhjb2" podStartSLOduration=171.637339393 podStartE2EDuration="2m51.637339393s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:55.611193672 +0000 UTC m=+221.325594261" watchObservedRunningTime="2026-03-18 12:13:55.637339393 +0000 UTC m=+221.351739972" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.651614 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ws7cz" podStartSLOduration=171.651596074 podStartE2EDuration="2m51.651596074s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:55.650136786 +0000 UTC m=+221.364537365" watchObservedRunningTime="2026-03-18 12:13:55.651596074 +0000 UTC m=+221.365996653" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.668649 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r26rl" podStartSLOduration=171.668631058 podStartE2EDuration="2m51.668631058s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:55.66756784 +0000 UTC m=+221.381968419" watchObservedRunningTime="2026-03-18 12:13:55.668631058 +0000 UTC m=+221.383031637" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.692198 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:55 crc kubenswrapper[4975]: E0318 12:13:55.692816 4975 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:56.192795737 +0000 UTC m=+221.907196316 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.796719 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:55 crc kubenswrapper[4975]: E0318 12:13:55.797304 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:56.297288438 +0000 UTC m=+222.011689027 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.809718 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-xth7t" podStartSLOduration=171.809696761 podStartE2EDuration="2m51.809696761s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:55.739076562 +0000 UTC m=+221.453477141" watchObservedRunningTime="2026-03-18 12:13:55.809696761 +0000 UTC m=+221.524097340" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.829744 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bkb4h" podStartSLOduration=171.829724283 podStartE2EDuration="2m51.829724283s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:55.82733118 +0000 UTC m=+221.541731779" watchObservedRunningTime="2026-03-18 12:13:55.829724283 +0000 UTC m=+221.544124862" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.830759 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9cnqk" podStartSLOduration=171.830749139 podStartE2EDuration="2m51.830749139s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:55.810184324 +0000 UTC m=+221.524584913" watchObservedRunningTime="2026-03-18 12:13:55.830749139 +0000 UTC m=+221.545149719" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.887100 4975 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-l2jhn container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.887155 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l2jhn" podUID="1d1be24c-eb6f-4df4-812d-491ea940ee60" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.887175 4975 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-l2jhn container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.887238 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l2jhn" podUID="1d1be24c-eb6f-4df4-812d-491ea940ee60" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.898200 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:55 crc kubenswrapper[4975]: E0318 12:13:55.898560 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:56.398544715 +0000 UTC m=+222.112945294 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.900793 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-n2zjt" podStartSLOduration=171.900775093 podStartE2EDuration="2m51.900775093s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:55.898939305 +0000 UTC m=+221.613339904" watchObservedRunningTime="2026-03-18 12:13:55.900775093 +0000 UTC m=+221.615175662" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.902284 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wrn8m" podStartSLOduration=171.902272732 
podStartE2EDuration="2m51.902272732s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:55.883180845 +0000 UTC m=+221.597581424" watchObservedRunningTime="2026-03-18 12:13:55.902272732 +0000 UTC m=+221.616673311" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.918134 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-dn59g" podStartSLOduration=171.918110585 podStartE2EDuration="2m51.918110585s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:55.91717747 +0000 UTC m=+221.631578049" watchObservedRunningTime="2026-03-18 12:13:55.918110585 +0000 UTC m=+221.632511174" Mar 18 12:13:55 crc kubenswrapper[4975]: I0318 12:13:55.952254 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-p22bn" podStartSLOduration=172.952236773 podStartE2EDuration="2m52.952236773s" podCreationTimestamp="2026-03-18 12:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:55.95136342 +0000 UTC m=+221.665763999" watchObservedRunningTime="2026-03-18 12:13:55.952236773 +0000 UTC m=+221.666637352" Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.000635 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" 
Mar 18 12:13:56 crc kubenswrapper[4975]: E0318 12:13:56.001092 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:56.501077265 +0000 UTC m=+222.215477844 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.076841 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.101372 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:56 crc kubenswrapper[4975]: E0318 12:13:56.102012 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:56.601982443 +0000 UTC m=+222.316383032 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.105138 4975 patch_prober.go:28] interesting pod/router-default-5444994796-78ldn container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.105202 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-78ldn" podUID="f4e1534c-f64b-448a-9b81-7c4192c089f3" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.205418 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:56 crc kubenswrapper[4975]: E0318 12:13:56.205794 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:56.705778786 +0000 UTC m=+222.420179365 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.306613 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:56 crc kubenswrapper[4975]: E0318 12:13:56.307116 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:56.807083304 +0000 UTC m=+222.521483893 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.307208 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:56 crc kubenswrapper[4975]: E0318 12:13:56.307523 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:56.807511315 +0000 UTC m=+222.521911894 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.400817 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jbhj2" event={"ID":"84b8a568-0fdf-42f7-ba14-f917320d7505","Type":"ContainerStarted","Data":"40ef1c66dbf6fb25d35e255f5c53b736e2328bb47025b32f7b212d5270002943"} Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.408217 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:56 crc kubenswrapper[4975]: E0318 12:13:56.408790 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:56.908776652 +0000 UTC m=+222.623177221 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.411223 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6sclc" event={"ID":"e7df357a-6928-4618-b6fb-6a99bb306668","Type":"ContainerStarted","Data":"38ec21124d761649d5b5e40979c20b23ca297501f51a688bdc95a9675b5cc97b"} Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.411907 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6sclc" Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.414513 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-knn48" event={"ID":"25214ad5-3dea-44fe-8dfc-75b877582f7e","Type":"ContainerStarted","Data":"bfecfdf6ff7813a84f430aa640be2da2c309d846ff331b610517079fff0854e2"} Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.426628 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jbhj2" podStartSLOduration=173.426612077 podStartE2EDuration="2m53.426612077s" podCreationTimestamp="2026-03-18 12:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:56.425025726 +0000 UTC m=+222.139426325" watchObservedRunningTime="2026-03-18 12:13:56.426612077 +0000 UTC m=+222.141012656" Mar 18 12:13:56 crc 
kubenswrapper[4975]: I0318 12:13:56.428767 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4zf9" event={"ID":"c35d61c3-defe-489a-96f3-c649240e6f9f","Type":"ContainerStarted","Data":"47cba88bce4aacc172b16b9211ea9dbc72536d8f6199971096abd5f0b5bf3f1d"} Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.436610 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-cxbbs" event={"ID":"5669ff9b-1b24-44d8-a86d-963170a76dee","Type":"ContainerStarted","Data":"c5a2afa67f20212730a0570d5d93fcf1729ea7c5a715dd77fb3b555efed2c117"} Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.440418 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-m27h4" event={"ID":"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b","Type":"ContainerStarted","Data":"d5bdc179cfb93812d461142d04d0cc75da58ec8384a117634ce73965af46ba35"} Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.453626 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nlw8k" event={"ID":"fa1ca458-2bd9-4722-9895-08f1744e3cfd","Type":"ContainerStarted","Data":"9b2e040fb8fb809d3d80a654d12f0300d6fba46b10fa2c99ab6595d3b202652e"} Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.454199 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-nlw8k" Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.457792 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-knn48" podStartSLOduration=172.457769528 podStartE2EDuration="2m52.457769528s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:56.453706372 +0000 UTC m=+222.168106951" 
watchObservedRunningTime="2026-03-18 12:13:56.457769528 +0000 UTC m=+222.172170107" Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.461118 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dtv2h" event={"ID":"63db2c71-f81f-4ae8-90b0-0d45e0593119","Type":"ContainerStarted","Data":"44233f87e95857408c9ac42ce5d40f1171a71c4123a33f770c4e2c699f5a002d"} Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.461856 4975 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-r26rl container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body= Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.461908 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r26rl" podUID="f59058bc-4678-4c59-b93b-d9af75ff6a7a" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.462175 4975 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-lxtks container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.462190 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lxtks" podUID="6fea1263-b938-4c34-a279-6e5391b768bf" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 
12:13:56.462358 4975 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-q2bgt container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.462371 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" podUID="d989095d-7ce2-4dd7-ac9e-5c747e900a61" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.510792 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:56 crc kubenswrapper[4975]: E0318 12:13:56.511257 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:57.011239451 +0000 UTC m=+222.725640030 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.567672 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b4zf9" podStartSLOduration=172.5676453 podStartE2EDuration="2m52.5676453s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:56.505751318 +0000 UTC m=+222.220151907" watchObservedRunningTime="2026-03-18 12:13:56.5676453 +0000 UTC m=+222.282045879" Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.569062 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6sclc" podStartSLOduration=172.569051886 podStartE2EDuration="2m52.569051886s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:56.485263324 +0000 UTC m=+222.199663903" watchObservedRunningTime="2026-03-18 12:13:56.569051886 +0000 UTC m=+222.283452475" Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.587553 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-cxbbs" podStartSLOduration=172.587532067 podStartE2EDuration="2m52.587532067s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:56.580960116 +0000 UTC m=+222.295360695" watchObservedRunningTime="2026-03-18 12:13:56.587532067 +0000 UTC m=+222.301932646" Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.647433 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:56 crc kubenswrapper[4975]: E0318 12:13:56.648944 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:57.148922986 +0000 UTC m=+222.863323565 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.652579 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dtv2h" podStartSLOduration=172.652567531 podStartE2EDuration="2m52.652567531s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:56.651796621 +0000 UTC m=+222.366197200" watchObservedRunningTime="2026-03-18 12:13:56.652567531 +0000 UTC m=+222.366968110" Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.690621 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-nlw8k" podStartSLOduration=8.690603552 podStartE2EDuration="8.690603552s" podCreationTimestamp="2026-03-18 12:13:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:56.690361715 +0000 UTC m=+222.404762304" watchObservedRunningTime="2026-03-18 12:13:56.690603552 +0000 UTC m=+222.405004131" Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.786605 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:56 crc kubenswrapper[4975]: E0318 12:13:56.787100 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:57.287085754 +0000 UTC m=+223.001486333 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:56 crc kubenswrapper[4975]: I0318 12:13:56.910113 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:56 crc kubenswrapper[4975]: E0318 12:13:56.910802 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:57.410785475 +0000 UTC m=+223.125186054 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:57 crc kubenswrapper[4975]: I0318 12:13:57.017617 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:57 crc kubenswrapper[4975]: E0318 12:13:57.018303 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:57.518286275 +0000 UTC m=+223.232686854 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:57 crc kubenswrapper[4975]: I0318 12:13:57.060304 4975 ???:1] "http: TLS handshake error from 192.168.126.11:35200: no serving certificate available for the kubelet" Mar 18 12:13:57 crc kubenswrapper[4975]: I0318 12:13:57.137677 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:57 crc kubenswrapper[4975]: E0318 12:13:57.138104 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:57.638087475 +0000 UTC m=+223.352488054 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:57 crc kubenswrapper[4975]: I0318 12:13:57.139722 4975 patch_prober.go:28] interesting pod/router-default-5444994796-78ldn container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 18 12:13:57 crc kubenswrapper[4975]: I0318 12:13:57.139777 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-78ldn" podUID="f4e1534c-f64b-448a-9b81-7c4192c089f3" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 18 12:13:57 crc kubenswrapper[4975]: I0318 12:13:57.255405 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:57 crc kubenswrapper[4975]: E0318 12:13:57.255738 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:57.755725688 +0000 UTC m=+223.470126267 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:57 crc kubenswrapper[4975]: I0318 12:13:57.261801 4975 ???:1] "http: TLS handshake error from 192.168.126.11:35216: no serving certificate available for the kubelet" Mar 18 12:13:57 crc kubenswrapper[4975]: I0318 12:13:57.356493 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:57 crc kubenswrapper[4975]: E0318 12:13:57.356940 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:57.856920504 +0000 UTC m=+223.571321083 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:57 crc kubenswrapper[4975]: I0318 12:13:57.379521 4975 ???:1] "http: TLS handshake error from 192.168.126.11:35222: no serving certificate available for the kubelet" Mar 18 12:13:57 crc kubenswrapper[4975]: I0318 12:13:57.458198 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:57 crc kubenswrapper[4975]: E0318 12:13:57.458555 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:57.95854378 +0000 UTC m=+223.672944359 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:57 crc kubenswrapper[4975]: I0318 12:13:57.475951 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-m27h4" event={"ID":"bbf4b50a-de39-4b86-b0ca-883ba11d6e4b","Type":"ContainerStarted","Data":"70622b2158a1852cf4c41339269493a3e886c9a86efd1013dda49ccb7094fe4d"} Mar 18 12:13:57 crc kubenswrapper[4975]: I0318 12:13:57.479080 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-v7w9z" event={"ID":"bb035988-397f-4da3-bc49-d38089014453","Type":"ContainerStarted","Data":"4ef2cb6fb054015a258ba7e5337d2980f981e85e1de216170c0547fc801a7418"} Mar 18 12:13:57 crc kubenswrapper[4975]: I0318 12:13:57.496942 4975 ???:1] "http: TLS handshake error from 192.168.126.11:35228: no serving certificate available for the kubelet" Mar 18 12:13:57 crc kubenswrapper[4975]: I0318 12:13:57.502282 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-m27h4" podStartSLOduration=174.502267149 podStartE2EDuration="2m54.502267149s" podCreationTimestamp="2026-03-18 12:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:13:57.498644254 +0000 UTC m=+223.213044823" watchObservedRunningTime="2026-03-18 12:13:57.502267149 +0000 UTC m=+223.216667728" Mar 18 12:13:57 crc kubenswrapper[4975]: I0318 12:13:57.558752 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:57 crc kubenswrapper[4975]: E0318 12:13:57.559782 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:58.059755396 +0000 UTC m=+223.774156065 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:57 crc kubenswrapper[4975]: I0318 12:13:57.610495 4975 ???:1] "http: TLS handshake error from 192.168.126.11:35236: no serving certificate available for the kubelet" Mar 18 12:13:57 crc kubenswrapper[4975]: I0318 12:13:57.660017 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:57 crc kubenswrapper[4975]: E0318 12:13:57.660417 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 12:13:58.160398637 +0000 UTC m=+223.874799286 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:57 crc kubenswrapper[4975]: I0318 12:13:57.756085 4975 ???:1] "http: TLS handshake error from 192.168.126.11:35246: no serving certificate available for the kubelet" Mar 18 12:13:57 crc kubenswrapper[4975]: I0318 12:13:57.761102 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:57 crc kubenswrapper[4975]: E0318 12:13:57.761312 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:58.261280154 +0000 UTC m=+223.975680753 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:57 crc kubenswrapper[4975]: I0318 12:13:57.761503 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:57 crc kubenswrapper[4975]: E0318 12:13:57.762121 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:58.262102085 +0000 UTC m=+223.976502734 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:57 crc kubenswrapper[4975]: I0318 12:13:57.862488 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:57 crc kubenswrapper[4975]: E0318 12:13:57.862689 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:58.362661384 +0000 UTC m=+224.077061963 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:57 crc kubenswrapper[4975]: I0318 12:13:57.878280 4975 ???:1] "http: TLS handshake error from 192.168.126.11:35254: no serving certificate available for the kubelet" Mar 18 12:13:57 crc kubenswrapper[4975]: I0318 12:13:57.963776 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:57 crc kubenswrapper[4975]: E0318 12:13:57.964267 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:58.46425227 +0000 UTC m=+224.178652859 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:58 crc kubenswrapper[4975]: I0318 12:13:58.064823 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:58 crc kubenswrapper[4975]: E0318 12:13:58.065208 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:58.565190618 +0000 UTC m=+224.279591197 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:58 crc kubenswrapper[4975]: I0318 12:13:58.104340 4975 patch_prober.go:28] interesting pod/router-default-5444994796-78ldn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:13:58 crc kubenswrapper[4975]: [-]has-synced failed: reason withheld Mar 18 12:13:58 crc kubenswrapper[4975]: [+]process-running ok Mar 18 12:13:58 crc kubenswrapper[4975]: healthz check failed Mar 18 12:13:58 crc kubenswrapper[4975]: I0318 12:13:58.104401 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-78ldn" podUID="f4e1534c-f64b-448a-9b81-7c4192c089f3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:13:58 crc kubenswrapper[4975]: I0318 12:13:58.166886 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:58 crc kubenswrapper[4975]: E0318 12:13:58.167252 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 12:13:58.667235746 +0000 UTC m=+224.381636335 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:58 crc kubenswrapper[4975]: I0318 12:13:58.171694 4975 ???:1] "http: TLS handshake error from 192.168.126.11:35266: no serving certificate available for the kubelet" Mar 18 12:13:58 crc kubenswrapper[4975]: I0318 12:13:58.267482 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:58 crc kubenswrapper[4975]: E0318 12:13:58.267683 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:58.767652941 +0000 UTC m=+224.482053540 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:58 crc kubenswrapper[4975]: I0318 12:13:58.377056 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:58 crc kubenswrapper[4975]: E0318 12:13:58.377406 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:58.877390148 +0000 UTC m=+224.591790717 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:58 crc kubenswrapper[4975]: I0318 12:13:58.481529 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:58 crc kubenswrapper[4975]: E0318 12:13:58.481657 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:58.981633343 +0000 UTC m=+224.696033922 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:58 crc kubenswrapper[4975]: I0318 12:13:58.481737 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:58 crc kubenswrapper[4975]: E0318 12:13:58.482260 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:58.982248389 +0000 UTC m=+224.696648958 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:58 crc kubenswrapper[4975]: I0318 12:13:58.582527 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:58 crc kubenswrapper[4975]: E0318 12:13:58.582723 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:59.082682725 +0000 UTC m=+224.797083304 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:58 crc kubenswrapper[4975]: I0318 12:13:58.582841 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:58 crc kubenswrapper[4975]: E0318 12:13:58.583509 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:59.083500586 +0000 UTC m=+224.797901165 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:58 crc kubenswrapper[4975]: I0318 12:13:58.683409 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:58 crc kubenswrapper[4975]: E0318 12:13:58.683591 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:59.183563231 +0000 UTC m=+224.897963800 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:58 crc kubenswrapper[4975]: I0318 12:13:58.683833 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:58 crc kubenswrapper[4975]: E0318 12:13:58.684181 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:59.184166156 +0000 UTC m=+224.898566805 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:58 crc kubenswrapper[4975]: I0318 12:13:58.784567 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:58 crc kubenswrapper[4975]: E0318 12:13:58.784702 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:59.284669554 +0000 UTC m=+224.999070153 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:58 crc kubenswrapper[4975]: I0318 12:13:58.785237 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:58 crc kubenswrapper[4975]: E0318 12:13:58.785689 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:59.28566711 +0000 UTC m=+225.000067729 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:58 crc kubenswrapper[4975]: I0318 12:13:58.866098 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-l2jhn" Mar 18 12:13:58 crc kubenswrapper[4975]: I0318 12:13:58.884370 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ftkrg"] Mar 18 12:13:58 crc kubenswrapper[4975]: I0318 12:13:58.884617 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" podUID="ac556a39-897a-44fe-b537-bdaf85c3f437" containerName="controller-manager" containerID="cri-o://20f646e3a520e0cb658ea89ed1721d9929eacf1e06f492bd0a93429bddb11ef2" gracePeriod=30 Mar 18 12:13:58 crc kubenswrapper[4975]: I0318 12:13:58.885914 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:58 crc kubenswrapper[4975]: E0318 12:13:58.886356 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 12:13:59.386330471 +0000 UTC m=+225.100731110 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:58 crc kubenswrapper[4975]: I0318 12:13:58.941608 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" Mar 18 12:13:58 crc kubenswrapper[4975]: I0318 12:13:58.946602 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5"] Mar 18 12:13:58 crc kubenswrapper[4975]: I0318 12:13:58.946880 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5" podUID="5c822ed4-6611-4cef-8002-972e9782d403" containerName="route-controller-manager" containerID="cri-o://9ec6ec8989ba1c47da0540b033117f1339a0d45fb29c489b13ef82e926556aa3" gracePeriod=30 Mar 18 12:13:58 crc kubenswrapper[4975]: I0318 12:13:58.955284 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5" Mar 18 12:13:58 crc kubenswrapper[4975]: I0318 12:13:58.957385 4975 ???:1] "http: TLS handshake error from 192.168.126.11:35274: no serving certificate available for the kubelet" Mar 18 12:13:58 crc kubenswrapper[4975]: I0318 12:13:58.988807 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:58 crc kubenswrapper[4975]: E0318 12:13:58.989295 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:59.489281372 +0000 UTC m=+225.203681951 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.089529 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:59 crc kubenswrapper[4975]: E0318 12:13:59.089880 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:59.589843711 +0000 UTC m=+225.304244290 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.110127 4975 patch_prober.go:28] interesting pod/router-default-5444994796-78ldn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:13:59 crc kubenswrapper[4975]: [-]has-synced failed: reason withheld Mar 18 12:13:59 crc kubenswrapper[4975]: [+]process-running ok Mar 18 12:13:59 crc kubenswrapper[4975]: healthz check failed Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.110189 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-78ldn" podUID="f4e1534c-f64b-448a-9b81-7c4192c089f3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.190635 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:59 crc kubenswrapper[4975]: E0318 12:13:59.190966 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 12:13:59.690953604 +0000 UTC m=+225.405354183 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.292019 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:59 crc kubenswrapper[4975]: E0318 12:13:59.292429 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:59.792409906 +0000 UTC m=+225.506810485 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.393218 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:59 crc kubenswrapper[4975]: E0318 12:13:59.393655 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:13:59.893641052 +0000 UTC m=+225.608041631 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.497432 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:59 crc kubenswrapper[4975]: E0318 12:13:59.497916 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:13:59.997896087 +0000 UTC m=+225.712296666 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.574290 4975 generic.go:334] "Generic (PLEG): container finished" podID="5c822ed4-6611-4cef-8002-972e9782d403" containerID="9ec6ec8989ba1c47da0540b033117f1339a0d45fb29c489b13ef82e926556aa3" exitCode=0 Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.574587 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5" event={"ID":"5c822ed4-6611-4cef-8002-972e9782d403","Type":"ContainerDied","Data":"9ec6ec8989ba1c47da0540b033117f1339a0d45fb29c489b13ef82e926556aa3"} Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.625519 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:13:59 crc kubenswrapper[4975]: E0318 12:13:59.625848 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:14:00.125835979 +0000 UTC m=+225.840236558 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.640046 4975 generic.go:334] "Generic (PLEG): container finished" podID="ac556a39-897a-44fe-b537-bdaf85c3f437" containerID="20f646e3a520e0cb658ea89ed1721d9929eacf1e06f492bd0a93429bddb11ef2" exitCode=0 Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.640102 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" event={"ID":"ac556a39-897a-44fe-b537-bdaf85c3f437","Type":"ContainerDied","Data":"20f646e3a520e0cb658ea89ed1721d9929eacf1e06f492bd0a93429bddb11ef2"} Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.650666 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5" Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.727336 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.727433 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c822ed4-6611-4cef-8002-972e9782d403-serving-cert\") pod \"5c822ed4-6611-4cef-8002-972e9782d403\" (UID: \"5c822ed4-6611-4cef-8002-972e9782d403\") " Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.727481 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grm7t\" (UniqueName: \"kubernetes.io/projected/5c822ed4-6611-4cef-8002-972e9782d403-kube-api-access-grm7t\") pod \"5c822ed4-6611-4cef-8002-972e9782d403\" (UID: \"5c822ed4-6611-4cef-8002-972e9782d403\") " Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.727512 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c822ed4-6611-4cef-8002-972e9782d403-config\") pod \"5c822ed4-6611-4cef-8002-972e9782d403\" (UID: \"5c822ed4-6611-4cef-8002-972e9782d403\") " Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.727559 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c822ed4-6611-4cef-8002-972e9782d403-client-ca\") pod \"5c822ed4-6611-4cef-8002-972e9782d403\" (UID: \"5c822ed4-6611-4cef-8002-972e9782d403\") " Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.728679 4975 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c822ed4-6611-4cef-8002-972e9782d403-client-ca" (OuterVolumeSpecName: "client-ca") pod "5c822ed4-6611-4cef-8002-972e9782d403" (UID: "5c822ed4-6611-4cef-8002-972e9782d403"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:13:59 crc kubenswrapper[4975]: E0318 12:13:59.728768 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:14:00.228751469 +0000 UTC m=+225.943152048 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.733125 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c822ed4-6611-4cef-8002-972e9782d403-config" (OuterVolumeSpecName: "config") pod "5c822ed4-6611-4cef-8002-972e9782d403" (UID: "5c822ed4-6611-4cef-8002-972e9782d403"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.737627 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c822ed4-6611-4cef-8002-972e9782d403-kube-api-access-grm7t" (OuterVolumeSpecName: "kube-api-access-grm7t") pod "5c822ed4-6611-4cef-8002-972e9782d403" (UID: "5c822ed4-6611-4cef-8002-972e9782d403"). 
InnerVolumeSpecName "kube-api-access-grm7t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.753438 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c822ed4-6611-4cef-8002-972e9782d403-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5c822ed4-6611-4cef-8002-972e9782d403" (UID: "5c822ed4-6611-4cef-8002-972e9782d403"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.831930 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht"
Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.832075 4975 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c822ed4-6611-4cef-8002-972e9782d403-client-ca\") on node \"crc\" DevicePath \"\""
Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.832091 4975 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c822ed4-6611-4cef-8002-972e9782d403-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.832105 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grm7t\" (UniqueName: \"kubernetes.io/projected/5c822ed4-6611-4cef-8002-972e9782d403-kube-api-access-grm7t\") on node \"crc\" DevicePath \"\""
Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.832117 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c822ed4-6611-4cef-8002-972e9782d403-config\") on node \"crc\" DevicePath \"\""
Mar 18 12:13:59 crc kubenswrapper[4975]: E0318 12:13:59.832405 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:14:00.332382778 +0000 UTC m=+226.046783357 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.845069 4975 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-ftkrg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.845132 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" podUID="ac556a39-897a-44fe-b537-bdaf85c3f437" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused"
Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.873423 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps"]
Mar 18 12:13:59 crc kubenswrapper[4975]: E0318 12:13:59.873901 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c822ed4-6611-4cef-8002-972e9782d403" containerName="route-controller-manager"
Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.873914 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c822ed4-6611-4cef-8002-972e9782d403" containerName="route-controller-manager"
Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.874012 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c822ed4-6611-4cef-8002-972e9782d403" containerName="route-controller-manager"
Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.874368 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps"
Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.898302 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps"]
Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.933333 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:13:59 crc kubenswrapper[4975]: E0318 12:13:59.933584 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:14:00.433550392 +0000 UTC m=+226.147950971 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.933660 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9-config\") pod \"route-controller-manager-df6f7fc74-72mps\" (UID: \"d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9\") " pod="openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps"
Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.933686 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mkbv\" (UniqueName: \"kubernetes.io/projected/d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9-kube-api-access-7mkbv\") pod \"route-controller-manager-df6f7fc74-72mps\" (UID: \"d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9\") " pod="openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps"
Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.933742 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9-client-ca\") pod \"route-controller-manager-df6f7fc74-72mps\" (UID: \"d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9\") " pod="openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps"
Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.933971 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9-serving-cert\") pod \"route-controller-manager-df6f7fc74-72mps\" (UID: \"d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9\") " pod="openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps"
Mar 18 12:13:59 crc kubenswrapper[4975]: I0318 12:13:59.934015 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht"
Mar 18 12:13:59 crc kubenswrapper[4975]: E0318 12:13:59.934391 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:14:00.434380704 +0000 UTC m=+226.148781293 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.031445 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-z69nv"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.031495 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-z69nv"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.033918 4975 patch_prober.go:28] interesting pod/console-f9d7485db-z69nv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body=
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.033979 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-z69nv" podUID="311fa18b-fde1-4390-9682-75c836813f88" containerName="console" probeResult="failure" output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.034598 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.034884 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9-config\") pod \"route-controller-manager-df6f7fc74-72mps\" (UID: \"d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9\") " pod="openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.034927 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mkbv\" (UniqueName: \"kubernetes.io/projected/d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9-kube-api-access-7mkbv\") pod \"route-controller-manager-df6f7fc74-72mps\" (UID: \"d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9\") " pod="openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.034976 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9-client-ca\") pod \"route-controller-manager-df6f7fc74-72mps\" (UID: \"d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9\") " pod="openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.035052 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9-serving-cert\") pod \"route-controller-manager-df6f7fc74-72mps\" (UID: \"d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9\") " pod="openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.036792 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9-client-ca\") pod \"route-controller-manager-df6f7fc74-72mps\" (UID: \"d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9\") " pod="openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps"
Mar 18 12:14:00 crc kubenswrapper[4975]: E0318 12:14:00.036909 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:14:00.536892744 +0000 UTC m=+226.251293323 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.036943 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9-config\") pod \"route-controller-manager-df6f7fc74-72mps\" (UID: \"d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9\") " pod="openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.042162 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9-serving-cert\") pod \"route-controller-manager-df6f7fc74-72mps\" (UID: \"d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9\") " pod="openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.063708 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mkbv\" (UniqueName: \"kubernetes.io/projected/d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9-kube-api-access-7mkbv\") pod \"route-controller-manager-df6f7fc74-72mps\" (UID: \"d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9\") " pod="openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.118438 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-m27h4"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.119278 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-m27h4"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.128004 4975 patch_prober.go:28] interesting pod/router-default-5444994796-78ldn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 12:14:00 crc kubenswrapper[4975]: [-]has-synced failed: reason withheld
Mar 18 12:14:00 crc kubenswrapper[4975]: [+]process-running ok
Mar 18 12:14:00 crc kubenswrapper[4975]: healthz check failed
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.128328 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-78ldn" podUID="f4e1534c-f64b-448a-9b81-7c4192c089f3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.129335 4975 patch_prober.go:28] interesting pod/downloads-7954f5f757-d4zht container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body=
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.129371 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d4zht" podUID="36e42dcf-4953-46b4-8459-e2e72e03895c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.129430 4975 patch_prober.go:28] interesting pod/downloads-7954f5f757-d4zht container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body=
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.129446 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-d4zht" podUID="36e42dcf-4953-46b4-8459-e2e72e03895c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.133887 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.136157 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht"
Mar 18 12:14:00 crc kubenswrapper[4975]: E0318 12:14:00.137216 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:14:00.637204896 +0000 UTC m=+226.351605475 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.165897 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563934-6j5kv"]
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.166623 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563934-6j5kv"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.178293 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.183062 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563934-6j5kv"]
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.190175 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.238841 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.239320 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glfp7\" (UniqueName: \"kubernetes.io/projected/9d72ac7c-4ce8-4d23-a845-d359bca0544a-kube-api-access-glfp7\") pod \"auto-csr-approver-29563934-6j5kv\" (UID: \"9d72ac7c-4ce8-4d23-a845-d359bca0544a\") " pod="openshift-infra/auto-csr-approver-29563934-6j5kv"
Mar 18 12:14:00 crc kubenswrapper[4975]: E0318 12:14:00.240538 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:14:00.740510246 +0000 UTC m=+226.454910825 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.311666 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.311714 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.330989 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.340805 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hprxg"]
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.341170 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac556a39-897a-44fe-b537-bdaf85c3f437-proxy-ca-bundles\") pod \"ac556a39-897a-44fe-b537-bdaf85c3f437\" (UID: \"ac556a39-897a-44fe-b537-bdaf85c3f437\") "
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.341215 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac556a39-897a-44fe-b537-bdaf85c3f437-serving-cert\") pod \"ac556a39-897a-44fe-b537-bdaf85c3f437\" (UID: \"ac556a39-897a-44fe-b537-bdaf85c3f437\") "
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.341345 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac556a39-897a-44fe-b537-bdaf85c3f437-config\") pod \"ac556a39-897a-44fe-b537-bdaf85c3f437\" (UID: \"ac556a39-897a-44fe-b537-bdaf85c3f437\") "
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.341367 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlg74\" (UniqueName: \"kubernetes.io/projected/ac556a39-897a-44fe-b537-bdaf85c3f437-kube-api-access-nlg74\") pod \"ac556a39-897a-44fe-b537-bdaf85c3f437\" (UID: \"ac556a39-897a-44fe-b537-bdaf85c3f437\") "
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.341439 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac556a39-897a-44fe-b537-bdaf85c3f437-client-ca\") pod \"ac556a39-897a-44fe-b537-bdaf85c3f437\" (UID: \"ac556a39-897a-44fe-b537-bdaf85c3f437\") "
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.341629 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glfp7\" (UniqueName: \"kubernetes.io/projected/9d72ac7c-4ce8-4d23-a845-d359bca0544a-kube-api-access-glfp7\") pod \"auto-csr-approver-29563934-6j5kv\" (UID: \"9d72ac7c-4ce8-4d23-a845-d359bca0544a\") " pod="openshift-infra/auto-csr-approver-29563934-6j5kv"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.341749 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.342634 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac556a39-897a-44fe-b537-bdaf85c3f437-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ac556a39-897a-44fe-b537-bdaf85c3f437" (UID: "ac556a39-897a-44fe-b537-bdaf85c3f437"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:14:00 crc kubenswrapper[4975]: E0318 12:14:00.347514 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac556a39-897a-44fe-b537-bdaf85c3f437" containerName="controller-manager"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.347559 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac556a39-897a-44fe-b537-bdaf85c3f437" containerName="controller-manager"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.347707 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac556a39-897a-44fe-b537-bdaf85c3f437" containerName="controller-manager"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.348362 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac556a39-897a-44fe-b537-bdaf85c3f437-config" (OuterVolumeSpecName: "config") pod "ac556a39-897a-44fe-b537-bdaf85c3f437" (UID: "ac556a39-897a-44fe-b537-bdaf85c3f437"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.348572 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.348684 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hprxg"
Mar 18 12:14:00 crc kubenswrapper[4975]: E0318 12:14:00.357442 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:14:00.857424771 +0000 UTC m=+226.571825350 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.358115 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-5bf2w"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.358542 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac556a39-897a-44fe-b537-bdaf85c3f437-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ac556a39-897a-44fe-b537-bdaf85c3f437" (UID: "ac556a39-897a-44fe-b537-bdaf85c3f437"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.359319 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.359591 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac556a39-897a-44fe-b537-bdaf85c3f437-client-ca" (OuterVolumeSpecName: "client-ca") pod "ac556a39-897a-44fe-b537-bdaf85c3f437" (UID: "ac556a39-897a-44fe-b537-bdaf85c3f437"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.360026 4975 ???:1] "http: TLS handshake error from 192.168.126.11:35278: no serving certificate available for the kubelet"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.364042 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac556a39-897a-44fe-b537-bdaf85c3f437-kube-api-access-nlg74" (OuterVolumeSpecName: "kube-api-access-nlg74") pod "ac556a39-897a-44fe-b537-bdaf85c3f437" (UID: "ac556a39-897a-44fe-b537-bdaf85c3f437"). InnerVolumeSpecName "kube-api-access-nlg74". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.376323 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hprxg"]
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.402919 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glfp7\" (UniqueName: \"kubernetes.io/projected/9d72ac7c-4ce8-4d23-a845-d359bca0544a-kube-api-access-glfp7\") pod \"auto-csr-approver-29563934-6j5kv\" (UID: \"9d72ac7c-4ce8-4d23-a845-d359bca0544a\") " pod="openshift-infra/auto-csr-approver-29563934-6j5kv"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.442435 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:14:00 crc kubenswrapper[4975]: E0318 12:14:00.442638 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:14:00.942610969 +0000 UTC m=+226.657011548 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.443069 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.443365 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk82v\" (UniqueName: \"kubernetes.io/projected/a7a76930-86ba-4055-85e0-6053832da1aa-kube-api-access-bk82v\") pod \"certified-operators-hprxg\" (UID: \"a7a76930-86ba-4055-85e0-6053832da1aa\") " pod="openshift-marketplace/certified-operators-hprxg"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.443430 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7a76930-86ba-4055-85e0-6053832da1aa-catalog-content\") pod \"certified-operators-hprxg\" (UID: \"a7a76930-86ba-4055-85e0-6053832da1aa\") " pod="openshift-marketplace/certified-operators-hprxg"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.443522 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7a76930-86ba-4055-85e0-6053832da1aa-utilities\") pod \"certified-operators-hprxg\" (UID: \"a7a76930-86ba-4055-85e0-6053832da1aa\") " pod="openshift-marketplace/certified-operators-hprxg"
Mar 18 12:14:00 crc kubenswrapper[4975]: E0318 12:14:00.443566 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:14:00.943546784 +0000 UTC m=+226.657947413 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.443599 4975 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac556a39-897a-44fe-b537-bdaf85c3f437-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.443611 4975 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac556a39-897a-44fe-b537-bdaf85c3f437-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.443620 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac556a39-897a-44fe-b537-bdaf85c3f437-config\") on node \"crc\" DevicePath \"\""
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.443651 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlg74\" (UniqueName: \"kubernetes.io/projected/ac556a39-897a-44fe-b537-bdaf85c3f437-kube-api-access-nlg74\") on node \"crc\" DevicePath \"\""
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.443662 4975 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac556a39-897a-44fe-b537-bdaf85c3f437-client-ca\") on node \"crc\" DevicePath \"\""
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.514686 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563934-6j5kv"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.545146 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:14:00 crc kubenswrapper[4975]: E0318 12:14:00.545271 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:14:01.045252022 +0000 UTC m=+226.759652601 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.545446 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk82v\" (UniqueName: \"kubernetes.io/projected/a7a76930-86ba-4055-85e0-6053832da1aa-kube-api-access-bk82v\") pod \"certified-operators-hprxg\" (UID: \"a7a76930-86ba-4055-85e0-6053832da1aa\") " pod="openshift-marketplace/certified-operators-hprxg"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.545505 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7a76930-86ba-4055-85e0-6053832da1aa-catalog-content\") pod \"certified-operators-hprxg\" (UID: \"a7a76930-86ba-4055-85e0-6053832da1aa\") " pod="openshift-marketplace/certified-operators-hprxg"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.545563 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7a76930-86ba-4055-85e0-6053832da1aa-utilities\") pod \"certified-operators-hprxg\" (UID: \"a7a76930-86ba-4055-85e0-6053832da1aa\") " pod="openshift-marketplace/certified-operators-hprxg"
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.545613 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht"
Mar 18 12:14:00 crc kubenswrapper[4975]: E0318 12:14:00.545932 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:14:01.045917679 +0000 UTC m=+226.760318258 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.545152 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xnfcw"]
Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.547034 4975 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-xnfcw" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.549360 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7a76930-86ba-4055-85e0-6053832da1aa-catalog-content\") pod \"certified-operators-hprxg\" (UID: \"a7a76930-86ba-4055-85e0-6053832da1aa\") " pod="openshift-marketplace/certified-operators-hprxg" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.554376 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7a76930-86ba-4055-85e0-6053832da1aa-utilities\") pod \"certified-operators-hprxg\" (UID: \"a7a76930-86ba-4055-85e0-6053832da1aa\") " pod="openshift-marketplace/certified-operators-hprxg" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.554626 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.562145 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xnfcw"] Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.577728 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk82v\" (UniqueName: \"kubernetes.io/projected/a7a76930-86ba-4055-85e0-6053832da1aa-kube-api-access-bk82v\") pod \"certified-operators-hprxg\" (UID: \"a7a76930-86ba-4055-85e0-6053832da1aa\") " pod="openshift-marketplace/certified-operators-hprxg" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.646618 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.646821 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b24c4ea-1b55-429c-97f5-376523ea1a52-catalog-content\") pod \"community-operators-xnfcw\" (UID: \"3b24c4ea-1b55-429c-97f5-376523ea1a52\") " pod="openshift-marketplace/community-operators-xnfcw" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.646894 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sk2c\" (UniqueName: \"kubernetes.io/projected/3b24c4ea-1b55-429c-97f5-376523ea1a52-kube-api-access-6sk2c\") pod \"community-operators-xnfcw\" (UID: \"3b24c4ea-1b55-429c-97f5-376523ea1a52\") " pod="openshift-marketplace/community-operators-xnfcw" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.646945 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b24c4ea-1b55-429c-97f5-376523ea1a52-utilities\") pod \"community-operators-xnfcw\" (UID: \"3b24c4ea-1b55-429c-97f5-376523ea1a52\") " pod="openshift-marketplace/community-operators-xnfcw" Mar 18 12:14:00 crc kubenswrapper[4975]: E0318 12:14:00.647043 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:14:01.147026812 +0000 UTC m=+226.861427391 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.689127 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-v7w9z" event={"ID":"bb035988-397f-4da3-bc49-d38089014453","Type":"ContainerStarted","Data":"df94e579ba461c41bcd09eba6c7bab453ebdc2ff0f1bfca44cbee7949058f692"} Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.691574 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" event={"ID":"ac556a39-897a-44fe-b537-bdaf85c3f437","Type":"ContainerDied","Data":"72110e8a1ee3e19ff58e3dbe5fd242a608f2627369956ac78a56d3d523cd5155"} Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.691625 4975 scope.go:117] "RemoveContainer" containerID="20f646e3a520e0cb658ea89ed1721d9929eacf1e06f492bd0a93429bddb11ef2" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.691729 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ftkrg" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.697168 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hprxg" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.716473 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.718081 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5" event={"ID":"5c822ed4-6611-4cef-8002-972e9782d403","Type":"ContainerDied","Data":"2f915afce75dd29deb2e2fedd8150b04418ab12947377011bb0a7e319947b426"} Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.749616 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sk2c\" (UniqueName: \"kubernetes.io/projected/3b24c4ea-1b55-429c-97f5-376523ea1a52-kube-api-access-6sk2c\") pod \"community-operators-xnfcw\" (UID: \"3b24c4ea-1b55-429c-97f5-376523ea1a52\") " pod="openshift-marketplace/community-operators-xnfcw" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.749710 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b24c4ea-1b55-429c-97f5-376523ea1a52-utilities\") pod \"community-operators-xnfcw\" (UID: \"3b24c4ea-1b55-429c-97f5-376523ea1a52\") " pod="openshift-marketplace/community-operators-xnfcw" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.749974 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.750003 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b24c4ea-1b55-429c-97f5-376523ea1a52-catalog-content\") pod \"community-operators-xnfcw\" 
(UID: \"3b24c4ea-1b55-429c-97f5-376523ea1a52\") " pod="openshift-marketplace/community-operators-xnfcw" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.762888 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 12:14:00 crc kubenswrapper[4975]: E0318 12:14:00.804263 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:14:01.304245486 +0000 UTC m=+227.018646065 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.805164 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b24c4ea-1b55-429c-97f5-376523ea1a52-catalog-content\") pod \"community-operators-xnfcw\" (UID: \"3b24c4ea-1b55-429c-97f5-376523ea1a52\") " pod="openshift-marketplace/community-operators-xnfcw" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.806289 4975 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.807004 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.810340 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.810518 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.810966 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.811822 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.816714 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b24c4ea-1b55-429c-97f5-376523ea1a52-utilities\") pod \"community-operators-xnfcw\" (UID: \"3b24c4ea-1b55-429c-97f5-376523ea1a52\") " pod="openshift-marketplace/community-operators-xnfcw" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.817288 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.817473 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.819952 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.826447 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bsnc" Mar 18 12:14:00 crc kubenswrapper[4975]: 
I0318 12:14:00.831770 4975 scope.go:117] "RemoveContainer" containerID="9ec6ec8989ba1c47da0540b033117f1339a0d45fb29c489b13ef82e926556aa3" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.833820 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sk2c\" (UniqueName: \"kubernetes.io/projected/3b24c4ea-1b55-429c-97f5-376523ea1a52-kube-api-access-6sk2c\") pod \"community-operators-xnfcw\" (UID: \"3b24c4ea-1b55-429c-97f5-376523ea1a52\") " pod="openshift-marketplace/community-operators-xnfcw" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.849503 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c5xm6"] Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.850571 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c5xm6" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.851079 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.851366 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/177a1e9c-68a3-4d50-b462-5d680696d8c7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"177a1e9c-68a3-4d50-b462-5d680696d8c7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.851442 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/177a1e9c-68a3-4d50-b462-5d680696d8c7-kube-api-access\") pod 
\"revision-pruner-9-crc\" (UID: \"177a1e9c-68a3-4d50-b462-5d680696d8c7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.851500 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0f75321-eeb6-4425-9bab-830f0cab1197-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e0f75321-eeb6-4425-9bab-830f0cab1197\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.851531 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e0f75321-eeb6-4425-9bab-830f0cab1197-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e0f75321-eeb6-4425-9bab-830f0cab1197\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:14:00 crc kubenswrapper[4975]: E0318 12:14:00.851642 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 12:14:01.35162387 +0000 UTC m=+227.066024449 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.860059 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.860126 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c5xm6"] Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.868960 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ftkrg"] Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.879756 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ftkrg"] Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.891847 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xnfcw" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.917723 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.924243 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ws7cz" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.930980 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps"] Mar 18 12:14:00 crc kubenswrapper[4975]: W0318 12:14:00.943399 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2da51a0_f3f7_4bad_a51c_c45e69dd7dc9.slice/crio-fcb99da4df98066051a5f9a808f61feab55d5deddcc012a75ba3288c71ca5ea7 WatchSource:0}: Error finding container fcb99da4df98066051a5f9a808f61feab55d5deddcc012a75ba3288c71ca5ea7: Status 404 returned error can't find the container with id fcb99da4df98066051a5f9a808f61feab55d5deddcc012a75ba3288c71ca5ea7 Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.952285 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ws7cz" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.953386 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0f75321-eeb6-4425-9bab-830f0cab1197-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e0f75321-eeb6-4425-9bab-830f0cab1197\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.953441 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-wlsx7\" (UniqueName: \"kubernetes.io/projected/94a0b37e-4423-421c-910e-658cb59e08c8-kube-api-access-wlsx7\") pod \"certified-operators-c5xm6\" (UID: \"94a0b37e-4423-421c-910e-658cb59e08c8\") " pod="openshift-marketplace/certified-operators-c5xm6" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.953473 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e0f75321-eeb6-4425-9bab-830f0cab1197-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e0f75321-eeb6-4425-9bab-830f0cab1197\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.954687 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e0f75321-eeb6-4425-9bab-830f0cab1197-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e0f75321-eeb6-4425-9bab-830f0cab1197\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.954885 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.954931 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/177a1e9c-68a3-4d50-b462-5d680696d8c7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"177a1e9c-68a3-4d50-b462-5d680696d8c7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.954978 4975 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a0b37e-4423-421c-910e-658cb59e08c8-catalog-content\") pod \"certified-operators-c5xm6\" (UID: \"94a0b37e-4423-421c-910e-658cb59e08c8\") " pod="openshift-marketplace/certified-operators-c5xm6" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.955017 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a0b37e-4423-421c-910e-658cb59e08c8-utilities\") pod \"certified-operators-c5xm6\" (UID: \"94a0b37e-4423-421c-910e-658cb59e08c8\") " pod="openshift-marketplace/certified-operators-c5xm6" Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.955103 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/177a1e9c-68a3-4d50-b462-5d680696d8c7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"177a1e9c-68a3-4d50-b462-5d680696d8c7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:14:00 crc kubenswrapper[4975]: E0318 12:14:00.955641 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:14:01.455624668 +0000 UTC m=+227.170025247 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 12:14:00 crc kubenswrapper[4975]: I0318 12:14:00.955689 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/177a1e9c-68a3-4d50-b462-5d680696d8c7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"177a1e9c-68a3-4d50-b462-5d680696d8c7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.041844 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0f75321-eeb6-4425-9bab-830f0cab1197-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e0f75321-eeb6-4425-9bab-830f0cab1197\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.050092 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/177a1e9c-68a3-4d50-b462-5d680696d8c7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"177a1e9c-68a3-4d50-b462-5d680696d8c7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.055756 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.056057 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlsx7\" (UniqueName: \"kubernetes.io/projected/94a0b37e-4423-421c-910e-658cb59e08c8-kube-api-access-wlsx7\") pod \"certified-operators-c5xm6\" (UID: \"94a0b37e-4423-421c-910e-658cb59e08c8\") " pod="openshift-marketplace/certified-operators-c5xm6" Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.056140 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a0b37e-4423-421c-910e-658cb59e08c8-catalog-content\") pod \"certified-operators-c5xm6\" (UID: \"94a0b37e-4423-421c-910e-658cb59e08c8\") " pod="openshift-marketplace/certified-operators-c5xm6" Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.056182 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a0b37e-4423-421c-910e-658cb59e08c8-utilities\") pod \"certified-operators-c5xm6\" (UID: \"94a0b37e-4423-421c-910e-658cb59e08c8\") " pod="openshift-marketplace/certified-operators-c5xm6" Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.056723 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a0b37e-4423-421c-910e-658cb59e08c8-utilities\") pod \"certified-operators-c5xm6\" (UID: \"94a0b37e-4423-421c-910e-658cb59e08c8\") " pod="openshift-marketplace/certified-operators-c5xm6" Mar 18 12:14:01 crc kubenswrapper[4975]: E0318 12:14:01.057154 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 12:14:01.557133272 +0000 UTC m=+227.271533851 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.057699 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a0b37e-4423-421c-910e-658cb59e08c8-catalog-content\") pod \"certified-operators-c5xm6\" (UID: \"94a0b37e-4423-421c-910e-658cb59e08c8\") " pod="openshift-marketplace/certified-operators-c5xm6"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.085286 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac556a39-897a-44fe-b537-bdaf85c3f437" path="/var/lib/kubelet/pods/ac556a39-897a-44fe-b537-bdaf85c3f437/volumes"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.086235 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-94vnq"]
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.087385 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-94vnq"]
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.087500 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94vnq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.107129 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5"]
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.107468 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-78ldn"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.110191 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-j56l5"]
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.123445 4975 patch_prober.go:28] interesting pod/router-default-5444994796-78ldn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 12:14:01 crc kubenswrapper[4975]: [-]has-synced failed: reason withheld
Mar 18 12:14:01 crc kubenswrapper[4975]: [+]process-running ok
Mar 18 12:14:01 crc kubenswrapper[4975]: healthz check failed
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.123509 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-78ldn" podUID="f4e1534c-f64b-448a-9b81-7c4192c089f3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.137601 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlsx7\" (UniqueName: \"kubernetes.io/projected/94a0b37e-4423-421c-910e-658cb59e08c8-kube-api-access-wlsx7\") pod \"certified-operators-c5xm6\" (UID: \"94a0b37e-4423-421c-910e-658cb59e08c8\") " pod="openshift-marketplace/certified-operators-c5xm6"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.160939 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9002f360-1ea5-4b24-a49a-69f46a658936-catalog-content\") pod \"community-operators-94vnq\" (UID: \"9002f360-1ea5-4b24-a49a-69f46a658936\") " pod="openshift-marketplace/community-operators-94vnq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.161699 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9002f360-1ea5-4b24-a49a-69f46a658936-utilities\") pod \"community-operators-94vnq\" (UID: \"9002f360-1ea5-4b24-a49a-69f46a658936\") " pod="openshift-marketplace/community-operators-94vnq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.161743 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr6ms\" (UniqueName: \"kubernetes.io/projected/9002f360-1ea5-4b24-a49a-69f46a658936-kube-api-access-qr6ms\") pod \"community-operators-94vnq\" (UID: \"9002f360-1ea5-4b24-a49a-69f46a658936\") " pod="openshift-marketplace/community-operators-94vnq"
Mar 18 12:14:01 crc kubenswrapper[4975]: E0318 12:14:01.170822 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 12:14:01.670802942 +0000 UTC m=+227.385203521 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8nsht" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.187796 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.228299 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.161854 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.246270 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c5xm6"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.282483 4975 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-18T12:14:00.80631887Z","Handler":null,"Name":""}
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.285965 4975 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.285997 4975 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.301106 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b6c454647-vr5vq"]
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.302011 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.305530 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.306838 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.307368 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.307534 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.317320 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.317627 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.318186 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.342460 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.342962 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9002f360-1ea5-4b24-a49a-69f46a658936-catalog-content\") pod \"community-operators-94vnq\" (UID: \"9002f360-1ea5-4b24-a49a-69f46a658936\") " pod="openshift-marketplace/community-operators-94vnq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.343063 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9002f360-1ea5-4b24-a49a-69f46a658936-utilities\") pod \"community-operators-94vnq\" (UID: \"9002f360-1ea5-4b24-a49a-69f46a658936\") " pod="openshift-marketplace/community-operators-94vnq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.343101 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr6ms\" (UniqueName: \"kubernetes.io/projected/9002f360-1ea5-4b24-a49a-69f46a658936-kube-api-access-qr6ms\") pod \"community-operators-94vnq\" (UID: \"9002f360-1ea5-4b24-a49a-69f46a658936\") " pod="openshift-marketplace/community-operators-94vnq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.344153 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9002f360-1ea5-4b24-a49a-69f46a658936-catalog-content\") pod \"community-operators-94vnq\" (UID: \"9002f360-1ea5-4b24-a49a-69f46a658936\") " pod="openshift-marketplace/community-operators-94vnq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.344427 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9002f360-1ea5-4b24-a49a-69f46a658936-utilities\") pod \"community-operators-94vnq\" (UID: \"9002f360-1ea5-4b24-a49a-69f46a658936\") " pod="openshift-marketplace/community-operators-94vnq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.376213 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b6c454647-vr5vq"]
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.376465 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.381633 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lxtks"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.382215 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr6ms\" (UniqueName: \"kubernetes.io/projected/9002f360-1ea5-4b24-a49a-69f46a658936-kube-api-access-qr6ms\") pod \"community-operators-94vnq\" (UID: \"9002f360-1ea5-4b24-a49a-69f46a658936\") " pod="openshift-marketplace/community-operators-94vnq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.447263 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f847fa98-6ca4-4087-aeb0-9d70fab215f0-client-ca\") pod \"controller-manager-7b6c454647-vr5vq\" (UID: \"f847fa98-6ca4-4087-aeb0-9d70fab215f0\") " pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.447610 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.447633 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f847fa98-6ca4-4087-aeb0-9d70fab215f0-config\") pod \"controller-manager-7b6c454647-vr5vq\" (UID: \"f847fa98-6ca4-4087-aeb0-9d70fab215f0\") " pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.447653 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v4kn\" (UniqueName: \"kubernetes.io/projected/f847fa98-6ca4-4087-aeb0-9d70fab215f0-kube-api-access-5v4kn\") pod \"controller-manager-7b6c454647-vr5vq\" (UID: \"f847fa98-6ca4-4087-aeb0-9d70fab215f0\") " pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.447691 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f847fa98-6ca4-4087-aeb0-9d70fab215f0-serving-cert\") pod \"controller-manager-7b6c454647-vr5vq\" (UID: \"f847fa98-6ca4-4087-aeb0-9d70fab215f0\") " pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.447889 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f847fa98-6ca4-4087-aeb0-9d70fab215f0-proxy-ca-bundles\") pod \"controller-manager-7b6c454647-vr5vq\" (UID: \"f847fa98-6ca4-4087-aeb0-9d70fab215f0\") " pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.480499 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94vnq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.532327 4975 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.532368 4975 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-8nsht"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.559663 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f847fa98-6ca4-4087-aeb0-9d70fab215f0-proxy-ca-bundles\") pod \"controller-manager-7b6c454647-vr5vq\" (UID: \"f847fa98-6ca4-4087-aeb0-9d70fab215f0\") " pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.559817 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f847fa98-6ca4-4087-aeb0-9d70fab215f0-client-ca\") pod \"controller-manager-7b6c454647-vr5vq\" (UID: \"f847fa98-6ca4-4087-aeb0-9d70fab215f0\") " pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.559852 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f847fa98-6ca4-4087-aeb0-9d70fab215f0-config\") pod \"controller-manager-7b6c454647-vr5vq\" (UID: \"f847fa98-6ca4-4087-aeb0-9d70fab215f0\") " pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.559898 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v4kn\" (UniqueName: \"kubernetes.io/projected/f847fa98-6ca4-4087-aeb0-9d70fab215f0-kube-api-access-5v4kn\") pod \"controller-manager-7b6c454647-vr5vq\" (UID: \"f847fa98-6ca4-4087-aeb0-9d70fab215f0\") " pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.559982 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f847fa98-6ca4-4087-aeb0-9d70fab215f0-serving-cert\") pod \"controller-manager-7b6c454647-vr5vq\" (UID: \"f847fa98-6ca4-4087-aeb0-9d70fab215f0\") " pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.561639 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f847fa98-6ca4-4087-aeb0-9d70fab215f0-client-ca\") pod \"controller-manager-7b6c454647-vr5vq\" (UID: \"f847fa98-6ca4-4087-aeb0-9d70fab215f0\") " pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.563169 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f847fa98-6ca4-4087-aeb0-9d70fab215f0-proxy-ca-bundles\") pod \"controller-manager-7b6c454647-vr5vq\" (UID: \"f847fa98-6ca4-4087-aeb0-9d70fab215f0\") " pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.564249 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f847fa98-6ca4-4087-aeb0-9d70fab215f0-config\") pod \"controller-manager-7b6c454647-vr5vq\" (UID: \"f847fa98-6ca4-4087-aeb0-9d70fab215f0\") " pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.574940 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f847fa98-6ca4-4087-aeb0-9d70fab215f0-serving-cert\") pod \"controller-manager-7b6c454647-vr5vq\" (UID: \"f847fa98-6ca4-4087-aeb0-9d70fab215f0\") " pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.594397 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-r26rl"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.613531 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v4kn\" (UniqueName: \"kubernetes.io/projected/f847fa98-6ca4-4087-aeb0-9d70fab215f0-kube-api-access-5v4kn\") pod \"controller-manager-7b6c454647-vr5vq\" (UID: \"f847fa98-6ca4-4087-aeb0-9d70fab215f0\") " pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.691485 4975 patch_prober.go:28] interesting pod/apiserver-76f77b778f-m27h4 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 18 12:14:01 crc kubenswrapper[4975]: [+]log ok
Mar 18 12:14:01 crc kubenswrapper[4975]: [+]etcd ok
Mar 18 12:14:01 crc kubenswrapper[4975]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 18 12:14:01 crc kubenswrapper[4975]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 18 12:14:01 crc kubenswrapper[4975]: [+]poststarthook/max-in-flight-filter ok
Mar 18 12:14:01 crc kubenswrapper[4975]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 18 12:14:01 crc kubenswrapper[4975]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Mar 18 12:14:01 crc kubenswrapper[4975]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Mar 18 12:14:01 crc kubenswrapper[4975]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Mar 18 12:14:01 crc kubenswrapper[4975]: [+]poststarthook/project.openshift.io-projectcache ok
Mar 18 12:14:01 crc kubenswrapper[4975]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Mar 18 12:14:01 crc kubenswrapper[4975]: [+]poststarthook/openshift.io-startinformers ok
Mar 18 12:14:01 crc kubenswrapper[4975]: [+]poststarthook/openshift.io-restmapperupdater ok
Mar 18 12:14:01 crc kubenswrapper[4975]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 18 12:14:01 crc kubenswrapper[4975]: livez check failed
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.691927 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-m27h4" podUID="bbf4b50a-de39-4b86-b0ca-883ba11d6e4b" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.700409 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8nsht\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " pod="openshift-image-registry/image-registry-697d97f7c8-8nsht"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.736669 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-v7w9z" event={"ID":"bb035988-397f-4da3-bc49-d38089014453","Type":"ContainerStarted","Data":"edecd5ec8ab8c4a675c3de120d56b4adf275483d7fe42ca51e5cda5c6f85627a"}
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.745208 4975 generic.go:334] "Generic (PLEG): container finished" podID="79bb26ca-fc97-4b59-ab56-12637c684208" containerID="c71aa7bef889bc62ef542514ceaf515d7181e991e89e92e6d7a87de6dde539d6" exitCode=0
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.745603 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-p22bn" event={"ID":"79bb26ca-fc97-4b59-ab56-12637c684208","Type":"ContainerDied","Data":"c71aa7bef889bc62ef542514ceaf515d7181e991e89e92e6d7a87de6dde539d6"}
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.745913 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.794249 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps" event={"ID":"d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9","Type":"ContainerStarted","Data":"75463ea76d7799d31baf3a374fa726ce8ad6a23bbd621fb5d33f307c926355a6"}
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.794296 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps" event={"ID":"d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9","Type":"ContainerStarted","Data":"fcb99da4df98066051a5f9a808f61feab55d5deddcc012a75ba3288c71ca5ea7"}
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.795498 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps"
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.815489 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563934-6j5kv"]
Mar 18 12:14:01 crc kubenswrapper[4975]: I0318 12:14:01.822035 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps" podStartSLOduration=2.822015251 podStartE2EDuration="2.822015251s" podCreationTimestamp="2026-03-18 12:13:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:14:01.819064344 +0000 UTC m=+227.533464923" watchObservedRunningTime="2026-03-18 12:14:01.822015251 +0000 UTC m=+227.536415830"
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:01.921770 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8nsht"
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.093681 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps"
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.295739 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xnfcw"]
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.311029 4975 patch_prober.go:28] interesting pod/router-default-5444994796-78ldn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 18 12:14:02 crc kubenswrapper[4975]: [-]has-synced failed: reason withheld
Mar 18 12:14:02 crc kubenswrapper[4975]: [+]process-running ok
Mar 18 12:14:02 crc kubenswrapper[4975]: healthz check failed
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.311084 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-78ldn" podUID="f4e1534c-f64b-448a-9b81-7c4192c089f3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.340468 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.371892 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hprxg"]
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.373667 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c5xm6"]
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.541247 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-whr68"]
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.542981 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-whr68"
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.552105 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.571397 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21b9dc77-7653-4684-ba67-cece256c42e2-utilities\") pod \"redhat-marketplace-whr68\" (UID: \"21b9dc77-7653-4684-ba67-cece256c42e2\") " pod="openshift-marketplace/redhat-marketplace-whr68"
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.571499 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21b9dc77-7653-4684-ba67-cece256c42e2-catalog-content\") pod \"redhat-marketplace-whr68\" (UID: \"21b9dc77-7653-4684-ba67-cece256c42e2\") " pod="openshift-marketplace/redhat-marketplace-whr68"
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.571530 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hpnd\" (UniqueName: \"kubernetes.io/projected/21b9dc77-7653-4684-ba67-cece256c42e2-kube-api-access-8hpnd\") pod \"redhat-marketplace-whr68\" (UID: \"21b9dc77-7653-4684-ba67-cece256c42e2\") " pod="openshift-marketplace/redhat-marketplace-whr68"
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.575788 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-whr68"]
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.634776 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-94vnq"]
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.661699 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 18 12:14:02 crc kubenswrapper[4975]: W0318 12:14:02.664051 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9002f360_1ea5_4b24_a49a_69f46a658936.slice/crio-7d0fc2e9673a2da875e7aa58b6317983fed76cb9d3f922b9c3d95b645281ebf9 WatchSource:0}: Error finding container 7d0fc2e9673a2da875e7aa58b6317983fed76cb9d3f922b9c3d95b645281ebf9: Status 404 returned error can't find the container with id 7d0fc2e9673a2da875e7aa58b6317983fed76cb9d3f922b9c3d95b645281ebf9
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.677501 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21b9dc77-7653-4684-ba67-cece256c42e2-utilities\") pod \"redhat-marketplace-whr68\" (UID: \"21b9dc77-7653-4684-ba67-cece256c42e2\") " pod="openshift-marketplace/redhat-marketplace-whr68"
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.677649 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21b9dc77-7653-4684-ba67-cece256c42e2-catalog-content\") pod \"redhat-marketplace-whr68\" (UID: \"21b9dc77-7653-4684-ba67-cece256c42e2\") " pod="openshift-marketplace/redhat-marketplace-whr68"
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.677670 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hpnd\" (UniqueName: \"kubernetes.io/projected/21b9dc77-7653-4684-ba67-cece256c42e2-kube-api-access-8hpnd\") pod \"redhat-marketplace-whr68\" (UID: \"21b9dc77-7653-4684-ba67-cece256c42e2\") " pod="openshift-marketplace/redhat-marketplace-whr68"
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.678188 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21b9dc77-7653-4684-ba67-cece256c42e2-catalog-content\") pod \"redhat-marketplace-whr68\" (UID: \"21b9dc77-7653-4684-ba67-cece256c42e2\") " pod="openshift-marketplace/redhat-marketplace-whr68"
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.678320 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21b9dc77-7653-4684-ba67-cece256c42e2-utilities\") pod \"redhat-marketplace-whr68\" (UID: \"21b9dc77-7653-4684-ba67-cece256c42e2\") " pod="openshift-marketplace/redhat-marketplace-whr68"
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.729853 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hpnd\" (UniqueName: \"kubernetes.io/projected/21b9dc77-7653-4684-ba67-cece256c42e2-kube-api-access-8hpnd\") pod \"redhat-marketplace-whr68\" (UID: \"21b9dc77-7653-4684-ba67-cece256c42e2\") " pod="openshift-marketplace/redhat-marketplace-whr68"
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.846976 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b6c454647-vr5vq"]
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.860225 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-v7w9z" event={"ID":"bb035988-397f-4da3-bc49-d38089014453","Type":"ContainerStarted","Data":"c78e6e074bf5d60a9a100ff9b37602602cf13e84e2bfad7ca8bd0b66ce6f2f72"}
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.861710 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8nsht"]
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.862529 4975 generic.go:334] "Generic (PLEG): container finished" podID="94a0b37e-4423-421c-910e-658cb59e08c8" containerID="9c8d7b22644cfa3e6e6bf88b9a83665cf9884b9f63a450f8831182f608634412" exitCode=0
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.862591 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c5xm6" event={"ID":"94a0b37e-4423-421c-910e-658cb59e08c8","Type":"ContainerDied","Data":"9c8d7b22644cfa3e6e6bf88b9a83665cf9884b9f63a450f8831182f608634412"}
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.862719 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c5xm6" event={"ID":"94a0b37e-4423-421c-910e-658cb59e08c8","Type":"ContainerStarted","Data":"dcc9a37d62286a9df3c0b6ef6470633a426b21cbc39170c9d82bcbeb8baa4015"}
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.864782 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-whr68"
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.866944 4975 generic.go:334] "Generic (PLEG): container finished" podID="3b24c4ea-1b55-429c-97f5-376523ea1a52" containerID="2225529e0d8c02842364d0752e7aed1b13ebb30a58e16aa9d261b637a2fa1aae" exitCode=0
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.867009 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnfcw" event={"ID":"3b24c4ea-1b55-429c-97f5-376523ea1a52","Type":"ContainerDied","Data":"2225529e0d8c02842364d0752e7aed1b13ebb30a58e16aa9d261b637a2fa1aae"}
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.867037 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnfcw" event={"ID":"3b24c4ea-1b55-429c-97f5-376523ea1a52","Type":"ContainerStarted","Data":"42130e79785555e3233872cfc95b694c4787bcc19e031079f15b0f4f9835a4be"}
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.879466 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563934-6j5kv" event={"ID":"9d72ac7c-4ce8-4d23-a845-d359bca0544a","Type":"ContainerStarted","Data":"44a06719371d84506167442d291773036d152b91c6bc025d7ce20b7a99da7621"}
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.886002 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"177a1e9c-68a3-4d50-b462-5d680696d8c7","Type":"ContainerStarted","Data":"c1c4c4dd6138ce1a8db9d768be8716110c27ea93e4752f074b5efce3e8e7aacb"}
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.886341 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-v7w9z" podStartSLOduration=15.886325557 podStartE2EDuration="15.886325557s" podCreationTimestamp="2026-03-18 12:13:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:14:02.882664892 +0000 UTC m=+228.597065481" watchObservedRunningTime="2026-03-18 12:14:02.886325557 +0000 UTC m=+228.600726136"
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.887960 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94vnq" event={"ID":"9002f360-1ea5-4b24-a49a-69f46a658936","Type":"ContainerStarted","Data":"7d0fc2e9673a2da875e7aa58b6317983fed76cb9d3f922b9c3d95b645281ebf9"}
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.891805 4975 generic.go:334] "Generic (PLEG): container finished" podID="a7a76930-86ba-4055-85e0-6053832da1aa" containerID="6ea0e79bd7cb8286af22f95f08a33d7cde8ae8b2305782d45327129030874fbe" exitCode=0
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.891942 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hprxg" event={"ID":"a7a76930-86ba-4055-85e0-6053832da1aa","Type":"ContainerDied","Data":"6ea0e79bd7cb8286af22f95f08a33d7cde8ae8b2305782d45327129030874fbe"}
Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.891975 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hprxg" event={"ID":"a7a76930-86ba-4055-85e0-6053832da1aa","Type":"ContainerStarted","Data":"5da695589575d82110ad0e83ba18f6b3798510831f8a300f3ba9184cab96f7c6"}
Mar 18 12:14:02 crc kubenswrapper[4975]: W0318 12:14:02.901608 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod470110ba_97b8_4d8f_a8da_0df16cd7abed.slice/crio-cdc38385728bbce1ca828ff7d51d70b3ec0e966bc0d25a7e54546ebe8fa46a84 WatchSource:0}: Error finding container cdc38385728bbce1ca828ff7d51d70b3ec0e966bc0d25a7e54546ebe8fa46a84: Status 404 returned error can't find the container with id cdc38385728bbce1ca828ff7d51d70b3ec0e966bc0d25a7e54546ebe8fa46a84
Mar 18 12:14:02 crc
kubenswrapper[4975]: W0318 12:14:02.905891 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf847fa98_6ca4_4087_aeb0_9d70fab215f0.slice/crio-50faa0adf2a7e4c92f2e61a67a74ea0c4fcb9e52c98710cbd25f3fe273b6794f WatchSource:0}: Error finding container 50faa0adf2a7e4c92f2e61a67a74ea0c4fcb9e52c98710cbd25f3fe273b6794f: Status 404 returned error can't find the container with id 50faa0adf2a7e4c92f2e61a67a74ea0c4fcb9e52c98710cbd25f3fe273b6794f Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.906091 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e0f75321-eeb6-4425-9bab-830f0cab1197","Type":"ContainerStarted","Data":"df4aa2c0e09b47de6098024018a2107dd69a2262acb2741e90a8196f25b5f059"} Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.941794 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f75bb"] Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.944564 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f75bb" Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.949876 4975 ???:1] "http: TLS handshake error from 192.168.126.11:35280: no serving certificate available for the kubelet" Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.951103 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f75bb"] Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.983836 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66pqw\" (UniqueName: \"kubernetes.io/projected/46171d59-3549-4843-b6eb-07b9eecd2560-kube-api-access-66pqw\") pod \"redhat-marketplace-f75bb\" (UID: \"46171d59-3549-4843-b6eb-07b9eecd2560\") " pod="openshift-marketplace/redhat-marketplace-f75bb" Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.983991 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46171d59-3549-4843-b6eb-07b9eecd2560-catalog-content\") pod \"redhat-marketplace-f75bb\" (UID: \"46171d59-3549-4843-b6eb-07b9eecd2560\") " pod="openshift-marketplace/redhat-marketplace-f75bb" Mar 18 12:14:02 crc kubenswrapper[4975]: I0318 12:14:02.984940 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46171d59-3549-4843-b6eb-07b9eecd2560-utilities\") pod \"redhat-marketplace-f75bb\" (UID: \"46171d59-3549-4843-b6eb-07b9eecd2560\") " pod="openshift-marketplace/redhat-marketplace-f75bb" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.062466 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c822ed4-6611-4cef-8002-972e9782d403" path="/var/lib/kubelet/pods/5c822ed4-6611-4cef-8002-972e9782d403/volumes" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.063401 4975 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.086095 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46171d59-3549-4843-b6eb-07b9eecd2560-catalog-content\") pod \"redhat-marketplace-f75bb\" (UID: \"46171d59-3549-4843-b6eb-07b9eecd2560\") " pod="openshift-marketplace/redhat-marketplace-f75bb" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.086202 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46171d59-3549-4843-b6eb-07b9eecd2560-utilities\") pod \"redhat-marketplace-f75bb\" (UID: \"46171d59-3549-4843-b6eb-07b9eecd2560\") " pod="openshift-marketplace/redhat-marketplace-f75bb" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.086267 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66pqw\" (UniqueName: \"kubernetes.io/projected/46171d59-3549-4843-b6eb-07b9eecd2560-kube-api-access-66pqw\") pod \"redhat-marketplace-f75bb\" (UID: \"46171d59-3549-4843-b6eb-07b9eecd2560\") " pod="openshift-marketplace/redhat-marketplace-f75bb" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.087206 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46171d59-3549-4843-b6eb-07b9eecd2560-catalog-content\") pod \"redhat-marketplace-f75bb\" (UID: \"46171d59-3549-4843-b6eb-07b9eecd2560\") " pod="openshift-marketplace/redhat-marketplace-f75bb" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.088998 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46171d59-3549-4843-b6eb-07b9eecd2560-utilities\") pod 
\"redhat-marketplace-f75bb\" (UID: \"46171d59-3549-4843-b6eb-07b9eecd2560\") " pod="openshift-marketplace/redhat-marketplace-f75bb" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.108858 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66pqw\" (UniqueName: \"kubernetes.io/projected/46171d59-3549-4843-b6eb-07b9eecd2560-kube-api-access-66pqw\") pod \"redhat-marketplace-f75bb\" (UID: \"46171d59-3549-4843-b6eb-07b9eecd2560\") " pod="openshift-marketplace/redhat-marketplace-f75bb" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.113833 4975 patch_prober.go:28] interesting pod/router-default-5444994796-78ldn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:14:03 crc kubenswrapper[4975]: [-]has-synced failed: reason withheld Mar 18 12:14:03 crc kubenswrapper[4975]: [+]process-running ok Mar 18 12:14:03 crc kubenswrapper[4975]: healthz check failed Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.113963 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-78ldn" podUID="f4e1534c-f64b-448a-9b81-7c4192c089f3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.215459 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-whr68"] Mar 18 12:14:03 crc kubenswrapper[4975]: W0318 12:14:03.242532 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21b9dc77_7653_4684_ba67_cece256c42e2.slice/crio-c162369492277f2e0c64f8c6d6ef95fa0e4913e4286ac519e8b34dde287e44a0 WatchSource:0}: Error finding container c162369492277f2e0c64f8c6d6ef95fa0e4913e4286ac519e8b34dde287e44a0: Status 404 returned error can't find the container 
with id c162369492277f2e0c64f8c6d6ef95fa0e4913e4286ac519e8b34dde287e44a0 Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.252638 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-p22bn" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.288793 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79bb26ca-fc97-4b59-ab56-12637c684208-config-volume\") pod \"79bb26ca-fc97-4b59-ab56-12637c684208\" (UID: \"79bb26ca-fc97-4b59-ab56-12637c684208\") " Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.288877 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzlnf\" (UniqueName: \"kubernetes.io/projected/79bb26ca-fc97-4b59-ab56-12637c684208-kube-api-access-rzlnf\") pod \"79bb26ca-fc97-4b59-ab56-12637c684208\" (UID: \"79bb26ca-fc97-4b59-ab56-12637c684208\") " Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.288954 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/79bb26ca-fc97-4b59-ab56-12637c684208-secret-volume\") pod \"79bb26ca-fc97-4b59-ab56-12637c684208\" (UID: \"79bb26ca-fc97-4b59-ab56-12637c684208\") " Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.289692 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79bb26ca-fc97-4b59-ab56-12637c684208-config-volume" (OuterVolumeSpecName: "config-volume") pod "79bb26ca-fc97-4b59-ab56-12637c684208" (UID: "79bb26ca-fc97-4b59-ab56-12637c684208"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.296447 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79bb26ca-fc97-4b59-ab56-12637c684208-kube-api-access-rzlnf" (OuterVolumeSpecName: "kube-api-access-rzlnf") pod "79bb26ca-fc97-4b59-ab56-12637c684208" (UID: "79bb26ca-fc97-4b59-ab56-12637c684208"). InnerVolumeSpecName "kube-api-access-rzlnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.317553 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f75bb" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.427195 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79bb26ca-fc97-4b59-ab56-12637c684208-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "79bb26ca-fc97-4b59-ab56-12637c684208" (UID: "79bb26ca-fc97-4b59-ab56-12637c684208"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.429310 4975 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/79bb26ca-fc97-4b59-ab56-12637c684208-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.429346 4975 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79bb26ca-fc97-4b59-ab56-12637c684208-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.429358 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzlnf\" (UniqueName: \"kubernetes.io/projected/79bb26ca-fc97-4b59-ab56-12637c684208-kube-api-access-rzlnf\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.530381 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dswgh"] Mar 18 12:14:03 crc kubenswrapper[4975]: E0318 12:14:03.530647 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79bb26ca-fc97-4b59-ab56-12637c684208" containerName="collect-profiles" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.530662 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="79bb26ca-fc97-4b59-ab56-12637c684208" containerName="collect-profiles" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.530799 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="79bb26ca-fc97-4b59-ab56-12637c684208" containerName="collect-profiles" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.531660 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dswgh" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.539388 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.558701 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dswgh"] Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.735958 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/698cd02e-0279-4ae7-be21-bd479b2dfe49-catalog-content\") pod \"redhat-operators-dswgh\" (UID: \"698cd02e-0279-4ae7-be21-bd479b2dfe49\") " pod="openshift-marketplace/redhat-operators-dswgh" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.736058 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/698cd02e-0279-4ae7-be21-bd479b2dfe49-utilities\") pod \"redhat-operators-dswgh\" (UID: \"698cd02e-0279-4ae7-be21-bd479b2dfe49\") " pod="openshift-marketplace/redhat-operators-dswgh" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.736160 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db4fl\" (UniqueName: \"kubernetes.io/projected/698cd02e-0279-4ae7-be21-bd479b2dfe49-kube-api-access-db4fl\") pod \"redhat-operators-dswgh\" (UID: \"698cd02e-0279-4ae7-be21-bd479b2dfe49\") " pod="openshift-marketplace/redhat-operators-dswgh" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.838343 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/698cd02e-0279-4ae7-be21-bd479b2dfe49-utilities\") pod \"redhat-operators-dswgh\" (UID: \"698cd02e-0279-4ae7-be21-bd479b2dfe49\") " 
pod="openshift-marketplace/redhat-operators-dswgh" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.838495 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db4fl\" (UniqueName: \"kubernetes.io/projected/698cd02e-0279-4ae7-be21-bd479b2dfe49-kube-api-access-db4fl\") pod \"redhat-operators-dswgh\" (UID: \"698cd02e-0279-4ae7-be21-bd479b2dfe49\") " pod="openshift-marketplace/redhat-operators-dswgh" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.838531 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/698cd02e-0279-4ae7-be21-bd479b2dfe49-catalog-content\") pod \"redhat-operators-dswgh\" (UID: \"698cd02e-0279-4ae7-be21-bd479b2dfe49\") " pod="openshift-marketplace/redhat-operators-dswgh" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.839296 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/698cd02e-0279-4ae7-be21-bd479b2dfe49-catalog-content\") pod \"redhat-operators-dswgh\" (UID: \"698cd02e-0279-4ae7-be21-bd479b2dfe49\") " pod="openshift-marketplace/redhat-operators-dswgh" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.839580 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/698cd02e-0279-4ae7-be21-bd479b2dfe49-utilities\") pod \"redhat-operators-dswgh\" (UID: \"698cd02e-0279-4ae7-be21-bd479b2dfe49\") " pod="openshift-marketplace/redhat-operators-dswgh" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.864407 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db4fl\" (UniqueName: \"kubernetes.io/projected/698cd02e-0279-4ae7-be21-bd479b2dfe49-kube-api-access-db4fl\") pod \"redhat-operators-dswgh\" (UID: \"698cd02e-0279-4ae7-be21-bd479b2dfe49\") " pod="openshift-marketplace/redhat-operators-dswgh" Mar 
18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.939912 4975 generic.go:334] "Generic (PLEG): container finished" podID="9002f360-1ea5-4b24-a49a-69f46a658936" containerID="af39051049ba1ad9f3198878892e841f84a0ac608bb1f5e2843077e815f928ef" exitCode=0 Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.940055 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94vnq" event={"ID":"9002f360-1ea5-4b24-a49a-69f46a658936","Type":"ContainerDied","Data":"af39051049ba1ad9f3198878892e841f84a0ac608bb1f5e2843077e815f928ef"} Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.949365 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lbw76"] Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.953160 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" event={"ID":"470110ba-97b8-4d8f-a8da-0df16cd7abed","Type":"ContainerStarted","Data":"2d2880fde3dee7514ded4039414b9897c12e6fe58953347d1f740dda0fadde26"} Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.953191 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" event={"ID":"470110ba-97b8-4d8f-a8da-0df16cd7abed","Type":"ContainerStarted","Data":"cdc38385728bbce1ca828ff7d51d70b3ec0e966bc0d25a7e54546ebe8fa46a84"} Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.953206 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.953216 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e0f75321-eeb6-4425-9bab-830f0cab1197","Type":"ContainerStarted","Data":"e5cc6816f42c50d1065c238ad7bfdda09a00641cee7c4508f6eeb5555bbbf48f"} Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.953304 4975 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lbw76" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.963653 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq" event={"ID":"f847fa98-6ca4-4087-aeb0-9d70fab215f0","Type":"ContainerStarted","Data":"a993ebb937fd3971825cf2f27423738e063975bc1dedc75f9aed7c8e3c46ac5f"} Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.963713 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq" event={"ID":"f847fa98-6ca4-4087-aeb0-9d70fab215f0","Type":"ContainerStarted","Data":"50faa0adf2a7e4c92f2e61a67a74ea0c4fcb9e52c98710cbd25f3fe273b6794f"} Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.963987 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lbw76"] Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.964404 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.982664 4975 generic.go:334] "Generic (PLEG): container finished" podID="21b9dc77-7653-4684-ba67-cece256c42e2" containerID="cbcc2ffbdc5b00929d64ffab41e848866ccf45f9c53a109590a05be3aacb052e" exitCode=0 Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.982756 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whr68" event={"ID":"21b9dc77-7653-4684-ba67-cece256c42e2","Type":"ContainerDied","Data":"cbcc2ffbdc5b00929d64ffab41e848866ccf45f9c53a109590a05be3aacb052e"} Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.982816 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whr68" 
event={"ID":"21b9dc77-7653-4684-ba67-cece256c42e2","Type":"ContainerStarted","Data":"c162369492277f2e0c64f8c6d6ef95fa0e4913e4286ac519e8b34dde287e44a0"} Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.984816 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.994154 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-p22bn" Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.994140 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563920-p22bn" event={"ID":"79bb26ca-fc97-4b59-ab56-12637c684208","Type":"ContainerDied","Data":"d25991203151fa711ede48fda6284c3fc920b7370dad7e942258d28cf002800a"} Mar 18 12:14:03 crc kubenswrapper[4975]: I0318 12:14:03.994340 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d25991203151fa711ede48fda6284c3fc920b7370dad7e942258d28cf002800a" Mar 18 12:14:04 crc kubenswrapper[4975]: I0318 12:14:04.023384 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"177a1e9c-68a3-4d50-b462-5d680696d8c7","Type":"ContainerStarted","Data":"ee3e139dfaa87b261f60bcb054e888deaadea2cf73e4fc5c1dfa0c2a19f9eed8"} Mar 18 12:14:04 crc kubenswrapper[4975]: I0318 12:14:04.029504 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=4.029480277 podStartE2EDuration="4.029480277s" podCreationTimestamp="2026-03-18 12:14:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:14:04.027807514 +0000 UTC m=+229.742208093" 
watchObservedRunningTime="2026-03-18 12:14:04.029480277 +0000 UTC m=+229.743880866" Mar 18 12:14:04 crc kubenswrapper[4975]: I0318 12:14:04.029889 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" podStartSLOduration=180.029883248 podStartE2EDuration="3m0.029883248s" podCreationTimestamp="2026-03-18 12:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:14:04.002724871 +0000 UTC m=+229.717125470" watchObservedRunningTime="2026-03-18 12:14:04.029883248 +0000 UTC m=+229.744283827" Mar 18 12:14:04 crc kubenswrapper[4975]: I0318 12:14:04.047102 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q6r4\" (UniqueName: \"kubernetes.io/projected/cedb84ff-ae71-4210-8f65-16441f4292ac-kube-api-access-7q6r4\") pod \"redhat-operators-lbw76\" (UID: \"cedb84ff-ae71-4210-8f65-16441f4292ac\") " pod="openshift-marketplace/redhat-operators-lbw76" Mar 18 12:14:04 crc kubenswrapper[4975]: I0318 12:14:04.047222 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cedb84ff-ae71-4210-8f65-16441f4292ac-catalog-content\") pod \"redhat-operators-lbw76\" (UID: \"cedb84ff-ae71-4210-8f65-16441f4292ac\") " pod="openshift-marketplace/redhat-operators-lbw76" Mar 18 12:14:04 crc kubenswrapper[4975]: I0318 12:14:04.047287 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cedb84ff-ae71-4210-8f65-16441f4292ac-utilities\") pod \"redhat-operators-lbw76\" (UID: \"cedb84ff-ae71-4210-8f65-16441f4292ac\") " pod="openshift-marketplace/redhat-operators-lbw76" Mar 18 12:14:04 crc kubenswrapper[4975]: I0318 12:14:04.055685 4975 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq" podStartSLOduration=5.055665219 podStartE2EDuration="5.055665219s" podCreationTimestamp="2026-03-18 12:13:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:14:04.051434859 +0000 UTC m=+229.765835438" watchObservedRunningTime="2026-03-18 12:14:04.055665219 +0000 UTC m=+229.770065798" Mar 18 12:14:04 crc kubenswrapper[4975]: I0318 12:14:04.111518 4975 patch_prober.go:28] interesting pod/router-default-5444994796-78ldn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:14:04 crc kubenswrapper[4975]: [-]has-synced failed: reason withheld Mar 18 12:14:04 crc kubenswrapper[4975]: [+]process-running ok Mar 18 12:14:04 crc kubenswrapper[4975]: healthz check failed Mar 18 12:14:04 crc kubenswrapper[4975]: I0318 12:14:04.111997 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-78ldn" podUID="f4e1534c-f64b-448a-9b81-7c4192c089f3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:14:04 crc kubenswrapper[4975]: I0318 12:14:04.159224 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dswgh" Mar 18 12:14:04 crc kubenswrapper[4975]: I0318 12:14:04.177725 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cedb84ff-ae71-4210-8f65-16441f4292ac-utilities\") pod \"redhat-operators-lbw76\" (UID: \"cedb84ff-ae71-4210-8f65-16441f4292ac\") " pod="openshift-marketplace/redhat-operators-lbw76" Mar 18 12:14:04 crc kubenswrapper[4975]: I0318 12:14:04.177889 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q6r4\" (UniqueName: \"kubernetes.io/projected/cedb84ff-ae71-4210-8f65-16441f4292ac-kube-api-access-7q6r4\") pod \"redhat-operators-lbw76\" (UID: \"cedb84ff-ae71-4210-8f65-16441f4292ac\") " pod="openshift-marketplace/redhat-operators-lbw76" Mar 18 12:14:04 crc kubenswrapper[4975]: I0318 12:14:04.177944 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cedb84ff-ae71-4210-8f65-16441f4292ac-catalog-content\") pod \"redhat-operators-lbw76\" (UID: \"cedb84ff-ae71-4210-8f65-16441f4292ac\") " pod="openshift-marketplace/redhat-operators-lbw76" Mar 18 12:14:04 crc kubenswrapper[4975]: I0318 12:14:04.178469 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cedb84ff-ae71-4210-8f65-16441f4292ac-catalog-content\") pod \"redhat-operators-lbw76\" (UID: \"cedb84ff-ae71-4210-8f65-16441f4292ac\") " pod="openshift-marketplace/redhat-operators-lbw76" Mar 18 12:14:04 crc kubenswrapper[4975]: I0318 12:14:04.179391 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cedb84ff-ae71-4210-8f65-16441f4292ac-utilities\") pod \"redhat-operators-lbw76\" (UID: \"cedb84ff-ae71-4210-8f65-16441f4292ac\") " 
pod="openshift-marketplace/redhat-operators-lbw76" Mar 18 12:14:04 crc kubenswrapper[4975]: I0318 12:14:04.188557 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=4.188533709 podStartE2EDuration="4.188533709s" podCreationTimestamp="2026-03-18 12:14:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:14:04.182672427 +0000 UTC m=+229.897073006" watchObservedRunningTime="2026-03-18 12:14:04.188533709 +0000 UTC m=+229.902934288" Mar 18 12:14:04 crc kubenswrapper[4975]: I0318 12:14:04.230507 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q6r4\" (UniqueName: \"kubernetes.io/projected/cedb84ff-ae71-4210-8f65-16441f4292ac-kube-api-access-7q6r4\") pod \"redhat-operators-lbw76\" (UID: \"cedb84ff-ae71-4210-8f65-16441f4292ac\") " pod="openshift-marketplace/redhat-operators-lbw76" Mar 18 12:14:04 crc kubenswrapper[4975]: I0318 12:14:04.234255 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f75bb"] Mar 18 12:14:04 crc kubenswrapper[4975]: I0318 12:14:04.318172 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lbw76" Mar 18 12:14:04 crc kubenswrapper[4975]: I0318 12:14:04.862369 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dswgh"] Mar 18 12:14:04 crc kubenswrapper[4975]: W0318 12:14:04.894595 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod698cd02e_0279_4ae7_be21_bd479b2dfe49.slice/crio-c37625ee2b3a332206d1d947dfd1158396412d520d3a9e7b4ed769228168e99a WatchSource:0}: Error finding container c37625ee2b3a332206d1d947dfd1158396412d520d3a9e7b4ed769228168e99a: Status 404 returned error can't find the container with id c37625ee2b3a332206d1d947dfd1158396412d520d3a9e7b4ed769228168e99a Mar 18 12:14:04 crc kubenswrapper[4975]: I0318 12:14:04.922762 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lbw76"] Mar 18 12:14:05 crc kubenswrapper[4975]: I0318 12:14:05.101745 4975 generic.go:334] "Generic (PLEG): container finished" podID="e0f75321-eeb6-4425-9bab-830f0cab1197" containerID="e5cc6816f42c50d1065c238ad7bfdda09a00641cee7c4508f6eeb5555bbbf48f" exitCode=0 Mar 18 12:14:05 crc kubenswrapper[4975]: I0318 12:14:05.101933 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e0f75321-eeb6-4425-9bab-830f0cab1197","Type":"ContainerDied","Data":"e5cc6816f42c50d1065c238ad7bfdda09a00641cee7c4508f6eeb5555bbbf48f"} Mar 18 12:14:05 crc kubenswrapper[4975]: I0318 12:14:05.116980 4975 patch_prober.go:28] interesting pod/router-default-5444994796-78ldn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:14:05 crc kubenswrapper[4975]: [-]has-synced failed: reason withheld Mar 18 12:14:05 crc kubenswrapper[4975]: [+]process-running ok Mar 18 
12:14:05 crc kubenswrapper[4975]: healthz check failed Mar 18 12:14:05 crc kubenswrapper[4975]: I0318 12:14:05.117047 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-78ldn" podUID="f4e1534c-f64b-448a-9b81-7c4192c089f3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:14:05 crc kubenswrapper[4975]: I0318 12:14:05.118598 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:14:05 crc kubenswrapper[4975]: I0318 12:14:05.126047 4975 generic.go:334] "Generic (PLEG): container finished" podID="46171d59-3549-4843-b6eb-07b9eecd2560" containerID="bb8e2327d4e9a55195ce477ceb4b071a32bcdbf68f7f687fa160837f468b5e0f" exitCode=0 Mar 18 12:14:05 crc kubenswrapper[4975]: I0318 12:14:05.126218 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f75bb" event={"ID":"46171d59-3549-4843-b6eb-07b9eecd2560","Type":"ContainerDied","Data":"bb8e2327d4e9a55195ce477ceb4b071a32bcdbf68f7f687fa160837f468b5e0f"} Mar 18 12:14:05 crc kubenswrapper[4975]: I0318 12:14:05.126266 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f75bb" event={"ID":"46171d59-3549-4843-b6eb-07b9eecd2560","Type":"ContainerStarted","Data":"4f447c6a072ea8793c553ead6eff49fcccb74d794cdc5d019bd35d7d6d90027c"} Mar 18 12:14:05 crc kubenswrapper[4975]: I0318 12:14:05.132190 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-m27h4" Mar 18 12:14:05 crc kubenswrapper[4975]: I0318 12:14:05.145979 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dswgh" event={"ID":"698cd02e-0279-4ae7-be21-bd479b2dfe49","Type":"ContainerStarted","Data":"c37625ee2b3a332206d1d947dfd1158396412d520d3a9e7b4ed769228168e99a"} Mar 18 12:14:05 crc 
kubenswrapper[4975]: I0318 12:14:05.171283 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lbw76" event={"ID":"cedb84ff-ae71-4210-8f65-16441f4292ac","Type":"ContainerStarted","Data":"ecc209509b0446a559b724aad0d902836c816113e2e6cdf3ccb6bec44fab0b54"} Mar 18 12:14:05 crc kubenswrapper[4975]: I0318 12:14:05.176750 4975 generic.go:334] "Generic (PLEG): container finished" podID="177a1e9c-68a3-4d50-b462-5d680696d8c7" containerID="ee3e139dfaa87b261f60bcb054e888deaadea2cf73e4fc5c1dfa0c2a19f9eed8" exitCode=0 Mar 18 12:14:05 crc kubenswrapper[4975]: I0318 12:14:05.176829 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"177a1e9c-68a3-4d50-b462-5d680696d8c7","Type":"ContainerDied","Data":"ee3e139dfaa87b261f60bcb054e888deaadea2cf73e4fc5c1dfa0c2a19f9eed8"} Mar 18 12:14:06 crc kubenswrapper[4975]: I0318 12:14:06.105082 4975 patch_prober.go:28] interesting pod/router-default-5444994796-78ldn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 12:14:06 crc kubenswrapper[4975]: [-]has-synced failed: reason withheld Mar 18 12:14:06 crc kubenswrapper[4975]: [+]process-running ok Mar 18 12:14:06 crc kubenswrapper[4975]: healthz check failed Mar 18 12:14:06 crc kubenswrapper[4975]: I0318 12:14:06.105151 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-78ldn" podUID="f4e1534c-f64b-448a-9b81-7c4192c089f3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 12:14:06 crc kubenswrapper[4975]: I0318 12:14:06.201939 4975 generic.go:334] "Generic (PLEG): container finished" podID="698cd02e-0279-4ae7-be21-bd479b2dfe49" containerID="66882b365489661784acf85a1b572a55c82cb741d29067dc33e6e382120722a8" exitCode=0 Mar 18 12:14:06 crc 
kubenswrapper[4975]: I0318 12:14:06.202334 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dswgh" event={"ID":"698cd02e-0279-4ae7-be21-bd479b2dfe49","Type":"ContainerDied","Data":"66882b365489661784acf85a1b572a55c82cb741d29067dc33e6e382120722a8"} Mar 18 12:14:06 crc kubenswrapper[4975]: I0318 12:14:06.209971 4975 generic.go:334] "Generic (PLEG): container finished" podID="cedb84ff-ae71-4210-8f65-16441f4292ac" containerID="272acefdc7ed866baa4e1e025f29c33c8afc340cf50b83c0af293d18751c83ec" exitCode=0 Mar 18 12:14:06 crc kubenswrapper[4975]: I0318 12:14:06.210781 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lbw76" event={"ID":"cedb84ff-ae71-4210-8f65-16441f4292ac","Type":"ContainerDied","Data":"272acefdc7ed866baa4e1e025f29c33c8afc340cf50b83c0af293d18751c83ec"} Mar 18 12:14:06 crc kubenswrapper[4975]: I0318 12:14:06.371322 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-nlw8k" Mar 18 12:14:06 crc kubenswrapper[4975]: I0318 12:14:06.655429 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:14:06 crc kubenswrapper[4975]: I0318 12:14:06.682584 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e0f75321-eeb6-4425-9bab-830f0cab1197-kubelet-dir\") pod \"e0f75321-eeb6-4425-9bab-830f0cab1197\" (UID: \"e0f75321-eeb6-4425-9bab-830f0cab1197\") " Mar 18 12:14:06 crc kubenswrapper[4975]: I0318 12:14:06.682640 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0f75321-eeb6-4425-9bab-830f0cab1197-kube-api-access\") pod \"e0f75321-eeb6-4425-9bab-830f0cab1197\" (UID: \"e0f75321-eeb6-4425-9bab-830f0cab1197\") " Mar 18 12:14:06 crc kubenswrapper[4975]: I0318 12:14:06.682923 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0f75321-eeb6-4425-9bab-830f0cab1197-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e0f75321-eeb6-4425-9bab-830f0cab1197" (UID: "e0f75321-eeb6-4425-9bab-830f0cab1197"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:14:06 crc kubenswrapper[4975]: I0318 12:14:06.683077 4975 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e0f75321-eeb6-4425-9bab-830f0cab1197-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:06 crc kubenswrapper[4975]: I0318 12:14:06.691277 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0f75321-eeb6-4425-9bab-830f0cab1197-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e0f75321-eeb6-4425-9bab-830f0cab1197" (UID: "e0f75321-eeb6-4425-9bab-830f0cab1197"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:14:06 crc kubenswrapper[4975]: I0318 12:14:06.794142 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0f75321-eeb6-4425-9bab-830f0cab1197-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:06 crc kubenswrapper[4975]: I0318 12:14:06.921144 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:14:06 crc kubenswrapper[4975]: I0318 12:14:06.996290 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/177a1e9c-68a3-4d50-b462-5d680696d8c7-kube-api-access\") pod \"177a1e9c-68a3-4d50-b462-5d680696d8c7\" (UID: \"177a1e9c-68a3-4d50-b462-5d680696d8c7\") " Mar 18 12:14:06 crc kubenswrapper[4975]: I0318 12:14:06.996400 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/177a1e9c-68a3-4d50-b462-5d680696d8c7-kubelet-dir\") pod \"177a1e9c-68a3-4d50-b462-5d680696d8c7\" (UID: \"177a1e9c-68a3-4d50-b462-5d680696d8c7\") " Mar 18 12:14:06 crc kubenswrapper[4975]: I0318 12:14:06.996739 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/177a1e9c-68a3-4d50-b462-5d680696d8c7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "177a1e9c-68a3-4d50-b462-5d680696d8c7" (UID: "177a1e9c-68a3-4d50-b462-5d680696d8c7"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:14:07 crc kubenswrapper[4975]: I0318 12:14:07.000032 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/177a1e9c-68a3-4d50-b462-5d680696d8c7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "177a1e9c-68a3-4d50-b462-5d680696d8c7" (UID: "177a1e9c-68a3-4d50-b462-5d680696d8c7"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:14:07 crc kubenswrapper[4975]: I0318 12:14:07.097899 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/177a1e9c-68a3-4d50-b462-5d680696d8c7-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:07 crc kubenswrapper[4975]: I0318 12:14:07.097929 4975 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/177a1e9c-68a3-4d50-b462-5d680696d8c7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:07 crc kubenswrapper[4975]: I0318 12:14:07.103823 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-78ldn" Mar 18 12:14:07 crc kubenswrapper[4975]: I0318 12:14:07.108033 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-78ldn" Mar 18 12:14:07 crc kubenswrapper[4975]: I0318 12:14:07.248442 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"177a1e9c-68a3-4d50-b462-5d680696d8c7","Type":"ContainerDied","Data":"c1c4c4dd6138ce1a8db9d768be8716110c27ea93e4752f074b5efce3e8e7aacb"} Mar 18 12:14:07 crc kubenswrapper[4975]: I0318 12:14:07.248491 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1c4c4dd6138ce1a8db9d768be8716110c27ea93e4752f074b5efce3e8e7aacb" Mar 18 12:14:07 crc 
kubenswrapper[4975]: I0318 12:14:07.248464 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 12:14:07 crc kubenswrapper[4975]: I0318 12:14:07.266418 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e0f75321-eeb6-4425-9bab-830f0cab1197","Type":"ContainerDied","Data":"df4aa2c0e09b47de6098024018a2107dd69a2262acb2741e90a8196f25b5f059"} Mar 18 12:14:07 crc kubenswrapper[4975]: I0318 12:14:07.266484 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df4aa2c0e09b47de6098024018a2107dd69a2262acb2741e90a8196f25b5f059" Mar 18 12:14:07 crc kubenswrapper[4975]: I0318 12:14:07.266441 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 12:14:08 crc kubenswrapper[4975]: I0318 12:14:08.137362 4975 ???:1] "http: TLS handshake error from 192.168.126.11:49846: no serving certificate available for the kubelet" Mar 18 12:14:08 crc kubenswrapper[4975]: I0318 12:14:08.216169 4975 ???:1] "http: TLS handshake error from 192.168.126.11:49854: no serving certificate available for the kubelet" Mar 18 12:14:10 crc kubenswrapper[4975]: I0318 12:14:10.029470 4975 patch_prober.go:28] interesting pod/console-f9d7485db-z69nv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Mar 18 12:14:10 crc kubenswrapper[4975]: I0318 12:14:10.030236 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-z69nv" podUID="311fa18b-fde1-4390-9682-75c836813f88" containerName="console" probeResult="failure" output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" Mar 18 12:14:10 crc 
kubenswrapper[4975]: I0318 12:14:10.126840 4975 patch_prober.go:28] interesting pod/downloads-7954f5f757-d4zht container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 18 12:14:10 crc kubenswrapper[4975]: I0318 12:14:10.126859 4975 patch_prober.go:28] interesting pod/downloads-7954f5f757-d4zht container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 18 12:14:10 crc kubenswrapper[4975]: I0318 12:14:10.126928 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d4zht" podUID="36e42dcf-4953-46b4-8459-e2e72e03895c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 18 12:14:10 crc kubenswrapper[4975]: I0318 12:14:10.126934 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-d4zht" podUID="36e42dcf-4953-46b4-8459-e2e72e03895c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 18 12:14:16 crc kubenswrapper[4975]: I0318 12:14:16.356925 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs\") pod \"network-metrics-daemon-587nk\" (UID: \"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\") " pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:14:16 crc kubenswrapper[4975]: I0318 12:14:16.360009 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 18 12:14:16 crc kubenswrapper[4975]: I0318 
12:14:16.381328 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b-metrics-certs\") pod \"network-metrics-daemon-587nk\" (UID: \"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b\") " pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:14:16 crc kubenswrapper[4975]: I0318 12:14:16.430821 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 18 12:14:16 crc kubenswrapper[4975]: I0318 12:14:16.439044 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-587nk" Mar 18 12:14:18 crc kubenswrapper[4975]: I0318 12:14:18.172422 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b6c454647-vr5vq"] Mar 18 12:14:18 crc kubenswrapper[4975]: I0318 12:14:18.172963 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq" podUID="f847fa98-6ca4-4087-aeb0-9d70fab215f0" containerName="controller-manager" containerID="cri-o://a993ebb937fd3971825cf2f27423738e063975bc1dedc75f9aed7c8e3c46ac5f" gracePeriod=30 Mar 18 12:14:18 crc kubenswrapper[4975]: I0318 12:14:18.189189 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps"] Mar 18 12:14:18 crc kubenswrapper[4975]: I0318 12:14:18.189397 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps" podUID="d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9" containerName="route-controller-manager" containerID="cri-o://75463ea76d7799d31baf3a374fa726ce8ad6a23bbd621fb5d33f307c926355a6" gracePeriod=30 Mar 18 12:14:18 crc kubenswrapper[4975]: I0318 12:14:18.403078 4975 ???:1] "http: TLS handshake 
error from 192.168.126.11:54156: no serving certificate available for the kubelet" Mar 18 12:14:19 crc kubenswrapper[4975]: I0318 12:14:19.619003 4975 generic.go:334] "Generic (PLEG): container finished" podID="d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9" containerID="75463ea76d7799d31baf3a374fa726ce8ad6a23bbd621fb5d33f307c926355a6" exitCode=0 Mar 18 12:14:19 crc kubenswrapper[4975]: I0318 12:14:19.619377 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps" event={"ID":"d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9","Type":"ContainerDied","Data":"75463ea76d7799d31baf3a374fa726ce8ad6a23bbd621fb5d33f307c926355a6"} Mar 18 12:14:19 crc kubenswrapper[4975]: I0318 12:14:19.621338 4975 generic.go:334] "Generic (PLEG): container finished" podID="f847fa98-6ca4-4087-aeb0-9d70fab215f0" containerID="a993ebb937fd3971825cf2f27423738e063975bc1dedc75f9aed7c8e3c46ac5f" exitCode=0 Mar 18 12:14:19 crc kubenswrapper[4975]: I0318 12:14:19.621398 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq" event={"ID":"f847fa98-6ca4-4087-aeb0-9d70fab215f0","Type":"ContainerDied","Data":"a993ebb937fd3971825cf2f27423738e063975bc1dedc75f9aed7c8e3c46ac5f"} Mar 18 12:14:20 crc kubenswrapper[4975]: I0318 12:14:20.032430 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:14:20 crc kubenswrapper[4975]: I0318 12:14:20.036585 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:14:20 crc kubenswrapper[4975]: I0318 12:14:20.126683 4975 patch_prober.go:28] interesting pod/downloads-7954f5f757-d4zht container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 18 12:14:20 
crc kubenswrapper[4975]: I0318 12:14:20.126692 4975 patch_prober.go:28] interesting pod/downloads-7954f5f757-d4zht container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 18 12:14:20 crc kubenswrapper[4975]: I0318 12:14:20.126725 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-d4zht" podUID="36e42dcf-4953-46b4-8459-e2e72e03895c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 18 12:14:20 crc kubenswrapper[4975]: I0318 12:14:20.126740 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d4zht" podUID="36e42dcf-4953-46b4-8459-e2e72e03895c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 18 12:14:20 crc kubenswrapper[4975]: I0318 12:14:20.126761 4975 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-d4zht" Mar 18 12:14:20 crc kubenswrapper[4975]: I0318 12:14:20.127428 4975 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"398546115ff367773c0f5d3553ce683ad2bb9966a9f2ff100021276bf6edf54e"} pod="openshift-console/downloads-7954f5f757-d4zht" containerMessage="Container download-server failed liveness probe, will be restarted" Mar 18 12:14:20 crc kubenswrapper[4975]: I0318 12:14:20.127465 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-d4zht" podUID="36e42dcf-4953-46b4-8459-e2e72e03895c" containerName="download-server" containerID="cri-o://398546115ff367773c0f5d3553ce683ad2bb9966a9f2ff100021276bf6edf54e" 
gracePeriod=2 Mar 18 12:14:20 crc kubenswrapper[4975]: I0318 12:14:20.127564 4975 patch_prober.go:28] interesting pod/downloads-7954f5f757-d4zht container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 18 12:14:20 crc kubenswrapper[4975]: I0318 12:14:20.127588 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d4zht" podUID="36e42dcf-4953-46b4-8459-e2e72e03895c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 18 12:14:20 crc kubenswrapper[4975]: I0318 12:14:20.191787 4975 patch_prober.go:28] interesting pod/route-controller-manager-df6f7fc74-72mps container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Mar 18 12:14:20 crc kubenswrapper[4975]: I0318 12:14:20.191850 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps" podUID="d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 18 12:14:21 crc kubenswrapper[4975]: I0318 12:14:21.633903 4975 generic.go:334] "Generic (PLEG): container finished" podID="36e42dcf-4953-46b4-8459-e2e72e03895c" containerID="398546115ff367773c0f5d3553ce683ad2bb9966a9f2ff100021276bf6edf54e" exitCode=0 Mar 18 12:14:21 crc kubenswrapper[4975]: I0318 12:14:21.633948 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-d4zht" 
event={"ID":"36e42dcf-4953-46b4-8459-e2e72e03895c","Type":"ContainerDied","Data":"398546115ff367773c0f5d3553ce683ad2bb9966a9f2ff100021276bf6edf54e"} Mar 18 12:14:21 crc kubenswrapper[4975]: I0318 12:14:21.747030 4975 patch_prober.go:28] interesting pod/controller-manager-7b6c454647-vr5vq container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body= Mar 18 12:14:21 crc kubenswrapper[4975]: I0318 12:14:21.747077 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq" podUID="f847fa98-6ca4-4087-aeb0-9d70fab215f0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" Mar 18 12:14:21 crc kubenswrapper[4975]: I0318 12:14:21.928572 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:14:25 crc kubenswrapper[4975]: I0318 12:14:25.539084 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:14:25 crc kubenswrapper[4975]: I0318 12:14:25.539148 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:14:29 crc kubenswrapper[4975]: E0318 12:14:29.156365 4975 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 18 12:14:29 crc kubenswrapper[4975]: E0318 12:14:29.156971 4975 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:14:29 crc kubenswrapper[4975]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 18 12:14:29 crc kubenswrapper[4975]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-glfp7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29563934-6j5kv_openshift-infra(9d72ac7c-4ce8-4d23-a845-d359bca0544a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 18 12:14:29 crc kubenswrapper[4975]: > logger="UnhandledError" Mar 18 12:14:29 crc kubenswrapper[4975]: E0318 12:14:29.158144 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29563934-6j5kv" 
podUID="9d72ac7c-4ce8-4d23-a845-d359bca0544a" Mar 18 12:14:29 crc kubenswrapper[4975]: E0318 12:14:29.166795 4975 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 18 12:14:29 crc kubenswrapper[4975]: E0318 12:14:29.166990 4975 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:14:29 crc kubenswrapper[4975]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 18 12:14:29 crc kubenswrapper[4975]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r9hzv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29563932-9m82r_openshift-infra(bb5204e4-c110-485b-8627-807fdb7f4c27): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 18 12:14:29 crc kubenswrapper[4975]: > logger="UnhandledError" Mar 18 12:14:29 crc kubenswrapper[4975]: E0318 12:14:29.168332 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29563932-9m82r" podUID="bb5204e4-c110-485b-8627-807fdb7f4c27" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.177239 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.184123 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.204702 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh"] Mar 18 12:14:29 crc kubenswrapper[4975]: E0318 12:14:29.205177 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f847fa98-6ca4-4087-aeb0-9d70fab215f0" containerName="controller-manager" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.205189 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f847fa98-6ca4-4087-aeb0-9d70fab215f0" containerName="controller-manager" Mar 18 12:14:29 crc kubenswrapper[4975]: E0318 12:14:29.205198 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="177a1e9c-68a3-4d50-b462-5d680696d8c7" containerName="pruner" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.205204 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="177a1e9c-68a3-4d50-b462-5d680696d8c7" containerName="pruner" Mar 18 12:14:29 crc kubenswrapper[4975]: E0318 12:14:29.205213 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9" containerName="route-controller-manager" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.205220 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9" 
containerName="route-controller-manager" Mar 18 12:14:29 crc kubenswrapper[4975]: E0318 12:14:29.205234 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0f75321-eeb6-4425-9bab-830f0cab1197" containerName="pruner" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.205241 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0f75321-eeb6-4425-9bab-830f0cab1197" containerName="pruner" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.205329 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="f847fa98-6ca4-4087-aeb0-9d70fab215f0" containerName="controller-manager" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.205341 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9" containerName="route-controller-manager" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.205351 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="177a1e9c-68a3-4d50-b462-5d680696d8c7" containerName="pruner" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.205359 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0f75321-eeb6-4425-9bab-830f0cab1197" containerName="pruner" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.205674 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.214002 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh"] Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.333393 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9-config\") pod \"d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9\" (UID: \"d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9\") " Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.333446 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9-serving-cert\") pod \"d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9\" (UID: \"d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9\") " Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.333468 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f847fa98-6ca4-4087-aeb0-9d70fab215f0-proxy-ca-bundles\") pod \"f847fa98-6ca4-4087-aeb0-9d70fab215f0\" (UID: \"f847fa98-6ca4-4087-aeb0-9d70fab215f0\") " Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.333506 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f847fa98-6ca4-4087-aeb0-9d70fab215f0-config\") pod \"f847fa98-6ca4-4087-aeb0-9d70fab215f0\" (UID: \"f847fa98-6ca4-4087-aeb0-9d70fab215f0\") " Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.333533 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v4kn\" (UniqueName: \"kubernetes.io/projected/f847fa98-6ca4-4087-aeb0-9d70fab215f0-kube-api-access-5v4kn\") pod 
\"f847fa98-6ca4-4087-aeb0-9d70fab215f0\" (UID: \"f847fa98-6ca4-4087-aeb0-9d70fab215f0\") " Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.333579 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f847fa98-6ca4-4087-aeb0-9d70fab215f0-serving-cert\") pod \"f847fa98-6ca4-4087-aeb0-9d70fab215f0\" (UID: \"f847fa98-6ca4-4087-aeb0-9d70fab215f0\") " Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.333640 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f847fa98-6ca4-4087-aeb0-9d70fab215f0-client-ca\") pod \"f847fa98-6ca4-4087-aeb0-9d70fab215f0\" (UID: \"f847fa98-6ca4-4087-aeb0-9d70fab215f0\") " Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.333656 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mkbv\" (UniqueName: \"kubernetes.io/projected/d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9-kube-api-access-7mkbv\") pod \"d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9\" (UID: \"d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9\") " Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.333684 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9-client-ca\") pod \"d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9\" (UID: \"d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9\") " Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.333799 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76mbc\" (UniqueName: \"kubernetes.io/projected/98f45bcd-0bac-4511-a12d-edbca0a9881a-kube-api-access-76mbc\") pod \"route-controller-manager-647ff7cbff-swxjh\" (UID: \"98f45bcd-0bac-4511-a12d-edbca0a9881a\") " pod="openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh" Mar 18 12:14:29 crc 
kubenswrapper[4975]: I0318 12:14:29.333823 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98f45bcd-0bac-4511-a12d-edbca0a9881a-serving-cert\") pod \"route-controller-manager-647ff7cbff-swxjh\" (UID: \"98f45bcd-0bac-4511-a12d-edbca0a9881a\") " pod="openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.333959 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98f45bcd-0bac-4511-a12d-edbca0a9881a-client-ca\") pod \"route-controller-manager-647ff7cbff-swxjh\" (UID: \"98f45bcd-0bac-4511-a12d-edbca0a9881a\") " pod="openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.333980 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98f45bcd-0bac-4511-a12d-edbca0a9881a-config\") pod \"route-controller-manager-647ff7cbff-swxjh\" (UID: \"98f45bcd-0bac-4511-a12d-edbca0a9881a\") " pod="openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.335357 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f847fa98-6ca4-4087-aeb0-9d70fab215f0-client-ca" (OuterVolumeSpecName: "client-ca") pod "f847fa98-6ca4-4087-aeb0-9d70fab215f0" (UID: "f847fa98-6ca4-4087-aeb0-9d70fab215f0"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.335429 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9" (UID: "d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.335560 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f847fa98-6ca4-4087-aeb0-9d70fab215f0-config" (OuterVolumeSpecName: "config") pod "f847fa98-6ca4-4087-aeb0-9d70fab215f0" (UID: "f847fa98-6ca4-4087-aeb0-9d70fab215f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.335437 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9-config" (OuterVolumeSpecName: "config") pod "d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9" (UID: "d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.335711 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f847fa98-6ca4-4087-aeb0-9d70fab215f0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f847fa98-6ca4-4087-aeb0-9d70fab215f0" (UID: "f847fa98-6ca4-4087-aeb0-9d70fab215f0"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.341408 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9" (UID: "d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.341459 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9-kube-api-access-7mkbv" (OuterVolumeSpecName: "kube-api-access-7mkbv") pod "d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9" (UID: "d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9"). InnerVolumeSpecName "kube-api-access-7mkbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.341532 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f847fa98-6ca4-4087-aeb0-9d70fab215f0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f847fa98-6ca4-4087-aeb0-9d70fab215f0" (UID: "f847fa98-6ca4-4087-aeb0-9d70fab215f0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.342732 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f847fa98-6ca4-4087-aeb0-9d70fab215f0-kube-api-access-5v4kn" (OuterVolumeSpecName: "kube-api-access-5v4kn") pod "f847fa98-6ca4-4087-aeb0-9d70fab215f0" (UID: "f847fa98-6ca4-4087-aeb0-9d70fab215f0"). InnerVolumeSpecName "kube-api-access-5v4kn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.435175 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98f45bcd-0bac-4511-a12d-edbca0a9881a-client-ca\") pod \"route-controller-manager-647ff7cbff-swxjh\" (UID: \"98f45bcd-0bac-4511-a12d-edbca0a9881a\") " pod="openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.435233 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98f45bcd-0bac-4511-a12d-edbca0a9881a-config\") pod \"route-controller-manager-647ff7cbff-swxjh\" (UID: \"98f45bcd-0bac-4511-a12d-edbca0a9881a\") " pod="openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.435275 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76mbc\" (UniqueName: \"kubernetes.io/projected/98f45bcd-0bac-4511-a12d-edbca0a9881a-kube-api-access-76mbc\") pod \"route-controller-manager-647ff7cbff-swxjh\" (UID: \"98f45bcd-0bac-4511-a12d-edbca0a9881a\") " pod="openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.435305 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98f45bcd-0bac-4511-a12d-edbca0a9881a-serving-cert\") pod \"route-controller-manager-647ff7cbff-swxjh\" (UID: \"98f45bcd-0bac-4511-a12d-edbca0a9881a\") " pod="openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.435538 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mkbv\" (UniqueName: 
\"kubernetes.io/projected/d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9-kube-api-access-7mkbv\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.435653 4975 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f847fa98-6ca4-4087-aeb0-9d70fab215f0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.435693 4975 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.435707 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.435718 4975 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.435731 4975 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f847fa98-6ca4-4087-aeb0-9d70fab215f0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.435745 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f847fa98-6ca4-4087-aeb0-9d70fab215f0-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.435757 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5v4kn\" (UniqueName: \"kubernetes.io/projected/f847fa98-6ca4-4087-aeb0-9d70fab215f0-kube-api-access-5v4kn\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:29 crc kubenswrapper[4975]: 
I0318 12:14:29.435768 4975 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f847fa98-6ca4-4087-aeb0-9d70fab215f0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.436206 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98f45bcd-0bac-4511-a12d-edbca0a9881a-client-ca\") pod \"route-controller-manager-647ff7cbff-swxjh\" (UID: \"98f45bcd-0bac-4511-a12d-edbca0a9881a\") " pod="openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.436592 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98f45bcd-0bac-4511-a12d-edbca0a9881a-config\") pod \"route-controller-manager-647ff7cbff-swxjh\" (UID: \"98f45bcd-0bac-4511-a12d-edbca0a9881a\") " pod="openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.442076 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98f45bcd-0bac-4511-a12d-edbca0a9881a-serving-cert\") pod \"route-controller-manager-647ff7cbff-swxjh\" (UID: \"98f45bcd-0bac-4511-a12d-edbca0a9881a\") " pod="openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.453401 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76mbc\" (UniqueName: \"kubernetes.io/projected/98f45bcd-0bac-4511-a12d-edbca0a9881a-kube-api-access-76mbc\") pod \"route-controller-manager-647ff7cbff-swxjh\" (UID: \"98f45bcd-0bac-4511-a12d-edbca0a9881a\") " pod="openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.533311 4975 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.680656 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq" event={"ID":"f847fa98-6ca4-4087-aeb0-9d70fab215f0","Type":"ContainerDied","Data":"50faa0adf2a7e4c92f2e61a67a74ea0c4fcb9e52c98710cbd25f3fe273b6794f"} Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.680681 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b6c454647-vr5vq" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.680710 4975 scope.go:117] "RemoveContainer" containerID="a993ebb937fd3971825cf2f27423738e063975bc1dedc75f9aed7c8e3c46ac5f" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.690070 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.690059 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps" event={"ID":"d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9","Type":"ContainerDied","Data":"fcb99da4df98066051a5f9a808f61feab55d5deddcc012a75ba3288c71ca5ea7"} Mar 18 12:14:29 crc kubenswrapper[4975]: E0318 12:14:29.691519 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29563932-9m82r" podUID="bb5204e4-c110-485b-8627-807fdb7f4c27" Mar 18 12:14:29 crc kubenswrapper[4975]: E0318 12:14:29.691609 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29563934-6j5kv" podUID="9d72ac7c-4ce8-4d23-a845-d359bca0544a" Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.733353 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b6c454647-vr5vq"] Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.739676 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7b6c454647-vr5vq"] Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.742316 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps"] Mar 18 12:14:29 crc kubenswrapper[4975]: I0318 12:14:29.744188 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-df6f7fc74-72mps"] Mar 18 12:14:30 crc 
kubenswrapper[4975]: I0318 12:14:30.128103 4975 patch_prober.go:28] interesting pod/downloads-7954f5f757-d4zht container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 18 12:14:30 crc kubenswrapper[4975]: I0318 12:14:30.128169 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d4zht" podUID="36e42dcf-4953-46b4-8459-e2e72e03895c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.029242 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9" path="/var/lib/kubelet/pods/d2da51a0-f3f7-4bad-a51c-c45e69dd7dc9/volumes" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.029846 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f847fa98-6ca4-4087-aeb0-9d70fab215f0" path="/var/lib/kubelet/pods/f847fa98-6ca4-4087-aeb0-9d70fab215f0/volumes" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.231847 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6sclc" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.335959 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-644bdff7d5-x4mqz"] Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.336754 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.346996 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-644bdff7d5-x4mqz"] Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.347679 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.348755 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.349460 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.350352 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.359057 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.359950 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.366285 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.492881 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-config\") pod \"controller-manager-644bdff7d5-x4mqz\" (UID: \"3fc43c12-6dcc-4aef-81d0-8ab230ec77da\") " 
pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.492943 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-proxy-ca-bundles\") pod \"controller-manager-644bdff7d5-x4mqz\" (UID: \"3fc43c12-6dcc-4aef-81d0-8ab230ec77da\") " pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.492988 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4npsx\" (UniqueName: \"kubernetes.io/projected/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-kube-api-access-4npsx\") pod \"controller-manager-644bdff7d5-x4mqz\" (UID: \"3fc43c12-6dcc-4aef-81d0-8ab230ec77da\") " pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.493066 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-serving-cert\") pod \"controller-manager-644bdff7d5-x4mqz\" (UID: \"3fc43c12-6dcc-4aef-81d0-8ab230ec77da\") " pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.493129 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-client-ca\") pod \"controller-manager-644bdff7d5-x4mqz\" (UID: \"3fc43c12-6dcc-4aef-81d0-8ab230ec77da\") " pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.593784 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-config\") pod \"controller-manager-644bdff7d5-x4mqz\" (UID: \"3fc43c12-6dcc-4aef-81d0-8ab230ec77da\") " pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.593834 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-proxy-ca-bundles\") pod \"controller-manager-644bdff7d5-x4mqz\" (UID: \"3fc43c12-6dcc-4aef-81d0-8ab230ec77da\") " pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.593853 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4npsx\" (UniqueName: \"kubernetes.io/projected/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-kube-api-access-4npsx\") pod \"controller-manager-644bdff7d5-x4mqz\" (UID: \"3fc43c12-6dcc-4aef-81d0-8ab230ec77da\") " pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.593898 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-serving-cert\") pod \"controller-manager-644bdff7d5-x4mqz\" (UID: \"3fc43c12-6dcc-4aef-81d0-8ab230ec77da\") " pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.593939 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-client-ca\") pod \"controller-manager-644bdff7d5-x4mqz\" (UID: \"3fc43c12-6dcc-4aef-81d0-8ab230ec77da\") " pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 
12:14:31.595117 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-proxy-ca-bundles\") pod \"controller-manager-644bdff7d5-x4mqz\" (UID: \"3fc43c12-6dcc-4aef-81d0-8ab230ec77da\") " pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.595296 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-config\") pod \"controller-manager-644bdff7d5-x4mqz\" (UID: \"3fc43c12-6dcc-4aef-81d0-8ab230ec77da\") " pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.595356 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-client-ca\") pod \"controller-manager-644bdff7d5-x4mqz\" (UID: \"3fc43c12-6dcc-4aef-81d0-8ab230ec77da\") " pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.604553 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-serving-cert\") pod \"controller-manager-644bdff7d5-x4mqz\" (UID: \"3fc43c12-6dcc-4aef-81d0-8ab230ec77da\") " pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.613465 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4npsx\" (UniqueName: \"kubernetes.io/projected/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-kube-api-access-4npsx\") pod \"controller-manager-644bdff7d5-x4mqz\" (UID: \"3fc43c12-6dcc-4aef-81d0-8ab230ec77da\") " 
pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" Mar 18 12:14:31 crc kubenswrapper[4975]: I0318 12:14:31.673429 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" Mar 18 12:14:32 crc kubenswrapper[4975]: I0318 12:14:32.959779 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-587nk"] Mar 18 12:14:36 crc kubenswrapper[4975]: I0318 12:14:36.314794 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 12:14:36 crc kubenswrapper[4975]: I0318 12:14:36.316016 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:14:36 crc kubenswrapper[4975]: I0318 12:14:36.323383 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 12:14:36 crc kubenswrapper[4975]: I0318 12:14:36.330097 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 12:14:36 crc kubenswrapper[4975]: I0318 12:14:36.331960 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 12:14:36 crc kubenswrapper[4975]: I0318 12:14:36.378155 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3195da11-9671-43d2-9504-e2dc68c2169c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3195da11-9671-43d2-9504-e2dc68c2169c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:14:36 crc kubenswrapper[4975]: I0318 12:14:36.378205 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/3195da11-9671-43d2-9504-e2dc68c2169c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3195da11-9671-43d2-9504-e2dc68c2169c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:14:36 crc kubenswrapper[4975]: I0318 12:14:36.480421 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3195da11-9671-43d2-9504-e2dc68c2169c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3195da11-9671-43d2-9504-e2dc68c2169c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:14:36 crc kubenswrapper[4975]: I0318 12:14:36.480483 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3195da11-9671-43d2-9504-e2dc68c2169c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3195da11-9671-43d2-9504-e2dc68c2169c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:14:36 crc kubenswrapper[4975]: I0318 12:14:36.480566 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3195da11-9671-43d2-9504-e2dc68c2169c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3195da11-9671-43d2-9504-e2dc68c2169c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:14:36 crc kubenswrapper[4975]: I0318 12:14:36.504749 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3195da11-9671-43d2-9504-e2dc68c2169c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3195da11-9671-43d2-9504-e2dc68c2169c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:14:36 crc kubenswrapper[4975]: I0318 12:14:36.649202 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:14:38 crc kubenswrapper[4975]: I0318 12:14:38.172223 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-644bdff7d5-x4mqz"] Mar 18 12:14:38 crc kubenswrapper[4975]: I0318 12:14:38.272958 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh"] Mar 18 12:14:39 crc kubenswrapper[4975]: E0318 12:14:39.499213 4975 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 18 12:14:39 crc kubenswrapper[4975]: E0318 12:14:39.499623 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qr6ms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-94vnq_openshift-marketplace(9002f360-1ea5-4b24-a49a-69f46a658936): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 12:14:39 crc kubenswrapper[4975]: E0318 12:14:39.500894 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-94vnq" podUID="9002f360-1ea5-4b24-a49a-69f46a658936" Mar 18 12:14:40 crc 
kubenswrapper[4975]: I0318 12:14:40.127030 4975 patch_prober.go:28] interesting pod/downloads-7954f5f757-d4zht container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 18 12:14:40 crc kubenswrapper[4975]: I0318 12:14:40.127086 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d4zht" podUID="36e42dcf-4953-46b4-8459-e2e72e03895c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 18 12:14:40 crc kubenswrapper[4975]: E0318 12:14:40.876021 4975 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 18 12:14:40 crc kubenswrapper[4975]: E0318 12:14:40.876181 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bk82v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-hprxg_openshift-marketplace(a7a76930-86ba-4055-85e0-6053832da1aa): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 12:14:40 crc kubenswrapper[4975]: E0318 12:14:40.877582 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-hprxg" podUID="a7a76930-86ba-4055-85e0-6053832da1aa" Mar 18 12:14:42 crc 
kubenswrapper[4975]: I0318 12:14:42.119469 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.120314 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.125841 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.159513 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/975a4029-93d2-4e67-ab56-85d09a7af50b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"975a4029-93d2-4e67-ab56-85d09a7af50b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.159630 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/975a4029-93d2-4e67-ab56-85d09a7af50b-kube-api-access\") pod \"installer-9-crc\" (UID: \"975a4029-93d2-4e67-ab56-85d09a7af50b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.159673 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/975a4029-93d2-4e67-ab56-85d09a7af50b-var-lock\") pod \"installer-9-crc\" (UID: \"975a4029-93d2-4e67-ab56-85d09a7af50b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:14:42 crc kubenswrapper[4975]: W0318 12:14:42.250099 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6fd4bee_3c4b_4df5_a0c4_1f0cf464173b.slice/crio-2e7bb611548fc0a811486feeeb09b92fc7b39e4a2ab4e74541aa6bf66cc4ac90 WatchSource:0}: 
Error finding container 2e7bb611548fc0a811486feeeb09b92fc7b39e4a2ab4e74541aa6bf66cc4ac90: Status 404 returned error can't find the container with id 2e7bb611548fc0a811486feeeb09b92fc7b39e4a2ab4e74541aa6bf66cc4ac90 Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.253376 4975 scope.go:117] "RemoveContainer" containerID="75463ea76d7799d31baf3a374fa726ce8ad6a23bbd621fb5d33f307c926355a6" Mar 18 12:14:42 crc kubenswrapper[4975]: E0318 12:14:42.253665 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-94vnq" podUID="9002f360-1ea5-4b24-a49a-69f46a658936" Mar 18 12:14:42 crc kubenswrapper[4975]: E0318 12:14:42.253719 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-hprxg" podUID="a7a76930-86ba-4055-85e0-6053832da1aa" Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.261671 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/975a4029-93d2-4e67-ab56-85d09a7af50b-var-lock\") pod \"installer-9-crc\" (UID: \"975a4029-93d2-4e67-ab56-85d09a7af50b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.261785 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/975a4029-93d2-4e67-ab56-85d09a7af50b-var-lock\") pod \"installer-9-crc\" (UID: \"975a4029-93d2-4e67-ab56-85d09a7af50b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.261822 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/975a4029-93d2-4e67-ab56-85d09a7af50b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"975a4029-93d2-4e67-ab56-85d09a7af50b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.261898 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/975a4029-93d2-4e67-ab56-85d09a7af50b-kube-api-access\") pod \"installer-9-crc\" (UID: \"975a4029-93d2-4e67-ab56-85d09a7af50b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.261914 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/975a4029-93d2-4e67-ab56-85d09a7af50b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"975a4029-93d2-4e67-ab56-85d09a7af50b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.284637 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/975a4029-93d2-4e67-ab56-85d09a7af50b-kube-api-access\") pod \"installer-9-crc\" (UID: \"975a4029-93d2-4e67-ab56-85d09a7af50b\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:14:42 crc kubenswrapper[4975]: E0318 12:14:42.295778 4975 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 18 12:14:42 crc kubenswrapper[4975]: E0318 12:14:42.295957 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs 
--catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-66pqw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-f75bb_openshift-marketplace(46171d59-3549-4843-b6eb-07b9eecd2560): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 12:14:42 crc kubenswrapper[4975]: E0318 12:14:42.297125 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-f75bb" 
podUID="46171d59-3549-4843-b6eb-07b9eecd2560" Mar 18 12:14:42 crc kubenswrapper[4975]: E0318 12:14:42.318081 4975 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 18 12:14:42 crc kubenswrapper[4975]: E0318 12:14:42.318326 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8hpnd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPol
icy:nil,} start failed in pod redhat-marketplace-whr68_openshift-marketplace(21b9dc77-7653-4684-ba67-cece256c42e2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 12:14:42 crc kubenswrapper[4975]: E0318 12:14:42.320817 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-whr68" podUID="21b9dc77-7653-4684-ba67-cece256c42e2" Mar 18 12:14:42 crc kubenswrapper[4975]: E0318 12:14:42.343127 4975 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 18 12:14:42 crc kubenswrapper[4975]: E0318 12:14:42.343345 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wlsx7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-c5xm6_openshift-marketplace(94a0b37e-4423-421c-910e-658cb59e08c8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 12:14:42 crc kubenswrapper[4975]: E0318 12:14:42.345342 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-c5xm6" podUID="94a0b37e-4423-421c-910e-658cb59e08c8" Mar 18 12:14:42 crc 
kubenswrapper[4975]: E0318 12:14:42.355322 4975 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 18 12:14:42 crc kubenswrapper[4975]: E0318 12:14:42.355480 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6sk2c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-xnfcw_openshift-marketplace(3b24c4ea-1b55-429c-97f5-376523ea1a52): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 18 12:14:42 crc kubenswrapper[4975]: E0318 12:14:42.356659 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-xnfcw" podUID="3b24c4ea-1b55-429c-97f5-376523ea1a52" Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.440037 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.592244 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-644bdff7d5-x4mqz"] Mar 18 12:14:42 crc kubenswrapper[4975]: W0318 12:14:42.601378 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fc43c12_6dcc_4aef_81d0_8ab230ec77da.slice/crio-2232b19c1e35346ec160697ba26ba535fefe657ae2bd7f3ebbf8c7082b48b550 WatchSource:0}: Error finding container 2232b19c1e35346ec160697ba26ba535fefe657ae2bd7f3ebbf8c7082b48b550: Status 404 returned error can't find the container with id 2232b19c1e35346ec160697ba26ba535fefe657ae2bd7f3ebbf8c7082b48b550 Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.677479 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 18 12:14:42 crc kubenswrapper[4975]: W0318 12:14:42.695215 4975 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-pod3195da11_9671_43d2_9504_e2dc68c2169c.slice/crio-e473c392b676ccb97d88ebdea27e40b39e85271cef26e8cebff9e479573346af WatchSource:0}: Error finding container e473c392b676ccb97d88ebdea27e40b39e85271cef26e8cebff9e479573346af: Status 404 returned error can't find the container with id e473c392b676ccb97d88ebdea27e40b39e85271cef26e8cebff9e479573346af Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.740235 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh"] Mar 18 12:14:42 crc kubenswrapper[4975]: W0318 12:14:42.747997 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98f45bcd_0bac_4511_a12d_edbca0a9881a.slice/crio-8feb57a991c3bd86e2e1fe163ec5bdf02060147086f27c5eaed271613a58dc58 WatchSource:0}: Error finding container 8feb57a991c3bd86e2e1fe163ec5bdf02060147086f27c5eaed271613a58dc58: Status 404 returned error can't find the container with id 8feb57a991c3bd86e2e1fe163ec5bdf02060147086f27c5eaed271613a58dc58 Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.778335 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.782813 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dswgh" event={"ID":"698cd02e-0279-4ae7-be21-bd479b2dfe49","Type":"ContainerStarted","Data":"65739e4ceb2b7b61050d376d6ee277b0f2692f1d7f769c118219e02df86c2127"} Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.787631 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3195da11-9671-43d2-9504-e2dc68c2169c","Type":"ContainerStarted","Data":"e473c392b676ccb97d88ebdea27e40b39e85271cef26e8cebff9e479573346af"} Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.789826 4975 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" event={"ID":"3fc43c12-6dcc-4aef-81d0-8ab230ec77da","Type":"ContainerStarted","Data":"2232b19c1e35346ec160697ba26ba535fefe657ae2bd7f3ebbf8c7082b48b550"} Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.791978 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh" event={"ID":"98f45bcd-0bac-4511-a12d-edbca0a9881a","Type":"ContainerStarted","Data":"8feb57a991c3bd86e2e1fe163ec5bdf02060147086f27c5eaed271613a58dc58"} Mar 18 12:14:42 crc kubenswrapper[4975]: W0318 12:14:42.792122 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod975a4029_93d2_4e67_ab56_85d09a7af50b.slice/crio-8f2dc49430fdac740e898200bfc611b95166bdb526b422e713cd2718d9f41658 WatchSource:0}: Error finding container 8f2dc49430fdac740e898200bfc611b95166bdb526b422e713cd2718d9f41658: Status 404 returned error can't find the container with id 8f2dc49430fdac740e898200bfc611b95166bdb526b422e713cd2718d9f41658 Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.798345 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-d4zht" event={"ID":"36e42dcf-4953-46b4-8459-e2e72e03895c","Type":"ContainerStarted","Data":"d015e003159e52edfaa2e25e570cfb630a25ba6865cefdb689fba44c4e35929b"} Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.799340 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-d4zht" Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.799890 4975 patch_prober.go:28] interesting pod/downloads-7954f5f757-d4zht container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 18 12:14:42 crc 
kubenswrapper[4975]: I0318 12:14:42.799924 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d4zht" podUID="36e42dcf-4953-46b4-8459-e2e72e03895c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.801707 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lbw76" event={"ID":"cedb84ff-ae71-4210-8f65-16441f4292ac","Type":"ContainerStarted","Data":"06a129fa931062672542451d6f75d48e0d1653eb72af6a6fef5789e3fcf61d38"} Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.809907 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-587nk" event={"ID":"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b","Type":"ContainerStarted","Data":"3b66fdb4fd43e57839df01d9493e74a8252dbfb8841077381c1f1323ce989522"} Mar 18 12:14:42 crc kubenswrapper[4975]: I0318 12:14:42.810001 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-587nk" event={"ID":"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b","Type":"ContainerStarted","Data":"2e7bb611548fc0a811486feeeb09b92fc7b39e4a2ab4e74541aa6bf66cc4ac90"} Mar 18 12:14:42 crc kubenswrapper[4975]: E0318 12:14:42.813246 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-f75bb" podUID="46171d59-3549-4843-b6eb-07b9eecd2560" Mar 18 12:14:42 crc kubenswrapper[4975]: E0318 12:14:42.815993 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" 
pod="openshift-marketplace/redhat-marketplace-whr68" podUID="21b9dc77-7653-4684-ba67-cece256c42e2" Mar 18 12:14:42 crc kubenswrapper[4975]: E0318 12:14:42.817335 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-c5xm6" podUID="94a0b37e-4423-421c-910e-658cb59e08c8" Mar 18 12:14:42 crc kubenswrapper[4975]: E0318 12:14:42.867304 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xnfcw" podUID="3b24c4ea-1b55-429c-97f5-376523ea1a52" Mar 18 12:14:43 crc kubenswrapper[4975]: I0318 12:14:43.816628 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-587nk" event={"ID":"a6fd4bee-3c4b-4df5-a0c4-1f0cf464173b","Type":"ContainerStarted","Data":"7f747aa9bdacc097c57191f32b8b75e8b16ca63997fbf1acb805e423ed59d63d"} Mar 18 12:14:43 crc kubenswrapper[4975]: I0318 12:14:43.819846 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh" event={"ID":"98f45bcd-0bac-4511-a12d-edbca0a9881a","Type":"ContainerStarted","Data":"f7d64a638279fb61a8ab905254f821a1aaa57f4742f13f4a162d0e6af9e62ec9"} Mar 18 12:14:43 crc kubenswrapper[4975]: I0318 12:14:43.819981 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh" podUID="98f45bcd-0bac-4511-a12d-edbca0a9881a" containerName="route-controller-manager" containerID="cri-o://f7d64a638279fb61a8ab905254f821a1aaa57f4742f13f4a162d0e6af9e62ec9" gracePeriod=30 Mar 18 12:14:43 crc 
kubenswrapper[4975]: I0318 12:14:43.820567 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh" Mar 18 12:14:43 crc kubenswrapper[4975]: I0318 12:14:43.822640 4975 generic.go:334] "Generic (PLEG): container finished" podID="698cd02e-0279-4ae7-be21-bd479b2dfe49" containerID="65739e4ceb2b7b61050d376d6ee277b0f2692f1d7f769c118219e02df86c2127" exitCode=0 Mar 18 12:14:43 crc kubenswrapper[4975]: I0318 12:14:43.822722 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dswgh" event={"ID":"698cd02e-0279-4ae7-be21-bd479b2dfe49","Type":"ContainerDied","Data":"65739e4ceb2b7b61050d376d6ee277b0f2692f1d7f769c118219e02df86c2127"} Mar 18 12:14:43 crc kubenswrapper[4975]: I0318 12:14:43.832438 4975 generic.go:334] "Generic (PLEG): container finished" podID="cedb84ff-ae71-4210-8f65-16441f4292ac" containerID="06a129fa931062672542451d6f75d48e0d1653eb72af6a6fef5789e3fcf61d38" exitCode=0 Mar 18 12:14:43 crc kubenswrapper[4975]: I0318 12:14:43.833551 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lbw76" event={"ID":"cedb84ff-ae71-4210-8f65-16441f4292ac","Type":"ContainerDied","Data":"06a129fa931062672542451d6f75d48e0d1653eb72af6a6fef5789e3fcf61d38"} Mar 18 12:14:43 crc kubenswrapper[4975]: I0318 12:14:43.835690 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3195da11-9671-43d2-9504-e2dc68c2169c","Type":"ContainerStarted","Data":"56c520ab9affe477ddf28ae1a3f2068c1525b0321fd6e13b93d9a459ed863b49"} Mar 18 12:14:43 crc kubenswrapper[4975]: I0318 12:14:43.839184 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" 
event={"ID":"3fc43c12-6dcc-4aef-81d0-8ab230ec77da","Type":"ContainerStarted","Data":"c0d874e8271c8d52e87d30e05a0892fb82e4ee76e94c2988b8a940a23080d793"} Mar 18 12:14:43 crc kubenswrapper[4975]: I0318 12:14:43.839266 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" podUID="3fc43c12-6dcc-4aef-81d0-8ab230ec77da" containerName="controller-manager" containerID="cri-o://c0d874e8271c8d52e87d30e05a0892fb82e4ee76e94c2988b8a940a23080d793" gracePeriod=30 Mar 18 12:14:43 crc kubenswrapper[4975]: I0318 12:14:43.839432 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" Mar 18 12:14:43 crc kubenswrapper[4975]: I0318 12:14:43.846607 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"975a4029-93d2-4e67-ab56-85d09a7af50b","Type":"ContainerStarted","Data":"037e9b4692bfceb2ad0c287cb8845e9c3a855199831d1071471606a14e62560c"} Mar 18 12:14:43 crc kubenswrapper[4975]: I0318 12:14:43.846643 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"975a4029-93d2-4e67-ab56-85d09a7af50b","Type":"ContainerStarted","Data":"8f2dc49430fdac740e898200bfc611b95166bdb526b422e713cd2718d9f41658"} Mar 18 12:14:43 crc kubenswrapper[4975]: I0318 12:14:43.852786 4975 patch_prober.go:28] interesting pod/downloads-7954f5f757-d4zht container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 18 12:14:43 crc kubenswrapper[4975]: I0318 12:14:43.852959 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d4zht" podUID="36e42dcf-4953-46b4-8459-e2e72e03895c" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 18 12:14:43 crc kubenswrapper[4975]: I0318 12:14:43.864053 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" Mar 18 12:14:43 crc kubenswrapper[4975]: I0318 12:14:43.870673 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh" podStartSLOduration=25.8706537 podStartE2EDuration="25.8706537s" podCreationTimestamp="2026-03-18 12:14:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:14:43.841575744 +0000 UTC m=+269.555976323" watchObservedRunningTime="2026-03-18 12:14:43.8706537 +0000 UTC m=+269.585054279" Mar 18 12:14:43 crc kubenswrapper[4975]: I0318 12:14:43.904750 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" podStartSLOduration=25.904729764 podStartE2EDuration="25.904729764s" podCreationTimestamp="2026-03-18 12:14:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:14:43.886433777 +0000 UTC m=+269.600834356" watchObservedRunningTime="2026-03-18 12:14:43.904729764 +0000 UTC m=+269.619130343" Mar 18 12:14:43 crc kubenswrapper[4975]: I0318 12:14:43.948569 4975 patch_prober.go:28] interesting pod/route-controller-manager-647ff7cbff-swxjh container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": read tcp 10.217.0.2:47862->10.217.0.58:8443: read: connection reset by peer" start-of-body= Mar 18 12:14:43 crc kubenswrapper[4975]: I0318 12:14:43.948629 4975 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh" podUID="98f45bcd-0bac-4511-a12d-edbca0a9881a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": read tcp 10.217.0.2:47862->10.217.0.58:8443: read: connection reset by peer" Mar 18 12:14:44 crc kubenswrapper[4975]: I0318 12:14:44.853185 4975 generic.go:334] "Generic (PLEG): container finished" podID="3195da11-9671-43d2-9504-e2dc68c2169c" containerID="56c520ab9affe477ddf28ae1a3f2068c1525b0321fd6e13b93d9a459ed863b49" exitCode=0 Mar 18 12:14:44 crc kubenswrapper[4975]: I0318 12:14:44.853252 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3195da11-9671-43d2-9504-e2dc68c2169c","Type":"ContainerDied","Data":"56c520ab9affe477ddf28ae1a3f2068c1525b0321fd6e13b93d9a459ed863b49"} Mar 18 12:14:44 crc kubenswrapper[4975]: I0318 12:14:44.856215 4975 generic.go:334] "Generic (PLEG): container finished" podID="3fc43c12-6dcc-4aef-81d0-8ab230ec77da" containerID="c0d874e8271c8d52e87d30e05a0892fb82e4ee76e94c2988b8a940a23080d793" exitCode=0 Mar 18 12:14:44 crc kubenswrapper[4975]: I0318 12:14:44.856280 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" event={"ID":"3fc43c12-6dcc-4aef-81d0-8ab230ec77da","Type":"ContainerDied","Data":"c0d874e8271c8d52e87d30e05a0892fb82e4ee76e94c2988b8a940a23080d793"} Mar 18 12:14:44 crc kubenswrapper[4975]: I0318 12:14:44.863736 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-647ff7cbff-swxjh_98f45bcd-0bac-4511-a12d-edbca0a9881a/route-controller-manager/0.log" Mar 18 12:14:44 crc kubenswrapper[4975]: I0318 12:14:44.863798 4975 generic.go:334] "Generic (PLEG): container finished" podID="98f45bcd-0bac-4511-a12d-edbca0a9881a" 
containerID="f7d64a638279fb61a8ab905254f821a1aaa57f4742f13f4a162d0e6af9e62ec9" exitCode=255 Mar 18 12:14:44 crc kubenswrapper[4975]: I0318 12:14:44.864756 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh" event={"ID":"98f45bcd-0bac-4511-a12d-edbca0a9881a","Type":"ContainerDied","Data":"f7d64a638279fb61a8ab905254f821a1aaa57f4742f13f4a162d0e6af9e62ec9"} Mar 18 12:14:44 crc kubenswrapper[4975]: I0318 12:14:44.864843 4975 patch_prober.go:28] interesting pod/downloads-7954f5f757-d4zht container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 18 12:14:44 crc kubenswrapper[4975]: I0318 12:14:44.864899 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d4zht" podUID="36e42dcf-4953-46b4-8459-e2e72e03895c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 18 12:14:44 crc kubenswrapper[4975]: I0318 12:14:44.892339 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-587nk" podStartSLOduration=221.892322643 podStartE2EDuration="3m41.892322643s" podCreationTimestamp="2026-03-18 12:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:14:44.89006292 +0000 UTC m=+270.604463529" watchObservedRunningTime="2026-03-18 12:14:44.892322643 +0000 UTC m=+270.606723222" Mar 18 12:14:44 crc kubenswrapper[4975]: I0318 12:14:44.908642 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.908623274 podStartE2EDuration="2.908623274s" 
podCreationTimestamp="2026-03-18 12:14:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:14:44.907423981 +0000 UTC m=+270.621824580" watchObservedRunningTime="2026-03-18 12:14:44.908623274 +0000 UTC m=+270.623023853" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.243816 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.250930 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-647ff7cbff-swxjh_98f45bcd-0bac-4511-a12d-edbca0a9881a/route-controller-manager/0.log" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.250992 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.271298 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w"] Mar 18 12:14:45 crc kubenswrapper[4975]: E0318 12:14:45.272859 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f45bcd-0bac-4511-a12d-edbca0a9881a" containerName="route-controller-manager" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.272890 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f45bcd-0bac-4511-a12d-edbca0a9881a" containerName="route-controller-manager" Mar 18 12:14:45 crc kubenswrapper[4975]: E0318 12:14:45.272912 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fc43c12-6dcc-4aef-81d0-8ab230ec77da" containerName="controller-manager" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.272918 4975 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3fc43c12-6dcc-4aef-81d0-8ab230ec77da" containerName="controller-manager" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.273029 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="98f45bcd-0bac-4511-a12d-edbca0a9881a" containerName="route-controller-manager" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.273056 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fc43c12-6dcc-4aef-81d0-8ab230ec77da" containerName="controller-manager" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.273412 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.285610 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w"] Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.329221 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98f45bcd-0bac-4511-a12d-edbca0a9881a-serving-cert\") pod \"98f45bcd-0bac-4511-a12d-edbca0a9881a\" (UID: \"98f45bcd-0bac-4511-a12d-edbca0a9881a\") " Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.329471 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-serving-cert\") pod \"3fc43c12-6dcc-4aef-81d0-8ab230ec77da\" (UID: \"3fc43c12-6dcc-4aef-81d0-8ab230ec77da\") " Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.329549 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-client-ca\") pod \"3fc43c12-6dcc-4aef-81d0-8ab230ec77da\" (UID: \"3fc43c12-6dcc-4aef-81d0-8ab230ec77da\") " Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.329670 
4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76mbc\" (UniqueName: \"kubernetes.io/projected/98f45bcd-0bac-4511-a12d-edbca0a9881a-kube-api-access-76mbc\") pod \"98f45bcd-0bac-4511-a12d-edbca0a9881a\" (UID: \"98f45bcd-0bac-4511-a12d-edbca0a9881a\") " Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.329780 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-config\") pod \"3fc43c12-6dcc-4aef-81d0-8ab230ec77da\" (UID: \"3fc43c12-6dcc-4aef-81d0-8ab230ec77da\") " Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.329907 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98f45bcd-0bac-4511-a12d-edbca0a9881a-client-ca\") pod \"98f45bcd-0bac-4511-a12d-edbca0a9881a\" (UID: \"98f45bcd-0bac-4511-a12d-edbca0a9881a\") " Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.330018 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-proxy-ca-bundles\") pod \"3fc43c12-6dcc-4aef-81d0-8ab230ec77da\" (UID: \"3fc43c12-6dcc-4aef-81d0-8ab230ec77da\") " Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.330088 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4npsx\" (UniqueName: \"kubernetes.io/projected/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-kube-api-access-4npsx\") pod \"3fc43c12-6dcc-4aef-81d0-8ab230ec77da\" (UID: \"3fc43c12-6dcc-4aef-81d0-8ab230ec77da\") " Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.330164 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98f45bcd-0bac-4511-a12d-edbca0a9881a-config\") pod 
\"98f45bcd-0bac-4511-a12d-edbca0a9881a\" (UID: \"98f45bcd-0bac-4511-a12d-edbca0a9881a\") " Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.330375 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01f8de13-7985-44d3-a987-949f63570885-config\") pod \"controller-manager-75cd8bcbdd-qs87w\" (UID: \"01f8de13-7985-44d3-a987-949f63570885\") " pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.330481 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01f8de13-7985-44d3-a987-949f63570885-serving-cert\") pod \"controller-manager-75cd8bcbdd-qs87w\" (UID: \"01f8de13-7985-44d3-a987-949f63570885\") " pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.330573 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk2fc\" (UniqueName: \"kubernetes.io/projected/01f8de13-7985-44d3-a987-949f63570885-kube-api-access-bk2fc\") pod \"controller-manager-75cd8bcbdd-qs87w\" (UID: \"01f8de13-7985-44d3-a987-949f63570885\") " pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.330692 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01f8de13-7985-44d3-a987-949f63570885-client-ca\") pod \"controller-manager-75cd8bcbdd-qs87w\" (UID: \"01f8de13-7985-44d3-a987-949f63570885\") " pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.330830 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/01f8de13-7985-44d3-a987-949f63570885-proxy-ca-bundles\") pod \"controller-manager-75cd8bcbdd-qs87w\" (UID: \"01f8de13-7985-44d3-a987-949f63570885\") " pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.330729 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-client-ca" (OuterVolumeSpecName: "client-ca") pod "3fc43c12-6dcc-4aef-81d0-8ab230ec77da" (UID: "3fc43c12-6dcc-4aef-81d0-8ab230ec77da"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.330919 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98f45bcd-0bac-4511-a12d-edbca0a9881a-client-ca" (OuterVolumeSpecName: "client-ca") pod "98f45bcd-0bac-4511-a12d-edbca0a9881a" (UID: "98f45bcd-0bac-4511-a12d-edbca0a9881a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.331446 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3fc43c12-6dcc-4aef-81d0-8ab230ec77da" (UID: "3fc43c12-6dcc-4aef-81d0-8ab230ec77da"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.331518 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-config" (OuterVolumeSpecName: "config") pod "3fc43c12-6dcc-4aef-81d0-8ab230ec77da" (UID: "3fc43c12-6dcc-4aef-81d0-8ab230ec77da"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.331903 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98f45bcd-0bac-4511-a12d-edbca0a9881a-config" (OuterVolumeSpecName: "config") pod "98f45bcd-0bac-4511-a12d-edbca0a9881a" (UID: "98f45bcd-0bac-4511-a12d-edbca0a9881a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.338100 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3fc43c12-6dcc-4aef-81d0-8ab230ec77da" (UID: "3fc43c12-6dcc-4aef-81d0-8ab230ec77da"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.338188 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98f45bcd-0bac-4511-a12d-edbca0a9881a-kube-api-access-76mbc" (OuterVolumeSpecName: "kube-api-access-76mbc") pod "98f45bcd-0bac-4511-a12d-edbca0a9881a" (UID: "98f45bcd-0bac-4511-a12d-edbca0a9881a"). InnerVolumeSpecName "kube-api-access-76mbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.341513 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-kube-api-access-4npsx" (OuterVolumeSpecName: "kube-api-access-4npsx") pod "3fc43c12-6dcc-4aef-81d0-8ab230ec77da" (UID: "3fc43c12-6dcc-4aef-81d0-8ab230ec77da"). InnerVolumeSpecName "kube-api-access-4npsx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.341802 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98f45bcd-0bac-4511-a12d-edbca0a9881a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "98f45bcd-0bac-4511-a12d-edbca0a9881a" (UID: "98f45bcd-0bac-4511-a12d-edbca0a9881a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.432441 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01f8de13-7985-44d3-a987-949f63570885-client-ca\") pod \"controller-manager-75cd8bcbdd-qs87w\" (UID: \"01f8de13-7985-44d3-a987-949f63570885\") " pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.432553 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/01f8de13-7985-44d3-a987-949f63570885-proxy-ca-bundles\") pod \"controller-manager-75cd8bcbdd-qs87w\" (UID: \"01f8de13-7985-44d3-a987-949f63570885\") " pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.432606 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01f8de13-7985-44d3-a987-949f63570885-config\") pod \"controller-manager-75cd8bcbdd-qs87w\" (UID: \"01f8de13-7985-44d3-a987-949f63570885\") " pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.432624 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01f8de13-7985-44d3-a987-949f63570885-serving-cert\") pod 
\"controller-manager-75cd8bcbdd-qs87w\" (UID: \"01f8de13-7985-44d3-a987-949f63570885\") " pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.432662 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk2fc\" (UniqueName: \"kubernetes.io/projected/01f8de13-7985-44d3-a987-949f63570885-kube-api-access-bk2fc\") pod \"controller-manager-75cd8bcbdd-qs87w\" (UID: \"01f8de13-7985-44d3-a987-949f63570885\") " pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.432708 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.432722 4975 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98f45bcd-0bac-4511-a12d-edbca0a9881a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.432732 4975 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.432743 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4npsx\" (UniqueName: \"kubernetes.io/projected/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-kube-api-access-4npsx\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.432751 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98f45bcd-0bac-4511-a12d-edbca0a9881a-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.432759 4975 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98f45bcd-0bac-4511-a12d-edbca0a9881a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.432767 4975 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.432775 4975 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3fc43c12-6dcc-4aef-81d0-8ab230ec77da-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.432783 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76mbc\" (UniqueName: \"kubernetes.io/projected/98f45bcd-0bac-4511-a12d-edbca0a9881a-kube-api-access-76mbc\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.433577 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01f8de13-7985-44d3-a987-949f63570885-client-ca\") pod \"controller-manager-75cd8bcbdd-qs87w\" (UID: \"01f8de13-7985-44d3-a987-949f63570885\") " pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.433960 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01f8de13-7985-44d3-a987-949f63570885-config\") pod \"controller-manager-75cd8bcbdd-qs87w\" (UID: \"01f8de13-7985-44d3-a987-949f63570885\") " pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.434490 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/01f8de13-7985-44d3-a987-949f63570885-proxy-ca-bundles\") pod \"controller-manager-75cd8bcbdd-qs87w\" (UID: \"01f8de13-7985-44d3-a987-949f63570885\") " pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.437792 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01f8de13-7985-44d3-a987-949f63570885-serving-cert\") pod \"controller-manager-75cd8bcbdd-qs87w\" (UID: \"01f8de13-7985-44d3-a987-949f63570885\") " pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.447579 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk2fc\" (UniqueName: \"kubernetes.io/projected/01f8de13-7985-44d3-a987-949f63570885-kube-api-access-bk2fc\") pod \"controller-manager-75cd8bcbdd-qs87w\" (UID: \"01f8de13-7985-44d3-a987-949f63570885\") " pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.595552 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.870530 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" event={"ID":"3fc43c12-6dcc-4aef-81d0-8ab230ec77da","Type":"ContainerDied","Data":"2232b19c1e35346ec160697ba26ba535fefe657ae2bd7f3ebbf8c7082b48b550"} Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.870577 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-644bdff7d5-x4mqz" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.870585 4975 scope.go:117] "RemoveContainer" containerID="c0d874e8271c8d52e87d30e05a0892fb82e4ee76e94c2988b8a940a23080d793" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.872023 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-647ff7cbff-swxjh_98f45bcd-0bac-4511-a12d-edbca0a9881a/route-controller-manager/0.log" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.872082 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh" event={"ID":"98f45bcd-0bac-4511-a12d-edbca0a9881a","Type":"ContainerDied","Data":"8feb57a991c3bd86e2e1fe163ec5bdf02060147086f27c5eaed271613a58dc58"} Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.872129 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.908169 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh"] Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.918246 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-647ff7cbff-swxjh"] Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.923395 4975 scope.go:117] "RemoveContainer" containerID="f7d64a638279fb61a8ab905254f821a1aaa57f4742f13f4a162d0e6af9e62ec9" Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.934082 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-644bdff7d5-x4mqz"] Mar 18 12:14:45 crc kubenswrapper[4975]: I0318 12:14:45.938422 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-644bdff7d5-x4mqz"] Mar 18 12:14:46 crc kubenswrapper[4975]: I0318 12:14:46.071029 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w"] Mar 18 12:14:46 crc kubenswrapper[4975]: W0318 12:14:46.086520 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01f8de13_7985_44d3_a987_949f63570885.slice/crio-ca688add4f90f5318093a5d1203754bbe6e55df99908ee3fde0e50c3368206f4 WatchSource:0}: Error finding container ca688add4f90f5318093a5d1203754bbe6e55df99908ee3fde0e50c3368206f4: Status 404 returned error can't find the container with id ca688add4f90f5318093a5d1203754bbe6e55df99908ee3fde0e50c3368206f4 Mar 18 12:14:46 crc kubenswrapper[4975]: I0318 12:14:46.171215 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:14:46 crc kubenswrapper[4975]: I0318 12:14:46.246571 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3195da11-9671-43d2-9504-e2dc68c2169c-kube-api-access\") pod \"3195da11-9671-43d2-9504-e2dc68c2169c\" (UID: \"3195da11-9671-43d2-9504-e2dc68c2169c\") " Mar 18 12:14:46 crc kubenswrapper[4975]: I0318 12:14:46.246648 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3195da11-9671-43d2-9504-e2dc68c2169c-kubelet-dir\") pod \"3195da11-9671-43d2-9504-e2dc68c2169c\" (UID: \"3195da11-9671-43d2-9504-e2dc68c2169c\") " Mar 18 12:14:46 crc kubenswrapper[4975]: I0318 12:14:46.247019 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3195da11-9671-43d2-9504-e2dc68c2169c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3195da11-9671-43d2-9504-e2dc68c2169c" (UID: "3195da11-9671-43d2-9504-e2dc68c2169c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:14:46 crc kubenswrapper[4975]: I0318 12:14:46.251992 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3195da11-9671-43d2-9504-e2dc68c2169c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3195da11-9671-43d2-9504-e2dc68c2169c" (UID: "3195da11-9671-43d2-9504-e2dc68c2169c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:14:46 crc kubenswrapper[4975]: I0318 12:14:46.348117 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3195da11-9671-43d2-9504-e2dc68c2169c-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:46 crc kubenswrapper[4975]: I0318 12:14:46.348526 4975 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3195da11-9671-43d2-9504-e2dc68c2169c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:46 crc kubenswrapper[4975]: I0318 12:14:46.887137 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w" event={"ID":"01f8de13-7985-44d3-a987-949f63570885","Type":"ContainerStarted","Data":"3f448903a5761a43f05868a401b39e315a2fb0794e5505dfc66c0ef2208c10b0"} Mar 18 12:14:46 crc kubenswrapper[4975]: I0318 12:14:46.887191 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w" event={"ID":"01f8de13-7985-44d3-a987-949f63570885","Type":"ContainerStarted","Data":"ca688add4f90f5318093a5d1203754bbe6e55df99908ee3fde0e50c3368206f4"} Mar 18 12:14:46 crc kubenswrapper[4975]: I0318 12:14:46.887385 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w" Mar 18 12:14:46 crc kubenswrapper[4975]: I0318 12:14:46.894203 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3195da11-9671-43d2-9504-e2dc68c2169c","Type":"ContainerDied","Data":"e473c392b676ccb97d88ebdea27e40b39e85271cef26e8cebff9e479573346af"} Mar 18 12:14:46 crc kubenswrapper[4975]: I0318 12:14:46.894248 4975 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="e473c392b676ccb97d88ebdea27e40b39e85271cef26e8cebff9e479573346af" Mar 18 12:14:46 crc kubenswrapper[4975]: I0318 12:14:46.894518 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 18 12:14:46 crc kubenswrapper[4975]: I0318 12:14:46.894944 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w" Mar 18 12:14:46 crc kubenswrapper[4975]: I0318 12:14:46.939240 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w" podStartSLOduration=8.939226857 podStartE2EDuration="8.939226857s" podCreationTimestamp="2026-03-18 12:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:14:46.914161933 +0000 UTC m=+272.628562512" watchObservedRunningTime="2026-03-18 12:14:46.939226857 +0000 UTC m=+272.653627436" Mar 18 12:14:47 crc kubenswrapper[4975]: I0318 12:14:47.022986 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fc43c12-6dcc-4aef-81d0-8ab230ec77da" path="/var/lib/kubelet/pods/3fc43c12-6dcc-4aef-81d0-8ab230ec77da/volumes" Mar 18 12:14:47 crc kubenswrapper[4975]: I0318 12:14:47.023554 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98f45bcd-0bac-4511-a12d-edbca0a9881a" path="/var/lib/kubelet/pods/98f45bcd-0bac-4511-a12d-edbca0a9881a/volumes" Mar 18 12:14:47 crc kubenswrapper[4975]: I0318 12:14:47.350069 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l"] Mar 18 12:14:47 crc kubenswrapper[4975]: E0318 12:14:47.350666 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3195da11-9671-43d2-9504-e2dc68c2169c" containerName="pruner" Mar 18 12:14:47 crc 
kubenswrapper[4975]: I0318 12:14:47.350688 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="3195da11-9671-43d2-9504-e2dc68c2169c" containerName="pruner" Mar 18 12:14:47 crc kubenswrapper[4975]: I0318 12:14:47.350810 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="3195da11-9671-43d2-9504-e2dc68c2169c" containerName="pruner" Mar 18 12:14:47 crc kubenswrapper[4975]: I0318 12:14:47.351285 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l" Mar 18 12:14:47 crc kubenswrapper[4975]: I0318 12:14:47.353354 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 12:14:47 crc kubenswrapper[4975]: I0318 12:14:47.353626 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 12:14:47 crc kubenswrapper[4975]: I0318 12:14:47.353879 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 12:14:47 crc kubenswrapper[4975]: I0318 12:14:47.353910 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 12:14:47 crc kubenswrapper[4975]: I0318 12:14:47.353890 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 12:14:47 crc kubenswrapper[4975]: I0318 12:14:47.354137 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 12:14:47 crc kubenswrapper[4975]: I0318 12:14:47.361064 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l"] Mar 18 12:14:47 crc kubenswrapper[4975]: I0318 12:14:47.464298 4975 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2798fc75-d009-42cb-8bf3-e834d4271024-serving-cert\") pod \"route-controller-manager-76dff5dcbd-rlr6l\" (UID: \"2798fc75-d009-42cb-8bf3-e834d4271024\") " pod="openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l" Mar 18 12:14:47 crc kubenswrapper[4975]: I0318 12:14:47.464478 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2798fc75-d009-42cb-8bf3-e834d4271024-client-ca\") pod \"route-controller-manager-76dff5dcbd-rlr6l\" (UID: \"2798fc75-d009-42cb-8bf3-e834d4271024\") " pod="openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l" Mar 18 12:14:47 crc kubenswrapper[4975]: I0318 12:14:47.464509 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldtpj\" (UniqueName: \"kubernetes.io/projected/2798fc75-d009-42cb-8bf3-e834d4271024-kube-api-access-ldtpj\") pod \"route-controller-manager-76dff5dcbd-rlr6l\" (UID: \"2798fc75-d009-42cb-8bf3-e834d4271024\") " pod="openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l" Mar 18 12:14:47 crc kubenswrapper[4975]: I0318 12:14:47.464634 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2798fc75-d009-42cb-8bf3-e834d4271024-config\") pod \"route-controller-manager-76dff5dcbd-rlr6l\" (UID: \"2798fc75-d009-42cb-8bf3-e834d4271024\") " pod="openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l" Mar 18 12:14:47 crc kubenswrapper[4975]: I0318 12:14:47.566320 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2798fc75-d009-42cb-8bf3-e834d4271024-serving-cert\") 
pod \"route-controller-manager-76dff5dcbd-rlr6l\" (UID: \"2798fc75-d009-42cb-8bf3-e834d4271024\") " pod="openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l" Mar 18 12:14:47 crc kubenswrapper[4975]: I0318 12:14:47.566434 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2798fc75-d009-42cb-8bf3-e834d4271024-client-ca\") pod \"route-controller-manager-76dff5dcbd-rlr6l\" (UID: \"2798fc75-d009-42cb-8bf3-e834d4271024\") " pod="openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l" Mar 18 12:14:47 crc kubenswrapper[4975]: I0318 12:14:47.566459 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldtpj\" (UniqueName: \"kubernetes.io/projected/2798fc75-d009-42cb-8bf3-e834d4271024-kube-api-access-ldtpj\") pod \"route-controller-manager-76dff5dcbd-rlr6l\" (UID: \"2798fc75-d009-42cb-8bf3-e834d4271024\") " pod="openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l" Mar 18 12:14:47 crc kubenswrapper[4975]: I0318 12:14:47.566510 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2798fc75-d009-42cb-8bf3-e834d4271024-config\") pod \"route-controller-manager-76dff5dcbd-rlr6l\" (UID: \"2798fc75-d009-42cb-8bf3-e834d4271024\") " pod="openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l" Mar 18 12:14:47 crc kubenswrapper[4975]: I0318 12:14:47.567447 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2798fc75-d009-42cb-8bf3-e834d4271024-client-ca\") pod \"route-controller-manager-76dff5dcbd-rlr6l\" (UID: \"2798fc75-d009-42cb-8bf3-e834d4271024\") " pod="openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l" Mar 18 12:14:47 crc kubenswrapper[4975]: I0318 12:14:47.567701 4975 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2798fc75-d009-42cb-8bf3-e834d4271024-config\") pod \"route-controller-manager-76dff5dcbd-rlr6l\" (UID: \"2798fc75-d009-42cb-8bf3-e834d4271024\") " pod="openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l" Mar 18 12:14:47 crc kubenswrapper[4975]: I0318 12:14:47.571922 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2798fc75-d009-42cb-8bf3-e834d4271024-serving-cert\") pod \"route-controller-manager-76dff5dcbd-rlr6l\" (UID: \"2798fc75-d009-42cb-8bf3-e834d4271024\") " pod="openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l" Mar 18 12:14:47 crc kubenswrapper[4975]: I0318 12:14:47.593382 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldtpj\" (UniqueName: \"kubernetes.io/projected/2798fc75-d009-42cb-8bf3-e834d4271024-kube-api-access-ldtpj\") pod \"route-controller-manager-76dff5dcbd-rlr6l\" (UID: \"2798fc75-d009-42cb-8bf3-e834d4271024\") " pod="openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l" Mar 18 12:14:47 crc kubenswrapper[4975]: I0318 12:14:47.674479 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l" Mar 18 12:14:50 crc kubenswrapper[4975]: I0318 12:14:50.128162 4975 patch_prober.go:28] interesting pod/downloads-7954f5f757-d4zht container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 18 12:14:50 crc kubenswrapper[4975]: I0318 12:14:50.128222 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d4zht" podUID="36e42dcf-4953-46b4-8459-e2e72e03895c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 18 12:14:50 crc kubenswrapper[4975]: I0318 12:14:50.129654 4975 patch_prober.go:28] interesting pod/downloads-7954f5f757-d4zht container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 18 12:14:50 crc kubenswrapper[4975]: I0318 12:14:50.129793 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-d4zht" podUID="36e42dcf-4953-46b4-8459-e2e72e03895c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 18 12:14:51 crc kubenswrapper[4975]: I0318 12:14:51.913513 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l"] Mar 18 12:14:51 crc kubenswrapper[4975]: I0318 12:14:51.927831 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dswgh" 
event={"ID":"698cd02e-0279-4ae7-be21-bd479b2dfe49","Type":"ContainerStarted","Data":"1a6b486fc13cc47731c997aa3a78910242ce76adbc3add70cf76274cb96308d0"} Mar 18 12:14:51 crc kubenswrapper[4975]: I0318 12:14:51.930298 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lbw76" event={"ID":"cedb84ff-ae71-4210-8f65-16441f4292ac","Type":"ContainerStarted","Data":"b4110a2008d9b23f3c77daa0c5e95cf50a7124e0d5a23942dfc264682aa6312a"} Mar 18 12:14:52 crc kubenswrapper[4975]: I0318 12:14:52.410985 4975 csr.go:261] certificate signing request csr-2lwkn is approved, waiting to be issued Mar 18 12:14:52 crc kubenswrapper[4975]: I0318 12:14:52.417480 4975 csr.go:257] certificate signing request csr-2lwkn is issued Mar 18 12:14:52 crc kubenswrapper[4975]: I0318 12:14:52.937547 4975 generic.go:334] "Generic (PLEG): container finished" podID="9d72ac7c-4ce8-4d23-a845-d359bca0544a" containerID="0b76564616e00fa41708e5f77a628228d8677811e5c8d0d7dc7db75a2883620f" exitCode=0 Mar 18 12:14:52 crc kubenswrapper[4975]: I0318 12:14:52.937648 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563934-6j5kv" event={"ID":"9d72ac7c-4ce8-4d23-a845-d359bca0544a","Type":"ContainerDied","Data":"0b76564616e00fa41708e5f77a628228d8677811e5c8d0d7dc7db75a2883620f"} Mar 18 12:14:52 crc kubenswrapper[4975]: I0318 12:14:52.940330 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l" event={"ID":"2798fc75-d009-42cb-8bf3-e834d4271024","Type":"ContainerStarted","Data":"554f7adc17c3f8cc530912b5bee4e32df809d74d9a4d33634850302c542597e0"} Mar 18 12:14:52 crc kubenswrapper[4975]: I0318 12:14:52.940373 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l" 
event={"ID":"2798fc75-d009-42cb-8bf3-e834d4271024","Type":"ContainerStarted","Data":"e96c13f8476f01a8c19f6824c617012ef2a2835561e143d76cea75b1a21b536e"} Mar 18 12:14:52 crc kubenswrapper[4975]: I0318 12:14:52.940511 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l" Mar 18 12:14:52 crc kubenswrapper[4975]: I0318 12:14:52.942809 4975 generic.go:334] "Generic (PLEG): container finished" podID="bb5204e4-c110-485b-8627-807fdb7f4c27" containerID="995fbfe1ae1d68d3875e30a6a541d70b4b174153aa38e91ecf31cea63b7cffbc" exitCode=0 Mar 18 12:14:52 crc kubenswrapper[4975]: I0318 12:14:52.943462 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563932-9m82r" event={"ID":"bb5204e4-c110-485b-8627-807fdb7f4c27","Type":"ContainerDied","Data":"995fbfe1ae1d68d3875e30a6a541d70b4b174153aa38e91ecf31cea63b7cffbc"} Mar 18 12:14:52 crc kubenswrapper[4975]: I0318 12:14:52.945630 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l" Mar 18 12:14:52 crc kubenswrapper[4975]: I0318 12:14:52.969614 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lbw76" podStartSLOduration=5.52324505 podStartE2EDuration="49.969592852s" podCreationTimestamp="2026-03-18 12:14:03 +0000 UTC" firstStartedPulling="2026-03-18 12:14:06.211795388 +0000 UTC m=+231.926195967" lastFinishedPulling="2026-03-18 12:14:50.65814319 +0000 UTC m=+276.372543769" observedRunningTime="2026-03-18 12:14:52.968213074 +0000 UTC m=+278.682613673" watchObservedRunningTime="2026-03-18 12:14:52.969592852 +0000 UTC m=+278.683993431" Mar 18 12:14:52 crc kubenswrapper[4975]: I0318 12:14:52.998597 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l" podStartSLOduration=14.998576365 podStartE2EDuration="14.998576365s" podCreationTimestamp="2026-03-18 12:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:14:52.996686883 +0000 UTC m=+278.711087462" watchObservedRunningTime="2026-03-18 12:14:52.998576365 +0000 UTC m=+278.712976944" Mar 18 12:14:53 crc kubenswrapper[4975]: I0318 12:14:53.031849 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dswgh" podStartSLOduration=4.827439976 podStartE2EDuration="50.031827876s" podCreationTimestamp="2026-03-18 12:14:03 +0000 UTC" firstStartedPulling="2026-03-18 12:14:06.20458404 +0000 UTC m=+231.918984619" lastFinishedPulling="2026-03-18 12:14:51.40897194 +0000 UTC m=+277.123372519" observedRunningTime="2026-03-18 12:14:53.030906491 +0000 UTC m=+278.745307070" watchObservedRunningTime="2026-03-18 12:14:53.031827876 +0000 UTC m=+278.746228465" Mar 18 12:14:53 crc kubenswrapper[4975]: I0318 12:14:53.419437 4975 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-21 18:31:52.908144162 +0000 UTC Mar 18 12:14:53 crc kubenswrapper[4975]: I0318 12:14:53.419481 4975 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6678h16m59.488665537s for next certificate rotation Mar 18 12:14:54 crc kubenswrapper[4975]: I0318 12:14:54.160855 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dswgh" Mar 18 12:14:54 crc kubenswrapper[4975]: I0318 12:14:54.161026 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dswgh" Mar 18 12:14:54 crc kubenswrapper[4975]: I0318 12:14:54.319690 4975 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lbw76" Mar 18 12:14:54 crc kubenswrapper[4975]: I0318 12:14:54.320018 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lbw76" Mar 18 12:14:54 crc kubenswrapper[4975]: I0318 12:14:54.437486 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563934-6j5kv" Mar 18 12:14:54 crc kubenswrapper[4975]: I0318 12:14:54.442476 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563932-9m82r" Mar 18 12:14:54 crc kubenswrapper[4975]: I0318 12:14:54.576074 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glfp7\" (UniqueName: \"kubernetes.io/projected/9d72ac7c-4ce8-4d23-a845-d359bca0544a-kube-api-access-glfp7\") pod \"9d72ac7c-4ce8-4d23-a845-d359bca0544a\" (UID: \"9d72ac7c-4ce8-4d23-a845-d359bca0544a\") " Mar 18 12:14:54 crc kubenswrapper[4975]: I0318 12:14:54.576150 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9hzv\" (UniqueName: \"kubernetes.io/projected/bb5204e4-c110-485b-8627-807fdb7f4c27-kube-api-access-r9hzv\") pod \"bb5204e4-c110-485b-8627-807fdb7f4c27\" (UID: \"bb5204e4-c110-485b-8627-807fdb7f4c27\") " Mar 18 12:14:54 crc kubenswrapper[4975]: I0318 12:14:54.583110 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d72ac7c-4ce8-4d23-a845-d359bca0544a-kube-api-access-glfp7" (OuterVolumeSpecName: "kube-api-access-glfp7") pod "9d72ac7c-4ce8-4d23-a845-d359bca0544a" (UID: "9d72ac7c-4ce8-4d23-a845-d359bca0544a"). InnerVolumeSpecName "kube-api-access-glfp7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:14:54 crc kubenswrapper[4975]: I0318 12:14:54.583198 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb5204e4-c110-485b-8627-807fdb7f4c27-kube-api-access-r9hzv" (OuterVolumeSpecName: "kube-api-access-r9hzv") pod "bb5204e4-c110-485b-8627-807fdb7f4c27" (UID: "bb5204e4-c110-485b-8627-807fdb7f4c27"). InnerVolumeSpecName "kube-api-access-r9hzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:14:54 crc kubenswrapper[4975]: I0318 12:14:54.677636 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9hzv\" (UniqueName: \"kubernetes.io/projected/bb5204e4-c110-485b-8627-807fdb7f4c27-kube-api-access-r9hzv\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:54 crc kubenswrapper[4975]: I0318 12:14:54.677670 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glfp7\" (UniqueName: \"kubernetes.io/projected/9d72ac7c-4ce8-4d23-a845-d359bca0544a-kube-api-access-glfp7\") on node \"crc\" DevicePath \"\"" Mar 18 12:14:54 crc kubenswrapper[4975]: I0318 12:14:54.960977 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563932-9m82r" Mar 18 12:14:54 crc kubenswrapper[4975]: I0318 12:14:54.961054 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563932-9m82r" event={"ID":"bb5204e4-c110-485b-8627-807fdb7f4c27","Type":"ContainerDied","Data":"96fd7eedf629658d4a8b223eb781a8a5a7df058e8e45c9131a6056284fa577c6"} Mar 18 12:14:54 crc kubenswrapper[4975]: I0318 12:14:54.961086 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96fd7eedf629658d4a8b223eb781a8a5a7df058e8e45c9131a6056284fa577c6" Mar 18 12:14:54 crc kubenswrapper[4975]: I0318 12:14:54.962951 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563934-6j5kv" event={"ID":"9d72ac7c-4ce8-4d23-a845-d359bca0544a","Type":"ContainerDied","Data":"44a06719371d84506167442d291773036d152b91c6bc025d7ce20b7a99da7621"} Mar 18 12:14:54 crc kubenswrapper[4975]: I0318 12:14:54.962975 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44a06719371d84506167442d291773036d152b91c6bc025d7ce20b7a99da7621" Mar 18 12:14:54 crc kubenswrapper[4975]: I0318 12:14:54.962993 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563934-6j5kv" Mar 18 12:14:55 crc kubenswrapper[4975]: I0318 12:14:55.480619 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lbw76" podUID="cedb84ff-ae71-4210-8f65-16441f4292ac" containerName="registry-server" probeResult="failure" output=< Mar 18 12:14:55 crc kubenswrapper[4975]: timeout: failed to connect service ":50051" within 1s Mar 18 12:14:55 crc kubenswrapper[4975]: > Mar 18 12:14:55 crc kubenswrapper[4975]: I0318 12:14:55.483610 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dswgh" podUID="698cd02e-0279-4ae7-be21-bd479b2dfe49" containerName="registry-server" probeResult="failure" output=< Mar 18 12:14:55 crc kubenswrapper[4975]: timeout: failed to connect service ":50051" within 1s Mar 18 12:14:55 crc kubenswrapper[4975]: > Mar 18 12:14:55 crc kubenswrapper[4975]: I0318 12:14:55.538351 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:14:55 crc kubenswrapper[4975]: I0318 12:14:55.538413 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:14:55 crc kubenswrapper[4975]: I0318 12:14:55.538461 4975 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 12:14:55 crc kubenswrapper[4975]: I0318 12:14:55.539094 4975 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2efe02fe95b50df416422be5b"} pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:14:55 crc kubenswrapper[4975]: I0318 12:14:55.539158 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" containerID="cri-o://2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2efe02fe95b50df416422be5b" gracePeriod=600 Mar 18 12:14:55 crc kubenswrapper[4975]: I0318 12:14:55.968538 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c5xm6" event={"ID":"94a0b37e-4423-421c-910e-658cb59e08c8","Type":"ContainerStarted","Data":"c08ce5c839bdd21969c1d9f12eca76efe2eadfeb126efd13ee6d5fe109762bcf"} Mar 18 12:14:55 crc kubenswrapper[4975]: I0318 12:14:55.971462 4975 generic.go:334] "Generic (PLEG): container finished" podID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerID="2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2efe02fe95b50df416422be5b" exitCode=0 Mar 18 12:14:55 crc kubenswrapper[4975]: I0318 12:14:55.971595 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerDied","Data":"2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2efe02fe95b50df416422be5b"} Mar 18 12:14:55 crc kubenswrapper[4975]: I0318 12:14:55.971690 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerStarted","Data":"2aa3dffbe2fa58483db177c45dee69be9b0aee41d425f82de7fd39aca38b19a7"} Mar 18 12:14:56 crc kubenswrapper[4975]: I0318 
12:14:56.977581 4975 generic.go:334] "Generic (PLEG): container finished" podID="94a0b37e-4423-421c-910e-658cb59e08c8" containerID="c08ce5c839bdd21969c1d9f12eca76efe2eadfeb126efd13ee6d5fe109762bcf" exitCode=0 Mar 18 12:14:56 crc kubenswrapper[4975]: I0318 12:14:56.977736 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c5xm6" event={"ID":"94a0b37e-4423-421c-910e-658cb59e08c8","Type":"ContainerDied","Data":"c08ce5c839bdd21969c1d9f12eca76efe2eadfeb126efd13ee6d5fe109762bcf"} Mar 18 12:14:56 crc kubenswrapper[4975]: I0318 12:14:56.980217 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94vnq" event={"ID":"9002f360-1ea5-4b24-a49a-69f46a658936","Type":"ContainerStarted","Data":"3d49650a84899bae1213dd2757d55a47c6a0f5855f1dc7896ee721a5f391962a"} Mar 18 12:14:57 crc kubenswrapper[4975]: I0318 12:14:57.986360 4975 generic.go:334] "Generic (PLEG): container finished" podID="9002f360-1ea5-4b24-a49a-69f46a658936" containerID="3d49650a84899bae1213dd2757d55a47c6a0f5855f1dc7896ee721a5f391962a" exitCode=0 Mar 18 12:14:57 crc kubenswrapper[4975]: I0318 12:14:57.986455 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94vnq" event={"ID":"9002f360-1ea5-4b24-a49a-69f46a658936","Type":"ContainerDied","Data":"3d49650a84899bae1213dd2757d55a47c6a0f5855f1dc7896ee721a5f391962a"} Mar 18 12:14:57 crc kubenswrapper[4975]: I0318 12:14:57.990054 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hprxg" event={"ID":"a7a76930-86ba-4055-85e0-6053832da1aa","Type":"ContainerStarted","Data":"59c936ff2e20ac4a1736da8a313066426e7ebc240edcfc1d4f63bca6927f1cfd"} Mar 18 12:14:57 crc kubenswrapper[4975]: I0318 12:14:57.992925 4975 generic.go:334] "Generic (PLEG): container finished" podID="21b9dc77-7653-4684-ba67-cece256c42e2" containerID="16a790b0f67e55d709c51fbff50242a8113fadf1d483388a139c51971f249423" 
exitCode=0 Mar 18 12:14:57 crc kubenswrapper[4975]: I0318 12:14:57.992980 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whr68" event={"ID":"21b9dc77-7653-4684-ba67-cece256c42e2","Type":"ContainerDied","Data":"16a790b0f67e55d709c51fbff50242a8113fadf1d483388a139c51971f249423"} Mar 18 12:14:57 crc kubenswrapper[4975]: I0318 12:14:57.995433 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c5xm6" event={"ID":"94a0b37e-4423-421c-910e-658cb59e08c8","Type":"ContainerStarted","Data":"0c85f5698a37155e9aaad69812cc146a7a356b6f81581ecff39443f95a5ef96f"} Mar 18 12:14:58 crc kubenswrapper[4975]: I0318 12:14:58.071580 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c5xm6" podStartSLOduration=3.516421269 podStartE2EDuration="58.071549499s" podCreationTimestamp="2026-03-18 12:14:00 +0000 UTC" firstStartedPulling="2026-03-18 12:14:02.86415317 +0000 UTC m=+228.578553749" lastFinishedPulling="2026-03-18 12:14:57.41928139 +0000 UTC m=+283.133681979" observedRunningTime="2026-03-18 12:14:58.07011918 +0000 UTC m=+283.784519759" watchObservedRunningTime="2026-03-18 12:14:58.071549499 +0000 UTC m=+283.785950078" Mar 18 12:14:58 crc kubenswrapper[4975]: I0318 12:14:58.199343 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w"] Mar 18 12:14:58 crc kubenswrapper[4975]: I0318 12:14:58.199552 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w" podUID="01f8de13-7985-44d3-a987-949f63570885" containerName="controller-manager" containerID="cri-o://3f448903a5761a43f05868a401b39e315a2fb0794e5505dfc66c0ef2208c10b0" gracePeriod=30 Mar 18 12:14:58 crc kubenswrapper[4975]: I0318 12:14:58.217646 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l"] Mar 18 12:14:58 crc kubenswrapper[4975]: I0318 12:14:58.217912 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l" podUID="2798fc75-d009-42cb-8bf3-e834d4271024" containerName="route-controller-manager" containerID="cri-o://554f7adc17c3f8cc530912b5bee4e32df809d74d9a4d33634850302c542597e0" gracePeriod=30 Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.001471 4975 generic.go:334] "Generic (PLEG): container finished" podID="01f8de13-7985-44d3-a987-949f63570885" containerID="3f448903a5761a43f05868a401b39e315a2fb0794e5505dfc66c0ef2208c10b0" exitCode=0 Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.001543 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w" event={"ID":"01f8de13-7985-44d3-a987-949f63570885","Type":"ContainerDied","Data":"3f448903a5761a43f05868a401b39e315a2fb0794e5505dfc66c0ef2208c10b0"} Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.002901 4975 generic.go:334] "Generic (PLEG): container finished" podID="2798fc75-d009-42cb-8bf3-e834d4271024" containerID="554f7adc17c3f8cc530912b5bee4e32df809d74d9a4d33634850302c542597e0" exitCode=0 Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.002939 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l" event={"ID":"2798fc75-d009-42cb-8bf3-e834d4271024","Type":"ContainerDied","Data":"554f7adc17c3f8cc530912b5bee4e32df809d74d9a4d33634850302c542597e0"} Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.004547 4975 generic.go:334] "Generic (PLEG): container finished" podID="a7a76930-86ba-4055-85e0-6053832da1aa" containerID="59c936ff2e20ac4a1736da8a313066426e7ebc240edcfc1d4f63bca6927f1cfd" exitCode=0 Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 
12:14:59.004577 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hprxg" event={"ID":"a7a76930-86ba-4055-85e0-6053832da1aa","Type":"ContainerDied","Data":"59c936ff2e20ac4a1736da8a313066426e7ebc240edcfc1d4f63bca6927f1cfd"}
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.410535 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l"
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.436120 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2"]
Mar 18 12:14:59 crc kubenswrapper[4975]: E0318 12:14:59.436350 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2798fc75-d009-42cb-8bf3-e834d4271024" containerName="route-controller-manager"
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.436368 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="2798fc75-d009-42cb-8bf3-e834d4271024" containerName="route-controller-manager"
Mar 18 12:14:59 crc kubenswrapper[4975]: E0318 12:14:59.436382 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb5204e4-c110-485b-8627-807fdb7f4c27" containerName="oc"
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.436389 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb5204e4-c110-485b-8627-807fdb7f4c27" containerName="oc"
Mar 18 12:14:59 crc kubenswrapper[4975]: E0318 12:14:59.436401 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d72ac7c-4ce8-4d23-a845-d359bca0544a" containerName="oc"
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.436408 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d72ac7c-4ce8-4d23-a845-d359bca0544a" containerName="oc"
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.436495 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d72ac7c-4ce8-4d23-a845-d359bca0544a" containerName="oc"
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.436508 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="2798fc75-d009-42cb-8bf3-e834d4271024" containerName="route-controller-manager"
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.436515 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb5204e4-c110-485b-8627-807fdb7f4c27" containerName="oc"
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.436839 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2"
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.451339 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2"]
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.549008 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldtpj\" (UniqueName: \"kubernetes.io/projected/2798fc75-d009-42cb-8bf3-e834d4271024-kube-api-access-ldtpj\") pod \"2798fc75-d009-42cb-8bf3-e834d4271024\" (UID: \"2798fc75-d009-42cb-8bf3-e834d4271024\") "
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.549140 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2798fc75-d009-42cb-8bf3-e834d4271024-client-ca\") pod \"2798fc75-d009-42cb-8bf3-e834d4271024\" (UID: \"2798fc75-d009-42cb-8bf3-e834d4271024\") "
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.549167 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2798fc75-d009-42cb-8bf3-e834d4271024-config\") pod \"2798fc75-d009-42cb-8bf3-e834d4271024\" (UID: \"2798fc75-d009-42cb-8bf3-e834d4271024\") "
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.549206 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2798fc75-d009-42cb-8bf3-e834d4271024-serving-cert\") pod \"2798fc75-d009-42cb-8bf3-e834d4271024\" (UID: \"2798fc75-d009-42cb-8bf3-e834d4271024\") "
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.549428 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2d63d0b-0d29-4614-9c3d-211f6a298d92-client-ca\") pod \"route-controller-manager-6fbb8b9b87-cdzh2\" (UID: \"d2d63d0b-0d29-4614-9c3d-211f6a298d92\") " pod="openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2"
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.549462 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2d63d0b-0d29-4614-9c3d-211f6a298d92-config\") pod \"route-controller-manager-6fbb8b9b87-cdzh2\" (UID: \"d2d63d0b-0d29-4614-9c3d-211f6a298d92\") " pod="openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2"
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.549489 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2d63d0b-0d29-4614-9c3d-211f6a298d92-serving-cert\") pod \"route-controller-manager-6fbb8b9b87-cdzh2\" (UID: \"d2d63d0b-0d29-4614-9c3d-211f6a298d92\") " pod="openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2"
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.549512 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmrhf\" (UniqueName: \"kubernetes.io/projected/d2d63d0b-0d29-4614-9c3d-211f6a298d92-kube-api-access-bmrhf\") pod \"route-controller-manager-6fbb8b9b87-cdzh2\" (UID: \"d2d63d0b-0d29-4614-9c3d-211f6a298d92\") " pod="openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2"
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.550112 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2798fc75-d009-42cb-8bf3-e834d4271024-client-ca" (OuterVolumeSpecName: "client-ca") pod "2798fc75-d009-42cb-8bf3-e834d4271024" (UID: "2798fc75-d009-42cb-8bf3-e834d4271024"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.550143 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2798fc75-d009-42cb-8bf3-e834d4271024-config" (OuterVolumeSpecName: "config") pod "2798fc75-d009-42cb-8bf3-e834d4271024" (UID: "2798fc75-d009-42cb-8bf3-e834d4271024"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.555045 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2798fc75-d009-42cb-8bf3-e834d4271024-kube-api-access-ldtpj" (OuterVolumeSpecName: "kube-api-access-ldtpj") pod "2798fc75-d009-42cb-8bf3-e834d4271024" (UID: "2798fc75-d009-42cb-8bf3-e834d4271024"). InnerVolumeSpecName "kube-api-access-ldtpj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.569576 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2798fc75-d009-42cb-8bf3-e834d4271024-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2798fc75-d009-42cb-8bf3-e834d4271024" (UID: "2798fc75-d009-42cb-8bf3-e834d4271024"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.650663 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2d63d0b-0d29-4614-9c3d-211f6a298d92-client-ca\") pod \"route-controller-manager-6fbb8b9b87-cdzh2\" (UID: \"d2d63d0b-0d29-4614-9c3d-211f6a298d92\") " pod="openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2"
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.650714 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2d63d0b-0d29-4614-9c3d-211f6a298d92-config\") pod \"route-controller-manager-6fbb8b9b87-cdzh2\" (UID: \"d2d63d0b-0d29-4614-9c3d-211f6a298d92\") " pod="openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2"
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.650739 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2d63d0b-0d29-4614-9c3d-211f6a298d92-serving-cert\") pod \"route-controller-manager-6fbb8b9b87-cdzh2\" (UID: \"d2d63d0b-0d29-4614-9c3d-211f6a298d92\") " pod="openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2"
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.650758 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmrhf\" (UniqueName: \"kubernetes.io/projected/d2d63d0b-0d29-4614-9c3d-211f6a298d92-kube-api-access-bmrhf\") pod \"route-controller-manager-6fbb8b9b87-cdzh2\" (UID: \"d2d63d0b-0d29-4614-9c3d-211f6a298d92\") " pod="openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2"
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.650826 4975 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2798fc75-d009-42cb-8bf3-e834d4271024-client-ca\") on node \"crc\" DevicePath \"\""
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.650837 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2798fc75-d009-42cb-8bf3-e834d4271024-config\") on node \"crc\" DevicePath \"\""
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.650845 4975 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2798fc75-d009-42cb-8bf3-e834d4271024-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.650854 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldtpj\" (UniqueName: \"kubernetes.io/projected/2798fc75-d009-42cb-8bf3-e834d4271024-kube-api-access-ldtpj\") on node \"crc\" DevicePath \"\""
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.651568 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2d63d0b-0d29-4614-9c3d-211f6a298d92-client-ca\") pod \"route-controller-manager-6fbb8b9b87-cdzh2\" (UID: \"d2d63d0b-0d29-4614-9c3d-211f6a298d92\") " pod="openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2"
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.652256 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2d63d0b-0d29-4614-9c3d-211f6a298d92-config\") pod \"route-controller-manager-6fbb8b9b87-cdzh2\" (UID: \"d2d63d0b-0d29-4614-9c3d-211f6a298d92\") " pod="openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2"
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.654193 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w"
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.654598 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2d63d0b-0d29-4614-9c3d-211f6a298d92-serving-cert\") pod \"route-controller-manager-6fbb8b9b87-cdzh2\" (UID: \"d2d63d0b-0d29-4614-9c3d-211f6a298d92\") " pod="openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2"
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.668753 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmrhf\" (UniqueName: \"kubernetes.io/projected/d2d63d0b-0d29-4614-9c3d-211f6a298d92-kube-api-access-bmrhf\") pod \"route-controller-manager-6fbb8b9b87-cdzh2\" (UID: \"d2d63d0b-0d29-4614-9c3d-211f6a298d92\") " pod="openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2"
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.751561 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk2fc\" (UniqueName: \"kubernetes.io/projected/01f8de13-7985-44d3-a987-949f63570885-kube-api-access-bk2fc\") pod \"01f8de13-7985-44d3-a987-949f63570885\" (UID: \"01f8de13-7985-44d3-a987-949f63570885\") "
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.751684 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01f8de13-7985-44d3-a987-949f63570885-serving-cert\") pod \"01f8de13-7985-44d3-a987-949f63570885\" (UID: \"01f8de13-7985-44d3-a987-949f63570885\") "
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.751712 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01f8de13-7985-44d3-a987-949f63570885-client-ca\") pod \"01f8de13-7985-44d3-a987-949f63570885\" (UID: \"01f8de13-7985-44d3-a987-949f63570885\") "
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.751737 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01f8de13-7985-44d3-a987-949f63570885-config\") pod \"01f8de13-7985-44d3-a987-949f63570885\" (UID: \"01f8de13-7985-44d3-a987-949f63570885\") "
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.751774 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/01f8de13-7985-44d3-a987-949f63570885-proxy-ca-bundles\") pod \"01f8de13-7985-44d3-a987-949f63570885\" (UID: \"01f8de13-7985-44d3-a987-949f63570885\") "
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.752592 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01f8de13-7985-44d3-a987-949f63570885-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "01f8de13-7985-44d3-a987-949f63570885" (UID: "01f8de13-7985-44d3-a987-949f63570885"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.752719 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01f8de13-7985-44d3-a987-949f63570885-client-ca" (OuterVolumeSpecName: "client-ca") pod "01f8de13-7985-44d3-a987-949f63570885" (UID: "01f8de13-7985-44d3-a987-949f63570885"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.752732 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01f8de13-7985-44d3-a987-949f63570885-config" (OuterVolumeSpecName: "config") pod "01f8de13-7985-44d3-a987-949f63570885" (UID: "01f8de13-7985-44d3-a987-949f63570885"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.754558 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01f8de13-7985-44d3-a987-949f63570885-kube-api-access-bk2fc" (OuterVolumeSpecName: "kube-api-access-bk2fc") pod "01f8de13-7985-44d3-a987-949f63570885" (UID: "01f8de13-7985-44d3-a987-949f63570885"). InnerVolumeSpecName "kube-api-access-bk2fc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.755517 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01f8de13-7985-44d3-a987-949f63570885-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01f8de13-7985-44d3-a987-949f63570885" (UID: "01f8de13-7985-44d3-a987-949f63570885"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.758830 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2"
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.853388 4975 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01f8de13-7985-44d3-a987-949f63570885-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.853644 4975 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01f8de13-7985-44d3-a987-949f63570885-client-ca\") on node \"crc\" DevicePath \"\""
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.853707 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01f8de13-7985-44d3-a987-949f63570885-config\") on node \"crc\" DevicePath \"\""
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.853770 4975 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/01f8de13-7985-44d3-a987-949f63570885-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 18 12:14:59 crc kubenswrapper[4975]: I0318 12:14:59.853824 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk2fc\" (UniqueName: \"kubernetes.io/projected/01f8de13-7985-44d3-a987-949f63570885-kube-api-access-bk2fc\") on node \"crc\" DevicePath \"\""
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.010485 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l" event={"ID":"2798fc75-d009-42cb-8bf3-e834d4271024","Type":"ContainerDied","Data":"e96c13f8476f01a8c19f6824c617012ef2a2835561e143d76cea75b1a21b536e"}
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.010510 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l"
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.010810 4975 scope.go:117] "RemoveContainer" containerID="554f7adc17c3f8cc530912b5bee4e32df809d74d9a4d33634850302c542597e0"
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.012938 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w" event={"ID":"01f8de13-7985-44d3-a987-949f63570885","Type":"ContainerDied","Data":"ca688add4f90f5318093a5d1203754bbe6e55df99908ee3fde0e50c3368206f4"}
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.013016 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w"
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.039457 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l"]
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.045883 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76dff5dcbd-rlr6l"]
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.054727 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w"]
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.057404 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-75cd8bcbdd-qs87w"]
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.132536 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563935-48zj8"]
Mar 18 12:15:00 crc kubenswrapper[4975]: E0318 12:15:00.132761 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f8de13-7985-44d3-a987-949f63570885" containerName="controller-manager"
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.132777 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f8de13-7985-44d3-a987-949f63570885" containerName="controller-manager"
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.132938 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="01f8de13-7985-44d3-a987-949f63570885" containerName="controller-manager"
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.133422 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-48zj8"
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.135656 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.135743 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.139720 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-d4zht"
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.143123 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563935-48zj8"]
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.258349 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85462245-ccc9-46f5-8bcb-a2648e9f1488-secret-volume\") pod \"collect-profiles-29563935-48zj8\" (UID: \"85462245-ccc9-46f5-8bcb-a2648e9f1488\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-48zj8"
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.258488 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8cbr\" (UniqueName: \"kubernetes.io/projected/85462245-ccc9-46f5-8bcb-a2648e9f1488-kube-api-access-w8cbr\") pod \"collect-profiles-29563935-48zj8\" (UID: \"85462245-ccc9-46f5-8bcb-a2648e9f1488\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-48zj8"
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.258516 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85462245-ccc9-46f5-8bcb-a2648e9f1488-config-volume\") pod \"collect-profiles-29563935-48zj8\" (UID: \"85462245-ccc9-46f5-8bcb-a2648e9f1488\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-48zj8"
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.360171 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8cbr\" (UniqueName: \"kubernetes.io/projected/85462245-ccc9-46f5-8bcb-a2648e9f1488-kube-api-access-w8cbr\") pod \"collect-profiles-29563935-48zj8\" (UID: \"85462245-ccc9-46f5-8bcb-a2648e9f1488\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-48zj8"
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.360233 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85462245-ccc9-46f5-8bcb-a2648e9f1488-config-volume\") pod \"collect-profiles-29563935-48zj8\" (UID: \"85462245-ccc9-46f5-8bcb-a2648e9f1488\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-48zj8"
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.360270 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85462245-ccc9-46f5-8bcb-a2648e9f1488-secret-volume\") pod \"collect-profiles-29563935-48zj8\" (UID: \"85462245-ccc9-46f5-8bcb-a2648e9f1488\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-48zj8"
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.399135 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85462245-ccc9-46f5-8bcb-a2648e9f1488-config-volume\") pod \"collect-profiles-29563935-48zj8\" (UID: \"85462245-ccc9-46f5-8bcb-a2648e9f1488\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-48zj8"
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.400926 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85462245-ccc9-46f5-8bcb-a2648e9f1488-secret-volume\") pod \"collect-profiles-29563935-48zj8\" (UID: \"85462245-ccc9-46f5-8bcb-a2648e9f1488\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-48zj8"
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.404411 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8cbr\" (UniqueName: \"kubernetes.io/projected/85462245-ccc9-46f5-8bcb-a2648e9f1488-kube-api-access-w8cbr\") pod \"collect-profiles-29563935-48zj8\" (UID: \"85462245-ccc9-46f5-8bcb-a2648e9f1488\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-48zj8"
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.435615 4975 scope.go:117] "RemoveContainer" containerID="3f448903a5761a43f05868a401b39e315a2fb0794e5505dfc66c0ef2208c10b0"
Mar 18 12:15:00 crc kubenswrapper[4975]: I0318 12:15:00.451196 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-48zj8"
Mar 18 12:15:01 crc kubenswrapper[4975]: I0318 12:15:01.024433 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01f8de13-7985-44d3-a987-949f63570885" path="/var/lib/kubelet/pods/01f8de13-7985-44d3-a987-949f63570885/volumes"
Mar 18 12:15:01 crc kubenswrapper[4975]: I0318 12:15:01.025460 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2798fc75-d009-42cb-8bf3-e834d4271024" path="/var/lib/kubelet/pods/2798fc75-d009-42cb-8bf3-e834d4271024/volumes"
Mar 18 12:15:01 crc kubenswrapper[4975]: I0318 12:15:01.247823 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c5xm6"
Mar 18 12:15:01 crc kubenswrapper[4975]: I0318 12:15:01.247903 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c5xm6"
Mar 18 12:15:01 crc kubenswrapper[4975]: I0318 12:15:01.328013 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2"]
Mar 18 12:15:01 crc kubenswrapper[4975]: I0318 12:15:01.330428 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c5xm6"
Mar 18 12:15:01 crc kubenswrapper[4975]: W0318 12:15:01.791467 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2d63d0b_0d29_4614_9c3d_211f6a298d92.slice/crio-e986cd9886e5161433714c3cc5854b815c4aa70da65b129ca8e6dbc1813d70f9 WatchSource:0}: Error finding container e986cd9886e5161433714c3cc5854b815c4aa70da65b129ca8e6dbc1813d70f9: Status 404 returned error can't find the container with id e986cd9886e5161433714c3cc5854b815c4aa70da65b129ca8e6dbc1813d70f9
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.027609 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2" event={"ID":"d2d63d0b-0d29-4614-9c3d-211f6a298d92","Type":"ContainerStarted","Data":"e986cd9886e5161433714c3cc5854b815c4aa70da65b129ca8e6dbc1813d70f9"}
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.084931 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c5xm6"
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.213974 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563935-48zj8"]
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.360651 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8bc487549-7nvhs"]
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.361600 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs"
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.364738 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.364767 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.364747 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.365211 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.365243 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.370177 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.372761 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.374517 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8bc487549-7nvhs"]
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.493394 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bd18011-0374-4806-96e3-613abb41d92e-proxy-ca-bundles\") pod \"controller-manager-8bc487549-7nvhs\" (UID: \"9bd18011-0374-4806-96e3-613abb41d92e\") " pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs"
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.493432 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bd18011-0374-4806-96e3-613abb41d92e-serving-cert\") pod \"controller-manager-8bc487549-7nvhs\" (UID: \"9bd18011-0374-4806-96e3-613abb41d92e\") " pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs"
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.493514 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bd18011-0374-4806-96e3-613abb41d92e-client-ca\") pod \"controller-manager-8bc487549-7nvhs\" (UID: \"9bd18011-0374-4806-96e3-613abb41d92e\") " pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs"
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.493545 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdtdz\" (UniqueName: \"kubernetes.io/projected/9bd18011-0374-4806-96e3-613abb41d92e-kube-api-access-qdtdz\") pod \"controller-manager-8bc487549-7nvhs\" (UID: \"9bd18011-0374-4806-96e3-613abb41d92e\") " pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs"
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.493579 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bd18011-0374-4806-96e3-613abb41d92e-config\") pod \"controller-manager-8bc487549-7nvhs\" (UID: \"9bd18011-0374-4806-96e3-613abb41d92e\") " pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs"
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.594526 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bd18011-0374-4806-96e3-613abb41d92e-client-ca\") pod \"controller-manager-8bc487549-7nvhs\" (UID: \"9bd18011-0374-4806-96e3-613abb41d92e\") " pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs"
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.594597 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdtdz\" (UniqueName: \"kubernetes.io/projected/9bd18011-0374-4806-96e3-613abb41d92e-kube-api-access-qdtdz\") pod \"controller-manager-8bc487549-7nvhs\" (UID: \"9bd18011-0374-4806-96e3-613abb41d92e\") " pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs"
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.594657 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bd18011-0374-4806-96e3-613abb41d92e-config\") pod \"controller-manager-8bc487549-7nvhs\" (UID: \"9bd18011-0374-4806-96e3-613abb41d92e\") " pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs"
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.594695 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bd18011-0374-4806-96e3-613abb41d92e-proxy-ca-bundles\") pod \"controller-manager-8bc487549-7nvhs\" (UID: \"9bd18011-0374-4806-96e3-613abb41d92e\") " pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs"
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.594720 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bd18011-0374-4806-96e3-613abb41d92e-serving-cert\") pod \"controller-manager-8bc487549-7nvhs\" (UID: \"9bd18011-0374-4806-96e3-613abb41d92e\") " pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs"
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.595698 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bd18011-0374-4806-96e3-613abb41d92e-client-ca\") pod \"controller-manager-8bc487549-7nvhs\" (UID: \"9bd18011-0374-4806-96e3-613abb41d92e\") " pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs"
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.596679 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bd18011-0374-4806-96e3-613abb41d92e-config\") pod \"controller-manager-8bc487549-7nvhs\" (UID: \"9bd18011-0374-4806-96e3-613abb41d92e\") " pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs"
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.598395 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bd18011-0374-4806-96e3-613abb41d92e-proxy-ca-bundles\") pod \"controller-manager-8bc487549-7nvhs\" (UID: \"9bd18011-0374-4806-96e3-613abb41d92e\") " pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs"
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.604630 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bd18011-0374-4806-96e3-613abb41d92e-serving-cert\") pod \"controller-manager-8bc487549-7nvhs\" (UID: \"9bd18011-0374-4806-96e3-613abb41d92e\") " pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs"
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.615604 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdtdz\" (UniqueName: \"kubernetes.io/projected/9bd18011-0374-4806-96e3-613abb41d92e-kube-api-access-qdtdz\") pod \"controller-manager-8bc487549-7nvhs\" (UID: \"9bd18011-0374-4806-96e3-613abb41d92e\") " pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs"
Mar 18 12:15:02 crc kubenswrapper[4975]: I0318 12:15:02.679893 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs"
Mar 18 12:15:03 crc kubenswrapper[4975]: I0318 12:15:03.035879 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnfcw" event={"ID":"3b24c4ea-1b55-429c-97f5-376523ea1a52","Type":"ContainerStarted","Data":"70a24895c4068290f45c505e21a1f9e819b9750e1a3cd6c3b5f21979538a89b8"}
Mar 18 12:15:03 crc kubenswrapper[4975]: I0318 12:15:03.038193 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2" event={"ID":"d2d63d0b-0d29-4614-9c3d-211f6a298d92","Type":"ContainerStarted","Data":"9db20d18fe6b2cd8a3fdf13e6f17f6767a6b5572bcf0b1ea915b8946f3e930db"}
Mar 18 12:15:03 crc kubenswrapper[4975]: I0318 12:15:03.040538 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whr68" event={"ID":"21b9dc77-7653-4684-ba67-cece256c42e2","Type":"ContainerStarted","Data":"5dbf64d104d282f2010bd4d5c1d6483884c084c09ed27b2cb7589c1b194fccc1"}
Mar 18 12:15:03 crc kubenswrapper[4975]: I0318 12:15:03.041573 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-48zj8" event={"ID":"85462245-ccc9-46f5-8bcb-a2648e9f1488","Type":"ContainerStarted","Data":"dec493fc8b33842aa53fe5a580909db89917ef4eddd56fc18da9262b7e1ad2ce"}
Mar 18 12:15:03 crc kubenswrapper[4975]: I0318 12:15:03.043167 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2"
Mar 18 12:15:03 crc kubenswrapper[4975]: I0318 12:15:03.048550 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2"
Mar 18 12:15:03 crc kubenswrapper[4975]: I0318 12:15:03.095132 4975 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2" podStartSLOduration=5.095111914 podStartE2EDuration="5.095111914s" podCreationTimestamp="2026-03-18 12:14:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:15:03.092542653 +0000 UTC m=+288.806943232" watchObservedRunningTime="2026-03-18 12:15:03.095111914 +0000 UTC m=+288.809512493" Mar 18 12:15:03 crc kubenswrapper[4975]: I0318 12:15:03.096138 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-whr68" podStartSLOduration=5.07977706 podStartE2EDuration="1m1.096130552s" podCreationTimestamp="2026-03-18 12:14:02 +0000 UTC" firstStartedPulling="2026-03-18 12:14:03.98619177 +0000 UTC m=+229.700592349" lastFinishedPulling="2026-03-18 12:15:00.002545262 +0000 UTC m=+285.716945841" observedRunningTime="2026-03-18 12:15:03.077446385 +0000 UTC m=+288.791846974" watchObservedRunningTime="2026-03-18 12:15:03.096130552 +0000 UTC m=+288.810531131" Mar 18 12:15:03 crc kubenswrapper[4975]: I0318 12:15:03.454761 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c5xm6"] Mar 18 12:15:04 crc kubenswrapper[4975]: I0318 12:15:04.048336 4975 generic.go:334] "Generic (PLEG): container finished" podID="3b24c4ea-1b55-429c-97f5-376523ea1a52" containerID="70a24895c4068290f45c505e21a1f9e819b9750e1a3cd6c3b5f21979538a89b8" exitCode=0 Mar 18 12:15:04 crc kubenswrapper[4975]: I0318 12:15:04.048407 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnfcw" event={"ID":"3b24c4ea-1b55-429c-97f5-376523ea1a52","Type":"ContainerDied","Data":"70a24895c4068290f45c505e21a1f9e819b9750e1a3cd6c3b5f21979538a89b8"} Mar 18 12:15:04 crc kubenswrapper[4975]: I0318 12:15:04.048581 4975 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/certified-operators-c5xm6" podUID="94a0b37e-4423-421c-910e-658cb59e08c8" containerName="registry-server" containerID="cri-o://0c85f5698a37155e9aaad69812cc146a7a356b6f81581ecff39443f95a5ef96f" gracePeriod=2 Mar 18 12:15:04 crc kubenswrapper[4975]: I0318 12:15:04.196703 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dswgh" Mar 18 12:15:04 crc kubenswrapper[4975]: I0318 12:15:04.235616 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dswgh" Mar 18 12:15:04 crc kubenswrapper[4975]: I0318 12:15:04.352474 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lbw76" Mar 18 12:15:04 crc kubenswrapper[4975]: I0318 12:15:04.387838 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lbw76" Mar 18 12:15:05 crc kubenswrapper[4975]: I0318 12:15:05.059210 4975 generic.go:334] "Generic (PLEG): container finished" podID="94a0b37e-4423-421c-910e-658cb59e08c8" containerID="0c85f5698a37155e9aaad69812cc146a7a356b6f81581ecff39443f95a5ef96f" exitCode=0 Mar 18 12:15:05 crc kubenswrapper[4975]: I0318 12:15:05.059311 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c5xm6" event={"ID":"94a0b37e-4423-421c-910e-658cb59e08c8","Type":"ContainerDied","Data":"0c85f5698a37155e9aaad69812cc146a7a356b6f81581ecff39443f95a5ef96f"} Mar 18 12:15:05 crc kubenswrapper[4975]: I0318 12:15:05.281646 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8bc487549-7nvhs"] Mar 18 12:15:05 crc kubenswrapper[4975]: W0318 12:15:05.281783 4975 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bd18011_0374_4806_96e3_613abb41d92e.slice/crio-bf2d8e76018f586064839515f2694801ed43e83e2845453fdf449dc68fbf4d0a WatchSource:0}: Error finding container bf2d8e76018f586064839515f2694801ed43e83e2845453fdf449dc68fbf4d0a: Status 404 returned error can't find the container with id bf2d8e76018f586064839515f2694801ed43e83e2845453fdf449dc68fbf4d0a Mar 18 12:15:06 crc kubenswrapper[4975]: I0318 12:15:06.068045 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94vnq" event={"ID":"9002f360-1ea5-4b24-a49a-69f46a658936","Type":"ContainerStarted","Data":"aa2086afdd7eaf40d97eb9b328b0b3e9a5bf7f56f67b3741f4d70138d5ec1515"} Mar 18 12:15:06 crc kubenswrapper[4975]: I0318 12:15:06.074644 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f75bb" event={"ID":"46171d59-3549-4843-b6eb-07b9eecd2560","Type":"ContainerStarted","Data":"b890c6e23112f402b556e7d927e2abe518eaf032d4e2de0a0fdbc4168240ce98"} Mar 18 12:15:06 crc kubenswrapper[4975]: I0318 12:15:06.080466 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs" event={"ID":"9bd18011-0374-4806-96e3-613abb41d92e","Type":"ContainerStarted","Data":"8d653b94b4ea5e3e640061a2407f9d2c388716f877c231b9105945a86360f981"} Mar 18 12:15:06 crc kubenswrapper[4975]: I0318 12:15:06.080513 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs" event={"ID":"9bd18011-0374-4806-96e3-613abb41d92e","Type":"ContainerStarted","Data":"bf2d8e76018f586064839515f2694801ed43e83e2845453fdf449dc68fbf4d0a"} Mar 18 12:15:06 crc kubenswrapper[4975]: I0318 12:15:06.081969 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-48zj8" 
event={"ID":"85462245-ccc9-46f5-8bcb-a2648e9f1488","Type":"ContainerStarted","Data":"9b76c3ce1af627eabe894e32aa1dee49016d590c39ebd040ec8c20cde28b1604"} Mar 18 12:15:06 crc kubenswrapper[4975]: I0318 12:15:06.092231 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-94vnq" podStartSLOduration=5.193807982 podStartE2EDuration="1m6.092216172s" podCreationTimestamp="2026-03-18 12:14:00 +0000 UTC" firstStartedPulling="2026-03-18 12:14:03.945654165 +0000 UTC m=+229.660054744" lastFinishedPulling="2026-03-18 12:15:04.844062345 +0000 UTC m=+290.558462934" observedRunningTime="2026-03-18 12:15:06.088990843 +0000 UTC m=+291.803391412" watchObservedRunningTime="2026-03-18 12:15:06.092216172 +0000 UTC m=+291.806616751" Mar 18 12:15:06 crc kubenswrapper[4975]: I0318 12:15:06.357283 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c5xm6" Mar 18 12:15:06 crc kubenswrapper[4975]: I0318 12:15:06.378609 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-48zj8" podStartSLOduration=6.378587385 podStartE2EDuration="6.378587385s" podCreationTimestamp="2026-03-18 12:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:15:06.13583544 +0000 UTC m=+291.850236039" watchObservedRunningTime="2026-03-18 12:15:06.378587385 +0000 UTC m=+292.092987964" Mar 18 12:15:06 crc kubenswrapper[4975]: I0318 12:15:06.448440 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a0b37e-4423-421c-910e-658cb59e08c8-catalog-content\") pod \"94a0b37e-4423-421c-910e-658cb59e08c8\" (UID: \"94a0b37e-4423-421c-910e-658cb59e08c8\") " Mar 18 12:15:06 crc kubenswrapper[4975]: I0318 12:15:06.448530 
4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlsx7\" (UniqueName: \"kubernetes.io/projected/94a0b37e-4423-421c-910e-658cb59e08c8-kube-api-access-wlsx7\") pod \"94a0b37e-4423-421c-910e-658cb59e08c8\" (UID: \"94a0b37e-4423-421c-910e-658cb59e08c8\") " Mar 18 12:15:06 crc kubenswrapper[4975]: I0318 12:15:06.448641 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a0b37e-4423-421c-910e-658cb59e08c8-utilities\") pod \"94a0b37e-4423-421c-910e-658cb59e08c8\" (UID: \"94a0b37e-4423-421c-910e-658cb59e08c8\") " Mar 18 12:15:06 crc kubenswrapper[4975]: I0318 12:15:06.451893 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94a0b37e-4423-421c-910e-658cb59e08c8-utilities" (OuterVolumeSpecName: "utilities") pod "94a0b37e-4423-421c-910e-658cb59e08c8" (UID: "94a0b37e-4423-421c-910e-658cb59e08c8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:15:06 crc kubenswrapper[4975]: I0318 12:15:06.472042 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94a0b37e-4423-421c-910e-658cb59e08c8-kube-api-access-wlsx7" (OuterVolumeSpecName: "kube-api-access-wlsx7") pod "94a0b37e-4423-421c-910e-658cb59e08c8" (UID: "94a0b37e-4423-421c-910e-658cb59e08c8"). InnerVolumeSpecName "kube-api-access-wlsx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:15:06 crc kubenswrapper[4975]: I0318 12:15:06.516943 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94a0b37e-4423-421c-910e-658cb59e08c8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94a0b37e-4423-421c-910e-658cb59e08c8" (UID: "94a0b37e-4423-421c-910e-658cb59e08c8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:15:06 crc kubenswrapper[4975]: I0318 12:15:06.550608 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlsx7\" (UniqueName: \"kubernetes.io/projected/94a0b37e-4423-421c-910e-658cb59e08c8-kube-api-access-wlsx7\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:06 crc kubenswrapper[4975]: I0318 12:15:06.550643 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a0b37e-4423-421c-910e-658cb59e08c8-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:06 crc kubenswrapper[4975]: I0318 12:15:06.550657 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a0b37e-4423-421c-910e-658cb59e08c8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:07 crc kubenswrapper[4975]: I0318 12:15:07.088859 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hprxg" event={"ID":"a7a76930-86ba-4055-85e0-6053832da1aa","Type":"ContainerStarted","Data":"9995283fb4e2f5f55b62e97adff14daa9e743b017358e07549a8bd2e6ab91a84"} Mar 18 12:15:07 crc kubenswrapper[4975]: I0318 12:15:07.091841 4975 generic.go:334] "Generic (PLEG): container finished" podID="46171d59-3549-4843-b6eb-07b9eecd2560" containerID="b890c6e23112f402b556e7d927e2abe518eaf032d4e2de0a0fdbc4168240ce98" exitCode=0 Mar 18 12:15:07 crc kubenswrapper[4975]: I0318 12:15:07.091925 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f75bb" event={"ID":"46171d59-3549-4843-b6eb-07b9eecd2560","Type":"ContainerDied","Data":"b890c6e23112f402b556e7d927e2abe518eaf032d4e2de0a0fdbc4168240ce98"} Mar 18 12:15:07 crc kubenswrapper[4975]: I0318 12:15:07.097286 4975 generic.go:334] "Generic (PLEG): container finished" podID="85462245-ccc9-46f5-8bcb-a2648e9f1488" 
containerID="9b76c3ce1af627eabe894e32aa1dee49016d590c39ebd040ec8c20cde28b1604" exitCode=0 Mar 18 12:15:07 crc kubenswrapper[4975]: I0318 12:15:07.097360 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-48zj8" event={"ID":"85462245-ccc9-46f5-8bcb-a2648e9f1488","Type":"ContainerDied","Data":"9b76c3ce1af627eabe894e32aa1dee49016d590c39ebd040ec8c20cde28b1604"} Mar 18 12:15:07 crc kubenswrapper[4975]: I0318 12:15:07.101798 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c5xm6" event={"ID":"94a0b37e-4423-421c-910e-658cb59e08c8","Type":"ContainerDied","Data":"dcc9a37d62286a9df3c0b6ef6470633a426b21cbc39170c9d82bcbeb8baa4015"} Mar 18 12:15:07 crc kubenswrapper[4975]: I0318 12:15:07.101854 4975 scope.go:117] "RemoveContainer" containerID="0c85f5698a37155e9aaad69812cc146a7a356b6f81581ecff39443f95a5ef96f" Mar 18 12:15:07 crc kubenswrapper[4975]: I0318 12:15:07.101920 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c5xm6" Mar 18 12:15:07 crc kubenswrapper[4975]: I0318 12:15:07.104519 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnfcw" event={"ID":"3b24c4ea-1b55-429c-97f5-376523ea1a52","Type":"ContainerStarted","Data":"78e09328c9f62714bfcb9b955585ff4ba30be3e8bed3618b80b08618937b9c3e"} Mar 18 12:15:07 crc kubenswrapper[4975]: I0318 12:15:07.104852 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs" Mar 18 12:15:07 crc kubenswrapper[4975]: I0318 12:15:07.109886 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs" Mar 18 12:15:07 crc kubenswrapper[4975]: I0318 12:15:07.119354 4975 scope.go:117] "RemoveContainer" containerID="c08ce5c839bdd21969c1d9f12eca76efe2eadfeb126efd13ee6d5fe109762bcf" Mar 18 12:15:07 crc kubenswrapper[4975]: I0318 12:15:07.121170 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hprxg" podStartSLOduration=3.9252233629999997 podStartE2EDuration="1m7.121147315s" podCreationTimestamp="2026-03-18 12:14:00 +0000 UTC" firstStartedPulling="2026-03-18 12:14:02.906169484 +0000 UTC m=+228.620570063" lastFinishedPulling="2026-03-18 12:15:06.102093436 +0000 UTC m=+291.816494015" observedRunningTime="2026-03-18 12:15:07.117201756 +0000 UTC m=+292.831602335" watchObservedRunningTime="2026-03-18 12:15:07.121147315 +0000 UTC m=+292.835547904" Mar 18 12:15:07 crc kubenswrapper[4975]: I0318 12:15:07.137151 4975 scope.go:117] "RemoveContainer" containerID="9c8d7b22644cfa3e6e6bf88b9a83665cf9884b9f63a450f8831182f608634412" Mar 18 12:15:07 crc kubenswrapper[4975]: I0318 12:15:07.158345 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c5xm6"] Mar 18 12:15:07 
crc kubenswrapper[4975]: I0318 12:15:07.163908 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c5xm6"] Mar 18 12:15:07 crc kubenswrapper[4975]: I0318 12:15:07.176535 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs" podStartSLOduration=9.176515409 podStartE2EDuration="9.176515409s" podCreationTimestamp="2026-03-18 12:14:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:15:07.173551307 +0000 UTC m=+292.887951886" watchObservedRunningTime="2026-03-18 12:15:07.176515409 +0000 UTC m=+292.890915988" Mar 18 12:15:07 crc kubenswrapper[4975]: I0318 12:15:07.224568 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xnfcw" podStartSLOduration=3.88684796 podStartE2EDuration="1m7.22455282s" podCreationTimestamp="2026-03-18 12:14:00 +0000 UTC" firstStartedPulling="2026-03-18 12:14:02.906309257 +0000 UTC m=+228.620709836" lastFinishedPulling="2026-03-18 12:15:06.244014117 +0000 UTC m=+291.958414696" observedRunningTime="2026-03-18 12:15:07.22419635 +0000 UTC m=+292.938596939" watchObservedRunningTime="2026-03-18 12:15:07.22455282 +0000 UTC m=+292.938953389" Mar 18 12:15:07 crc kubenswrapper[4975]: I0318 12:15:07.855912 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lbw76"] Mar 18 12:15:07 crc kubenswrapper[4975]: I0318 12:15:07.856504 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lbw76" podUID="cedb84ff-ae71-4210-8f65-16441f4292ac" containerName="registry-server" containerID="cri-o://b4110a2008d9b23f3c77daa0c5e95cf50a7124e0d5a23942dfc264682aa6312a" gracePeriod=2 Mar 18 12:15:08 crc kubenswrapper[4975]: I0318 12:15:08.112081 4975 
generic.go:334] "Generic (PLEG): container finished" podID="cedb84ff-ae71-4210-8f65-16441f4292ac" containerID="b4110a2008d9b23f3c77daa0c5e95cf50a7124e0d5a23942dfc264682aa6312a" exitCode=0 Mar 18 12:15:08 crc kubenswrapper[4975]: I0318 12:15:08.112132 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lbw76" event={"ID":"cedb84ff-ae71-4210-8f65-16441f4292ac","Type":"ContainerDied","Data":"b4110a2008d9b23f3c77daa0c5e95cf50a7124e0d5a23942dfc264682aa6312a"} Mar 18 12:15:08 crc kubenswrapper[4975]: I0318 12:15:08.514698 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-48zj8" Mar 18 12:15:08 crc kubenswrapper[4975]: I0318 12:15:08.523138 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lbw76" Mar 18 12:15:08 crc kubenswrapper[4975]: I0318 12:15:08.579675 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7q6r4\" (UniqueName: \"kubernetes.io/projected/cedb84ff-ae71-4210-8f65-16441f4292ac-kube-api-access-7q6r4\") pod \"cedb84ff-ae71-4210-8f65-16441f4292ac\" (UID: \"cedb84ff-ae71-4210-8f65-16441f4292ac\") " Mar 18 12:15:08 crc kubenswrapper[4975]: I0318 12:15:08.579968 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cedb84ff-ae71-4210-8f65-16441f4292ac-utilities\") pod \"cedb84ff-ae71-4210-8f65-16441f4292ac\" (UID: \"cedb84ff-ae71-4210-8f65-16441f4292ac\") " Mar 18 12:15:08 crc kubenswrapper[4975]: I0318 12:15:08.580060 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8cbr\" (UniqueName: \"kubernetes.io/projected/85462245-ccc9-46f5-8bcb-a2648e9f1488-kube-api-access-w8cbr\") pod \"85462245-ccc9-46f5-8bcb-a2648e9f1488\" (UID: \"85462245-ccc9-46f5-8bcb-a2648e9f1488\") 
" Mar 18 12:15:08 crc kubenswrapper[4975]: I0318 12:15:08.580180 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85462245-ccc9-46f5-8bcb-a2648e9f1488-config-volume\") pod \"85462245-ccc9-46f5-8bcb-a2648e9f1488\" (UID: \"85462245-ccc9-46f5-8bcb-a2648e9f1488\") " Mar 18 12:15:08 crc kubenswrapper[4975]: I0318 12:15:08.580262 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85462245-ccc9-46f5-8bcb-a2648e9f1488-secret-volume\") pod \"85462245-ccc9-46f5-8bcb-a2648e9f1488\" (UID: \"85462245-ccc9-46f5-8bcb-a2648e9f1488\") " Mar 18 12:15:08 crc kubenswrapper[4975]: I0318 12:15:08.580398 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cedb84ff-ae71-4210-8f65-16441f4292ac-catalog-content\") pod \"cedb84ff-ae71-4210-8f65-16441f4292ac\" (UID: \"cedb84ff-ae71-4210-8f65-16441f4292ac\") " Mar 18 12:15:08 crc kubenswrapper[4975]: I0318 12:15:08.581129 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cedb84ff-ae71-4210-8f65-16441f4292ac-utilities" (OuterVolumeSpecName: "utilities") pod "cedb84ff-ae71-4210-8f65-16441f4292ac" (UID: "cedb84ff-ae71-4210-8f65-16441f4292ac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:15:08 crc kubenswrapper[4975]: I0318 12:15:08.581379 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85462245-ccc9-46f5-8bcb-a2648e9f1488-config-volume" (OuterVolumeSpecName: "config-volume") pod "85462245-ccc9-46f5-8bcb-a2648e9f1488" (UID: "85462245-ccc9-46f5-8bcb-a2648e9f1488"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:15:08 crc kubenswrapper[4975]: I0318 12:15:08.587158 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85462245-ccc9-46f5-8bcb-a2648e9f1488-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "85462245-ccc9-46f5-8bcb-a2648e9f1488" (UID: "85462245-ccc9-46f5-8bcb-a2648e9f1488"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:15:08 crc kubenswrapper[4975]: I0318 12:15:08.592230 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85462245-ccc9-46f5-8bcb-a2648e9f1488-kube-api-access-w8cbr" (OuterVolumeSpecName: "kube-api-access-w8cbr") pod "85462245-ccc9-46f5-8bcb-a2648e9f1488" (UID: "85462245-ccc9-46f5-8bcb-a2648e9f1488"). InnerVolumeSpecName "kube-api-access-w8cbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:15:08 crc kubenswrapper[4975]: I0318 12:15:08.592834 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cedb84ff-ae71-4210-8f65-16441f4292ac-kube-api-access-7q6r4" (OuterVolumeSpecName: "kube-api-access-7q6r4") pod "cedb84ff-ae71-4210-8f65-16441f4292ac" (UID: "cedb84ff-ae71-4210-8f65-16441f4292ac"). InnerVolumeSpecName "kube-api-access-7q6r4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:15:08 crc kubenswrapper[4975]: I0318 12:15:08.683520 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7q6r4\" (UniqueName: \"kubernetes.io/projected/cedb84ff-ae71-4210-8f65-16441f4292ac-kube-api-access-7q6r4\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:08 crc kubenswrapper[4975]: I0318 12:15:08.683551 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cedb84ff-ae71-4210-8f65-16441f4292ac-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:08 crc kubenswrapper[4975]: I0318 12:15:08.683562 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8cbr\" (UniqueName: \"kubernetes.io/projected/85462245-ccc9-46f5-8bcb-a2648e9f1488-kube-api-access-w8cbr\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:08 crc kubenswrapper[4975]: I0318 12:15:08.683571 4975 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85462245-ccc9-46f5-8bcb-a2648e9f1488-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:08 crc kubenswrapper[4975]: I0318 12:15:08.683579 4975 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85462245-ccc9-46f5-8bcb-a2648e9f1488-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:08 crc kubenswrapper[4975]: I0318 12:15:08.737049 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cedb84ff-ae71-4210-8f65-16441f4292ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cedb84ff-ae71-4210-8f65-16441f4292ac" (UID: "cedb84ff-ae71-4210-8f65-16441f4292ac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:15:08 crc kubenswrapper[4975]: I0318 12:15:08.785083 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cedb84ff-ae71-4210-8f65-16441f4292ac-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:09 crc kubenswrapper[4975]: I0318 12:15:09.024578 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94a0b37e-4423-421c-910e-658cb59e08c8" path="/var/lib/kubelet/pods/94a0b37e-4423-421c-910e-658cb59e08c8/volumes" Mar 18 12:15:09 crc kubenswrapper[4975]: I0318 12:15:09.122269 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lbw76" event={"ID":"cedb84ff-ae71-4210-8f65-16441f4292ac","Type":"ContainerDied","Data":"ecc209509b0446a559b724aad0d902836c816113e2e6cdf3ccb6bec44fab0b54"} Mar 18 12:15:09 crc kubenswrapper[4975]: I0318 12:15:09.122305 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lbw76" Mar 18 12:15:09 crc kubenswrapper[4975]: I0318 12:15:09.123750 4975 scope.go:117] "RemoveContainer" containerID="b4110a2008d9b23f3c77daa0c5e95cf50a7124e0d5a23942dfc264682aa6312a" Mar 18 12:15:09 crc kubenswrapper[4975]: I0318 12:15:09.124750 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-48zj8" event={"ID":"85462245-ccc9-46f5-8bcb-a2648e9f1488","Type":"ContainerDied","Data":"dec493fc8b33842aa53fe5a580909db89917ef4eddd56fc18da9262b7e1ad2ce"} Mar 18 12:15:09 crc kubenswrapper[4975]: I0318 12:15:09.124774 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563935-48zj8" Mar 18 12:15:09 crc kubenswrapper[4975]: I0318 12:15:09.125135 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dec493fc8b33842aa53fe5a580909db89917ef4eddd56fc18da9262b7e1ad2ce" Mar 18 12:15:09 crc kubenswrapper[4975]: I0318 12:15:09.141623 4975 scope.go:117] "RemoveContainer" containerID="06a129fa931062672542451d6f75d48e0d1653eb72af6a6fef5789e3fcf61d38" Mar 18 12:15:09 crc kubenswrapper[4975]: I0318 12:15:09.143137 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lbw76"] Mar 18 12:15:09 crc kubenswrapper[4975]: I0318 12:15:09.148096 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lbw76"] Mar 18 12:15:09 crc kubenswrapper[4975]: I0318 12:15:09.163050 4975 scope.go:117] "RemoveContainer" containerID="272acefdc7ed866baa4e1e025f29c33c8afc340cf50b83c0af293d18751c83ec" Mar 18 12:15:10 crc kubenswrapper[4975]: I0318 12:15:10.697808 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hprxg" Mar 18 12:15:10 crc kubenswrapper[4975]: I0318 12:15:10.699019 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hprxg" Mar 18 12:15:10 crc kubenswrapper[4975]: I0318 12:15:10.812289 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hprxg" Mar 18 12:15:10 crc kubenswrapper[4975]: I0318 12:15:10.892365 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xnfcw" Mar 18 12:15:10 crc kubenswrapper[4975]: I0318 12:15:10.892436 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xnfcw" Mar 18 12:15:10 crc 
kubenswrapper[4975]: I0318 12:15:10.949947 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xnfcw" Mar 18 12:15:11 crc kubenswrapper[4975]: I0318 12:15:11.023297 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cedb84ff-ae71-4210-8f65-16441f4292ac" path="/var/lib/kubelet/pods/cedb84ff-ae71-4210-8f65-16441f4292ac/volumes" Mar 18 12:15:11 crc kubenswrapper[4975]: I0318 12:15:11.141337 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f75bb" event={"ID":"46171d59-3549-4843-b6eb-07b9eecd2560","Type":"ContainerStarted","Data":"625745afe2c70e7fbcbeb00971e7c15ae66a9f676e8a646d9caf41d1563d91cc"} Mar 18 12:15:11 crc kubenswrapper[4975]: I0318 12:15:11.194496 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hprxg" Mar 18 12:15:11 crc kubenswrapper[4975]: I0318 12:15:11.197075 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xnfcw" Mar 18 12:15:11 crc kubenswrapper[4975]: I0318 12:15:11.213267 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f75bb" podStartSLOduration=4.369852645 podStartE2EDuration="1m9.213246837s" podCreationTimestamp="2026-03-18 12:14:02 +0000 UTC" firstStartedPulling="2026-03-18 12:14:05.152816331 +0000 UTC m=+230.867216910" lastFinishedPulling="2026-03-18 12:15:09.996210533 +0000 UTC m=+295.710611102" observedRunningTime="2026-03-18 12:15:11.178554446 +0000 UTC m=+296.892955175" watchObservedRunningTime="2026-03-18 12:15:11.213246837 +0000 UTC m=+296.927647416" Mar 18 12:15:11 crc kubenswrapper[4975]: I0318 12:15:11.480900 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-94vnq" Mar 18 12:15:11 crc kubenswrapper[4975]: I0318 
12:15:11.480955 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-94vnq" Mar 18 12:15:11 crc kubenswrapper[4975]: I0318 12:15:11.522398 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-94vnq" Mar 18 12:15:12 crc kubenswrapper[4975]: I0318 12:15:12.233251 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-94vnq" Mar 18 12:15:12 crc kubenswrapper[4975]: I0318 12:15:12.865912 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-whr68" Mar 18 12:15:12 crc kubenswrapper[4975]: I0318 12:15:12.865962 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-whr68" Mar 18 12:15:12 crc kubenswrapper[4975]: I0318 12:15:12.927808 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-whr68" Mar 18 12:15:13 crc kubenswrapper[4975]: I0318 12:15:13.197484 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-whr68" Mar 18 12:15:13 crc kubenswrapper[4975]: I0318 12:15:13.253065 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-94vnq"] Mar 18 12:15:13 crc kubenswrapper[4975]: I0318 12:15:13.319285 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f75bb" Mar 18 12:15:13 crc kubenswrapper[4975]: I0318 12:15:13.319334 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f75bb" Mar 18 12:15:13 crc kubenswrapper[4975]: I0318 12:15:13.361659 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-f75bb" Mar 18 12:15:14 crc kubenswrapper[4975]: I0318 12:15:14.161043 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-94vnq" podUID="9002f360-1ea5-4b24-a49a-69f46a658936" containerName="registry-server" containerID="cri-o://aa2086afdd7eaf40d97eb9b328b0b3e9a5bf7f56f67b3741f4d70138d5ec1515" gracePeriod=2 Mar 18 12:15:14 crc kubenswrapper[4975]: I0318 12:15:14.620097 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94vnq" Mar 18 12:15:14 crc kubenswrapper[4975]: I0318 12:15:14.765393 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9002f360-1ea5-4b24-a49a-69f46a658936-catalog-content\") pod \"9002f360-1ea5-4b24-a49a-69f46a658936\" (UID: \"9002f360-1ea5-4b24-a49a-69f46a658936\") " Mar 18 12:15:14 crc kubenswrapper[4975]: I0318 12:15:14.765489 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr6ms\" (UniqueName: \"kubernetes.io/projected/9002f360-1ea5-4b24-a49a-69f46a658936-kube-api-access-qr6ms\") pod \"9002f360-1ea5-4b24-a49a-69f46a658936\" (UID: \"9002f360-1ea5-4b24-a49a-69f46a658936\") " Mar 18 12:15:14 crc kubenswrapper[4975]: I0318 12:15:14.765551 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9002f360-1ea5-4b24-a49a-69f46a658936-utilities\") pod \"9002f360-1ea5-4b24-a49a-69f46a658936\" (UID: \"9002f360-1ea5-4b24-a49a-69f46a658936\") " Mar 18 12:15:14 crc kubenswrapper[4975]: I0318 12:15:14.766517 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9002f360-1ea5-4b24-a49a-69f46a658936-utilities" (OuterVolumeSpecName: "utilities") pod "9002f360-1ea5-4b24-a49a-69f46a658936" (UID: 
"9002f360-1ea5-4b24-a49a-69f46a658936"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:15:14 crc kubenswrapper[4975]: I0318 12:15:14.770992 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9002f360-1ea5-4b24-a49a-69f46a658936-kube-api-access-qr6ms" (OuterVolumeSpecName: "kube-api-access-qr6ms") pod "9002f360-1ea5-4b24-a49a-69f46a658936" (UID: "9002f360-1ea5-4b24-a49a-69f46a658936"). InnerVolumeSpecName "kube-api-access-qr6ms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:15:14 crc kubenswrapper[4975]: I0318 12:15:14.820848 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9002f360-1ea5-4b24-a49a-69f46a658936-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9002f360-1ea5-4b24-a49a-69f46a658936" (UID: "9002f360-1ea5-4b24-a49a-69f46a658936"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:15:14 crc kubenswrapper[4975]: I0318 12:15:14.867687 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr6ms\" (UniqueName: \"kubernetes.io/projected/9002f360-1ea5-4b24-a49a-69f46a658936-kube-api-access-qr6ms\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:14 crc kubenswrapper[4975]: I0318 12:15:14.867752 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9002f360-1ea5-4b24-a49a-69f46a658936-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:14 crc kubenswrapper[4975]: I0318 12:15:14.867774 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9002f360-1ea5-4b24-a49a-69f46a658936-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:15 crc kubenswrapper[4975]: I0318 12:15:15.169331 4975 generic.go:334] "Generic (PLEG): container finished" 
podID="9002f360-1ea5-4b24-a49a-69f46a658936" containerID="aa2086afdd7eaf40d97eb9b328b0b3e9a5bf7f56f67b3741f4d70138d5ec1515" exitCode=0 Mar 18 12:15:15 crc kubenswrapper[4975]: I0318 12:15:15.169377 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94vnq" event={"ID":"9002f360-1ea5-4b24-a49a-69f46a658936","Type":"ContainerDied","Data":"aa2086afdd7eaf40d97eb9b328b0b3e9a5bf7f56f67b3741f4d70138d5ec1515"} Mar 18 12:15:15 crc kubenswrapper[4975]: I0318 12:15:15.169406 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94vnq" event={"ID":"9002f360-1ea5-4b24-a49a-69f46a658936","Type":"ContainerDied","Data":"7d0fc2e9673a2da875e7aa58b6317983fed76cb9d3f922b9c3d95b645281ebf9"} Mar 18 12:15:15 crc kubenswrapper[4975]: I0318 12:15:15.169434 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94vnq" Mar 18 12:15:15 crc kubenswrapper[4975]: I0318 12:15:15.169440 4975 scope.go:117] "RemoveContainer" containerID="aa2086afdd7eaf40d97eb9b328b0b3e9a5bf7f56f67b3741f4d70138d5ec1515" Mar 18 12:15:15 crc kubenswrapper[4975]: I0318 12:15:15.184722 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-94vnq"] Mar 18 12:15:15 crc kubenswrapper[4975]: I0318 12:15:15.194152 4975 scope.go:117] "RemoveContainer" containerID="3d49650a84899bae1213dd2757d55a47c6a0f5855f1dc7896ee721a5f391962a" Mar 18 12:15:15 crc kubenswrapper[4975]: I0318 12:15:15.194692 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-94vnq"] Mar 18 12:15:15 crc kubenswrapper[4975]: I0318 12:15:15.208635 4975 scope.go:117] "RemoveContainer" containerID="af39051049ba1ad9f3198878892e841f84a0ac608bb1f5e2843077e815f928ef" Mar 18 12:15:15 crc kubenswrapper[4975]: I0318 12:15:15.226998 4975 scope.go:117] "RemoveContainer" 
containerID="aa2086afdd7eaf40d97eb9b328b0b3e9a5bf7f56f67b3741f4d70138d5ec1515" Mar 18 12:15:15 crc kubenswrapper[4975]: E0318 12:15:15.227455 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa2086afdd7eaf40d97eb9b328b0b3e9a5bf7f56f67b3741f4d70138d5ec1515\": container with ID starting with aa2086afdd7eaf40d97eb9b328b0b3e9a5bf7f56f67b3741f4d70138d5ec1515 not found: ID does not exist" containerID="aa2086afdd7eaf40d97eb9b328b0b3e9a5bf7f56f67b3741f4d70138d5ec1515" Mar 18 12:15:15 crc kubenswrapper[4975]: I0318 12:15:15.227495 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa2086afdd7eaf40d97eb9b328b0b3e9a5bf7f56f67b3741f4d70138d5ec1515"} err="failed to get container status \"aa2086afdd7eaf40d97eb9b328b0b3e9a5bf7f56f67b3741f4d70138d5ec1515\": rpc error: code = NotFound desc = could not find container \"aa2086afdd7eaf40d97eb9b328b0b3e9a5bf7f56f67b3741f4d70138d5ec1515\": container with ID starting with aa2086afdd7eaf40d97eb9b328b0b3e9a5bf7f56f67b3741f4d70138d5ec1515 not found: ID does not exist" Mar 18 12:15:15 crc kubenswrapper[4975]: I0318 12:15:15.227522 4975 scope.go:117] "RemoveContainer" containerID="3d49650a84899bae1213dd2757d55a47c6a0f5855f1dc7896ee721a5f391962a" Mar 18 12:15:15 crc kubenswrapper[4975]: E0318 12:15:15.227929 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d49650a84899bae1213dd2757d55a47c6a0f5855f1dc7896ee721a5f391962a\": container with ID starting with 3d49650a84899bae1213dd2757d55a47c6a0f5855f1dc7896ee721a5f391962a not found: ID does not exist" containerID="3d49650a84899bae1213dd2757d55a47c6a0f5855f1dc7896ee721a5f391962a" Mar 18 12:15:15 crc kubenswrapper[4975]: I0318 12:15:15.227950 4975 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3d49650a84899bae1213dd2757d55a47c6a0f5855f1dc7896ee721a5f391962a"} err="failed to get container status \"3d49650a84899bae1213dd2757d55a47c6a0f5855f1dc7896ee721a5f391962a\": rpc error: code = NotFound desc = could not find container \"3d49650a84899bae1213dd2757d55a47c6a0f5855f1dc7896ee721a5f391962a\": container with ID starting with 3d49650a84899bae1213dd2757d55a47c6a0f5855f1dc7896ee721a5f391962a not found: ID does not exist" Mar 18 12:15:15 crc kubenswrapper[4975]: I0318 12:15:15.227964 4975 scope.go:117] "RemoveContainer" containerID="af39051049ba1ad9f3198878892e841f84a0ac608bb1f5e2843077e815f928ef" Mar 18 12:15:15 crc kubenswrapper[4975]: E0318 12:15:15.228222 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af39051049ba1ad9f3198878892e841f84a0ac608bb1f5e2843077e815f928ef\": container with ID starting with af39051049ba1ad9f3198878892e841f84a0ac608bb1f5e2843077e815f928ef not found: ID does not exist" containerID="af39051049ba1ad9f3198878892e841f84a0ac608bb1f5e2843077e815f928ef" Mar 18 12:15:15 crc kubenswrapper[4975]: I0318 12:15:15.228243 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af39051049ba1ad9f3198878892e841f84a0ac608bb1f5e2843077e815f928ef"} err="failed to get container status \"af39051049ba1ad9f3198878892e841f84a0ac608bb1f5e2843077e815f928ef\": rpc error: code = NotFound desc = could not find container \"af39051049ba1ad9f3198878892e841f84a0ac608bb1f5e2843077e815f928ef\": container with ID starting with af39051049ba1ad9f3198878892e841f84a0ac608bb1f5e2843077e815f928ef not found: ID does not exist" Mar 18 12:15:17 crc kubenswrapper[4975]: I0318 12:15:17.023694 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9002f360-1ea5-4b24-a49a-69f46a658936" path="/var/lib/kubelet/pods/9002f360-1ea5-4b24-a49a-69f46a658936/volumes" Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 
12:15:18.162625 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8bc487549-7nvhs"] Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.162829 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs" podUID="9bd18011-0374-4806-96e3-613abb41d92e" containerName="controller-manager" containerID="cri-o://8d653b94b4ea5e3e640061a2407f9d2c388716f877c231b9105945a86360f981" gracePeriod=30 Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.261706 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2"] Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.261955 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2" podUID="d2d63d0b-0d29-4614-9c3d-211f6a298d92" containerName="route-controller-manager" containerID="cri-o://9db20d18fe6b2cd8a3fdf13e6f17f6767a6b5572bcf0b1ea915b8946f3e930db" gracePeriod=30 Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.756358 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2" Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.791361 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs" Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.831231 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2d63d0b-0d29-4614-9c3d-211f6a298d92-client-ca\") pod \"d2d63d0b-0d29-4614-9c3d-211f6a298d92\" (UID: \"d2d63d0b-0d29-4614-9c3d-211f6a298d92\") " Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.831286 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2d63d0b-0d29-4614-9c3d-211f6a298d92-serving-cert\") pod \"d2d63d0b-0d29-4614-9c3d-211f6a298d92\" (UID: \"d2d63d0b-0d29-4614-9c3d-211f6a298d92\") " Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.831347 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2d63d0b-0d29-4614-9c3d-211f6a298d92-config\") pod \"d2d63d0b-0d29-4614-9c3d-211f6a298d92\" (UID: \"d2d63d0b-0d29-4614-9c3d-211f6a298d92\") " Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.831374 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmrhf\" (UniqueName: \"kubernetes.io/projected/d2d63d0b-0d29-4614-9c3d-211f6a298d92-kube-api-access-bmrhf\") pod \"d2d63d0b-0d29-4614-9c3d-211f6a298d92\" (UID: \"d2d63d0b-0d29-4614-9c3d-211f6a298d92\") " Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.832054 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2d63d0b-0d29-4614-9c3d-211f6a298d92-client-ca" (OuterVolumeSpecName: "client-ca") pod "d2d63d0b-0d29-4614-9c3d-211f6a298d92" (UID: "d2d63d0b-0d29-4614-9c3d-211f6a298d92"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.832063 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2d63d0b-0d29-4614-9c3d-211f6a298d92-config" (OuterVolumeSpecName: "config") pod "d2d63d0b-0d29-4614-9c3d-211f6a298d92" (UID: "d2d63d0b-0d29-4614-9c3d-211f6a298d92"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.836724 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2d63d0b-0d29-4614-9c3d-211f6a298d92-kube-api-access-bmrhf" (OuterVolumeSpecName: "kube-api-access-bmrhf") pod "d2d63d0b-0d29-4614-9c3d-211f6a298d92" (UID: "d2d63d0b-0d29-4614-9c3d-211f6a298d92"). InnerVolumeSpecName "kube-api-access-bmrhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.836837 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2d63d0b-0d29-4614-9c3d-211f6a298d92-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d2d63d0b-0d29-4614-9c3d-211f6a298d92" (UID: "d2d63d0b-0d29-4614-9c3d-211f6a298d92"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.932205 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bd18011-0374-4806-96e3-613abb41d92e-proxy-ca-bundles\") pod \"9bd18011-0374-4806-96e3-613abb41d92e\" (UID: \"9bd18011-0374-4806-96e3-613abb41d92e\") " Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.932322 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bd18011-0374-4806-96e3-613abb41d92e-serving-cert\") pod \"9bd18011-0374-4806-96e3-613abb41d92e\" (UID: \"9bd18011-0374-4806-96e3-613abb41d92e\") " Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.932381 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bd18011-0374-4806-96e3-613abb41d92e-client-ca\") pod \"9bd18011-0374-4806-96e3-613abb41d92e\" (UID: \"9bd18011-0374-4806-96e3-613abb41d92e\") " Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.932404 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bd18011-0374-4806-96e3-613abb41d92e-config\") pod \"9bd18011-0374-4806-96e3-613abb41d92e\" (UID: \"9bd18011-0374-4806-96e3-613abb41d92e\") " Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.932451 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdtdz\" (UniqueName: \"kubernetes.io/projected/9bd18011-0374-4806-96e3-613abb41d92e-kube-api-access-qdtdz\") pod \"9bd18011-0374-4806-96e3-613abb41d92e\" (UID: \"9bd18011-0374-4806-96e3-613abb41d92e\") " Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.933002 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/9bd18011-0374-4806-96e3-613abb41d92e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9bd18011-0374-4806-96e3-613abb41d92e" (UID: "9bd18011-0374-4806-96e3-613abb41d92e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.933038 4975 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2d63d0b-0d29-4614-9c3d-211f6a298d92-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.933049 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bd18011-0374-4806-96e3-613abb41d92e-config" (OuterVolumeSpecName: "config") pod "9bd18011-0374-4806-96e3-613abb41d92e" (UID: "9bd18011-0374-4806-96e3-613abb41d92e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.933064 4975 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2d63d0b-0d29-4614-9c3d-211f6a298d92-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.933083 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2d63d0b-0d29-4614-9c3d-211f6a298d92-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.933099 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmrhf\" (UniqueName: \"kubernetes.io/projected/d2d63d0b-0d29-4614-9c3d-211f6a298d92-kube-api-access-bmrhf\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.933121 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bd18011-0374-4806-96e3-613abb41d92e-client-ca" 
(OuterVolumeSpecName: "client-ca") pod "9bd18011-0374-4806-96e3-613abb41d92e" (UID: "9bd18011-0374-4806-96e3-613abb41d92e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.935312 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bd18011-0374-4806-96e3-613abb41d92e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9bd18011-0374-4806-96e3-613abb41d92e" (UID: "9bd18011-0374-4806-96e3-613abb41d92e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:15:18 crc kubenswrapper[4975]: I0318 12:15:18.936086 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bd18011-0374-4806-96e3-613abb41d92e-kube-api-access-qdtdz" (OuterVolumeSpecName: "kube-api-access-qdtdz") pod "9bd18011-0374-4806-96e3-613abb41d92e" (UID: "9bd18011-0374-4806-96e3-613abb41d92e"). InnerVolumeSpecName "kube-api-access-qdtdz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.033934 4975 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bd18011-0374-4806-96e3-613abb41d92e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.033959 4975 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bd18011-0374-4806-96e3-613abb41d92e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.033968 4975 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bd18011-0374-4806-96e3-613abb41d92e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.033976 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bd18011-0374-4806-96e3-613abb41d92e-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.033984 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdtdz\" (UniqueName: \"kubernetes.io/projected/9bd18011-0374-4806-96e3-613abb41d92e-kube-api-access-qdtdz\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.193554 4975 generic.go:334] "Generic (PLEG): container finished" podID="d2d63d0b-0d29-4614-9c3d-211f6a298d92" containerID="9db20d18fe6b2cd8a3fdf13e6f17f6767a6b5572bcf0b1ea915b8946f3e930db" exitCode=0 Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.193618 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2" event={"ID":"d2d63d0b-0d29-4614-9c3d-211f6a298d92","Type":"ContainerDied","Data":"9db20d18fe6b2cd8a3fdf13e6f17f6767a6b5572bcf0b1ea915b8946f3e930db"} Mar 18 
12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.193664 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.193707 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2" event={"ID":"d2d63d0b-0d29-4614-9c3d-211f6a298d92","Type":"ContainerDied","Data":"e986cd9886e5161433714c3cc5854b815c4aa70da65b129ca8e6dbc1813d70f9"} Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.193738 4975 scope.go:117] "RemoveContainer" containerID="9db20d18fe6b2cd8a3fdf13e6f17f6767a6b5572bcf0b1ea915b8946f3e930db" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.195950 4975 generic.go:334] "Generic (PLEG): container finished" podID="9bd18011-0374-4806-96e3-613abb41d92e" containerID="8d653b94b4ea5e3e640061a2407f9d2c388716f877c231b9105945a86360f981" exitCode=0 Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.195979 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs" event={"ID":"9bd18011-0374-4806-96e3-613abb41d92e","Type":"ContainerDied","Data":"8d653b94b4ea5e3e640061a2407f9d2c388716f877c231b9105945a86360f981"} Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.196001 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs" event={"ID":"9bd18011-0374-4806-96e3-613abb41d92e","Type":"ContainerDied","Data":"bf2d8e76018f586064839515f2694801ed43e83e2845453fdf449dc68fbf4d0a"} Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.196035 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8bc487549-7nvhs" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.215589 4975 scope.go:117] "RemoveContainer" containerID="9db20d18fe6b2cd8a3fdf13e6f17f6767a6b5572bcf0b1ea915b8946f3e930db" Mar 18 12:15:19 crc kubenswrapper[4975]: E0318 12:15:19.216123 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9db20d18fe6b2cd8a3fdf13e6f17f6767a6b5572bcf0b1ea915b8946f3e930db\": container with ID starting with 9db20d18fe6b2cd8a3fdf13e6f17f6767a6b5572bcf0b1ea915b8946f3e930db not found: ID does not exist" containerID="9db20d18fe6b2cd8a3fdf13e6f17f6767a6b5572bcf0b1ea915b8946f3e930db" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.216169 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db20d18fe6b2cd8a3fdf13e6f17f6767a6b5572bcf0b1ea915b8946f3e930db"} err="failed to get container status \"9db20d18fe6b2cd8a3fdf13e6f17f6767a6b5572bcf0b1ea915b8946f3e930db\": rpc error: code = NotFound desc = could not find container \"9db20d18fe6b2cd8a3fdf13e6f17f6767a6b5572bcf0b1ea915b8946f3e930db\": container with ID starting with 9db20d18fe6b2cd8a3fdf13e6f17f6767a6b5572bcf0b1ea915b8946f3e930db not found: ID does not exist" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.216198 4975 scope.go:117] "RemoveContainer" containerID="8d653b94b4ea5e3e640061a2407f9d2c388716f877c231b9105945a86360f981" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.220841 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2"] Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.228605 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fbb8b9b87-cdzh2"] Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.232989 4975 scope.go:117] 
"RemoveContainer" containerID="8d653b94b4ea5e3e640061a2407f9d2c388716f877c231b9105945a86360f981" Mar 18 12:15:19 crc kubenswrapper[4975]: E0318 12:15:19.233335 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d653b94b4ea5e3e640061a2407f9d2c388716f877c231b9105945a86360f981\": container with ID starting with 8d653b94b4ea5e3e640061a2407f9d2c388716f877c231b9105945a86360f981 not found: ID does not exist" containerID="8d653b94b4ea5e3e640061a2407f9d2c388716f877c231b9105945a86360f981" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.233368 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d653b94b4ea5e3e640061a2407f9d2c388716f877c231b9105945a86360f981"} err="failed to get container status \"8d653b94b4ea5e3e640061a2407f9d2c388716f877c231b9105945a86360f981\": rpc error: code = NotFound desc = could not find container \"8d653b94b4ea5e3e640061a2407f9d2c388716f877c231b9105945a86360f981\": container with ID starting with 8d653b94b4ea5e3e640061a2407f9d2c388716f877c231b9105945a86360f981 not found: ID does not exist" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.234220 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8bc487549-7nvhs"] Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.237736 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-8bc487549-7nvhs"] Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.370504 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg"] Mar 18 12:15:19 crc kubenswrapper[4975]: E0318 12:15:19.370820 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a0b37e-4423-421c-910e-658cb59e08c8" containerName="extract-utilities" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 
12:15:19.370839 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a0b37e-4423-421c-910e-658cb59e08c8" containerName="extract-utilities" Mar 18 12:15:19 crc kubenswrapper[4975]: E0318 12:15:19.370885 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2d63d0b-0d29-4614-9c3d-211f6a298d92" containerName="route-controller-manager" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.370894 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d63d0b-0d29-4614-9c3d-211f6a298d92" containerName="route-controller-manager" Mar 18 12:15:19 crc kubenswrapper[4975]: E0318 12:15:19.370906 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9002f360-1ea5-4b24-a49a-69f46a658936" containerName="extract-content" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.370914 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="9002f360-1ea5-4b24-a49a-69f46a658936" containerName="extract-content" Mar 18 12:15:19 crc kubenswrapper[4975]: E0318 12:15:19.370923 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cedb84ff-ae71-4210-8f65-16441f4292ac" containerName="registry-server" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.370931 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="cedb84ff-ae71-4210-8f65-16441f4292ac" containerName="registry-server" Mar 18 12:15:19 crc kubenswrapper[4975]: E0318 12:15:19.370942 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85462245-ccc9-46f5-8bcb-a2648e9f1488" containerName="collect-profiles" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.370949 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="85462245-ccc9-46f5-8bcb-a2648e9f1488" containerName="collect-profiles" Mar 18 12:15:19 crc kubenswrapper[4975]: E0318 12:15:19.370968 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9002f360-1ea5-4b24-a49a-69f46a658936" containerName="registry-server" Mar 18 12:15:19 crc 
kubenswrapper[4975]: I0318 12:15:19.370976 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="9002f360-1ea5-4b24-a49a-69f46a658936" containerName="registry-server" Mar 18 12:15:19 crc kubenswrapper[4975]: E0318 12:15:19.370987 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bd18011-0374-4806-96e3-613abb41d92e" containerName="controller-manager" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.370994 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd18011-0374-4806-96e3-613abb41d92e" containerName="controller-manager" Mar 18 12:15:19 crc kubenswrapper[4975]: E0318 12:15:19.371005 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a0b37e-4423-421c-910e-658cb59e08c8" containerName="registry-server" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.371012 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a0b37e-4423-421c-910e-658cb59e08c8" containerName="registry-server" Mar 18 12:15:19 crc kubenswrapper[4975]: E0318 12:15:19.371021 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9002f360-1ea5-4b24-a49a-69f46a658936" containerName="extract-utilities" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.371029 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="9002f360-1ea5-4b24-a49a-69f46a658936" containerName="extract-utilities" Mar 18 12:15:19 crc kubenswrapper[4975]: E0318 12:15:19.371039 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cedb84ff-ae71-4210-8f65-16441f4292ac" containerName="extract-utilities" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.371048 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="cedb84ff-ae71-4210-8f65-16441f4292ac" containerName="extract-utilities" Mar 18 12:15:19 crc kubenswrapper[4975]: E0318 12:15:19.371057 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cedb84ff-ae71-4210-8f65-16441f4292ac" containerName="extract-content" Mar 18 12:15:19 crc 
kubenswrapper[4975]: I0318 12:15:19.371064 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="cedb84ff-ae71-4210-8f65-16441f4292ac" containerName="extract-content" Mar 18 12:15:19 crc kubenswrapper[4975]: E0318 12:15:19.371072 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a0b37e-4423-421c-910e-658cb59e08c8" containerName="extract-content" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.371080 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a0b37e-4423-421c-910e-658cb59e08c8" containerName="extract-content" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.371199 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2d63d0b-0d29-4614-9c3d-211f6a298d92" containerName="route-controller-manager" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.371211 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="cedb84ff-ae71-4210-8f65-16441f4292ac" containerName="registry-server" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.371219 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="9002f360-1ea5-4b24-a49a-69f46a658936" containerName="registry-server" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.371231 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="85462245-ccc9-46f5-8bcb-a2648e9f1488" containerName="collect-profiles" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.371241 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bd18011-0374-4806-96e3-613abb41d92e" containerName="controller-manager" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.371251 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="94a0b37e-4423-421c-910e-658cb59e08c8" containerName="registry-server" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.371684 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.374084 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6f4678566-kqp7h"] Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.374833 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.375646 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.375730 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.375764 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.375878 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.375991 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.378380 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.378443 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.379548 4975 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.379896 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.380062 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.381217 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.381386 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.392450 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg"] Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.397087 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.402058 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f4678566-kqp7h"] Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.540127 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8cc120e-649c-47a2-96d6-b5369452c85e-client-ca\") pod \"route-controller-manager-57ffb9cc86-qc5lg\" (UID: \"a8cc120e-649c-47a2-96d6-b5369452c85e\") " pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.540202 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/fe7d49fc-3806-4427-8a86-546e47d33a8c-config\") pod \"controller-manager-6f4678566-kqp7h\" (UID: \"fe7d49fc-3806-4427-8a86-546e47d33a8c\") " pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.540280 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe7d49fc-3806-4427-8a86-546e47d33a8c-serving-cert\") pod \"controller-manager-6f4678566-kqp7h\" (UID: \"fe7d49fc-3806-4427-8a86-546e47d33a8c\") " pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.540402 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fe7d49fc-3806-4427-8a86-546e47d33a8c-proxy-ca-bundles\") pod \"controller-manager-6f4678566-kqp7h\" (UID: \"fe7d49fc-3806-4427-8a86-546e47d33a8c\") " pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.540457 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8cc120e-649c-47a2-96d6-b5369452c85e-serving-cert\") pod \"route-controller-manager-57ffb9cc86-qc5lg\" (UID: \"a8cc120e-649c-47a2-96d6-b5369452c85e\") " pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.540495 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lkkm\" (UniqueName: \"kubernetes.io/projected/a8cc120e-649c-47a2-96d6-b5369452c85e-kube-api-access-9lkkm\") pod \"route-controller-manager-57ffb9cc86-qc5lg\" (UID: \"a8cc120e-649c-47a2-96d6-b5369452c85e\") " 
pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.540564 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvjws\" (UniqueName: \"kubernetes.io/projected/fe7d49fc-3806-4427-8a86-546e47d33a8c-kube-api-access-qvjws\") pod \"controller-manager-6f4678566-kqp7h\" (UID: \"fe7d49fc-3806-4427-8a86-546e47d33a8c\") " pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.540589 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe7d49fc-3806-4427-8a86-546e47d33a8c-client-ca\") pod \"controller-manager-6f4678566-kqp7h\" (UID: \"fe7d49fc-3806-4427-8a86-546e47d33a8c\") " pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.540615 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8cc120e-649c-47a2-96d6-b5369452c85e-config\") pod \"route-controller-manager-57ffb9cc86-qc5lg\" (UID: \"a8cc120e-649c-47a2-96d6-b5369452c85e\") " pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.642238 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe7d49fc-3806-4427-8a86-546e47d33a8c-serving-cert\") pod \"controller-manager-6f4678566-kqp7h\" (UID: \"fe7d49fc-3806-4427-8a86-546e47d33a8c\") " pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.642314 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/fe7d49fc-3806-4427-8a86-546e47d33a8c-proxy-ca-bundles\") pod \"controller-manager-6f4678566-kqp7h\" (UID: \"fe7d49fc-3806-4427-8a86-546e47d33a8c\") " pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.642347 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8cc120e-649c-47a2-96d6-b5369452c85e-serving-cert\") pod \"route-controller-manager-57ffb9cc86-qc5lg\" (UID: \"a8cc120e-649c-47a2-96d6-b5369452c85e\") " pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.642559 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lkkm\" (UniqueName: \"kubernetes.io/projected/a8cc120e-649c-47a2-96d6-b5369452c85e-kube-api-access-9lkkm\") pod \"route-controller-manager-57ffb9cc86-qc5lg\" (UID: \"a8cc120e-649c-47a2-96d6-b5369452c85e\") " pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.642586 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvjws\" (UniqueName: \"kubernetes.io/projected/fe7d49fc-3806-4427-8a86-546e47d33a8c-kube-api-access-qvjws\") pod \"controller-manager-6f4678566-kqp7h\" (UID: \"fe7d49fc-3806-4427-8a86-546e47d33a8c\") " pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.642604 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe7d49fc-3806-4427-8a86-546e47d33a8c-client-ca\") pod \"controller-manager-6f4678566-kqp7h\" (UID: \"fe7d49fc-3806-4427-8a86-546e47d33a8c\") " pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" Mar 18 12:15:19 crc 
kubenswrapper[4975]: I0318 12:15:19.642629 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8cc120e-649c-47a2-96d6-b5369452c85e-config\") pod \"route-controller-manager-57ffb9cc86-qc5lg\" (UID: \"a8cc120e-649c-47a2-96d6-b5369452c85e\") " pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.642656 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8cc120e-649c-47a2-96d6-b5369452c85e-client-ca\") pod \"route-controller-manager-57ffb9cc86-qc5lg\" (UID: \"a8cc120e-649c-47a2-96d6-b5369452c85e\") " pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.642674 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe7d49fc-3806-4427-8a86-546e47d33a8c-config\") pod \"controller-manager-6f4678566-kqp7h\" (UID: \"fe7d49fc-3806-4427-8a86-546e47d33a8c\") " pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.644063 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fe7d49fc-3806-4427-8a86-546e47d33a8c-proxy-ca-bundles\") pod \"controller-manager-6f4678566-kqp7h\" (UID: \"fe7d49fc-3806-4427-8a86-546e47d33a8c\") " pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.644188 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8cc120e-649c-47a2-96d6-b5369452c85e-config\") pod \"route-controller-manager-57ffb9cc86-qc5lg\" (UID: \"a8cc120e-649c-47a2-96d6-b5369452c85e\") " 
pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.644214 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe7d49fc-3806-4427-8a86-546e47d33a8c-client-ca\") pod \"controller-manager-6f4678566-kqp7h\" (UID: \"fe7d49fc-3806-4427-8a86-546e47d33a8c\") " pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.644226 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8cc120e-649c-47a2-96d6-b5369452c85e-client-ca\") pod \"route-controller-manager-57ffb9cc86-qc5lg\" (UID: \"a8cc120e-649c-47a2-96d6-b5369452c85e\") " pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.644294 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe7d49fc-3806-4427-8a86-546e47d33a8c-config\") pod \"controller-manager-6f4678566-kqp7h\" (UID: \"fe7d49fc-3806-4427-8a86-546e47d33a8c\") " pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.647122 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe7d49fc-3806-4427-8a86-546e47d33a8c-serving-cert\") pod \"controller-manager-6f4678566-kqp7h\" (UID: \"fe7d49fc-3806-4427-8a86-546e47d33a8c\") " pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.647125 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8cc120e-649c-47a2-96d6-b5369452c85e-serving-cert\") pod \"route-controller-manager-57ffb9cc86-qc5lg\" 
(UID: \"a8cc120e-649c-47a2-96d6-b5369452c85e\") " pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.660779 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lkkm\" (UniqueName: \"kubernetes.io/projected/a8cc120e-649c-47a2-96d6-b5369452c85e-kube-api-access-9lkkm\") pod \"route-controller-manager-57ffb9cc86-qc5lg\" (UID: \"a8cc120e-649c-47a2-96d6-b5369452c85e\") " pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.660794 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvjws\" (UniqueName: \"kubernetes.io/projected/fe7d49fc-3806-4427-8a86-546e47d33a8c-kube-api-access-qvjws\") pod \"controller-manager-6f4678566-kqp7h\" (UID: \"fe7d49fc-3806-4427-8a86-546e47d33a8c\") " pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.690047 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" Mar 18 12:15:19 crc kubenswrapper[4975]: I0318 12:15:19.705733 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.113900 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mg2g2"] Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.131027 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg"] Mar 18 12:15:20 crc kubenswrapper[4975]: W0318 12:15:20.138320 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8cc120e_649c_47a2_96d6_b5369452c85e.slice/crio-ece0893c6e433fb6e5fa2339b4e1fc26bdb85f3ac513e1db5ad48ced8a60c3c2 WatchSource:0}: Error finding container ece0893c6e433fb6e5fa2339b4e1fc26bdb85f3ac513e1db5ad48ced8a60c3c2: Status 404 returned error can't find the container with id ece0893c6e433fb6e5fa2339b4e1fc26bdb85f3ac513e1db5ad48ced8a60c3c2 Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.207119 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f4678566-kqp7h"] Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.213025 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" event={"ID":"a8cc120e-649c-47a2-96d6-b5369452c85e","Type":"ContainerStarted","Data":"ece0893c6e433fb6e5fa2339b4e1fc26bdb85f3ac513e1db5ad48ced8a60c3c2"} Mar 18 12:15:20 crc kubenswrapper[4975]: W0318 12:15:20.219073 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe7d49fc_3806_4427_8a86_546e47d33a8c.slice/crio-7e30232a1e73e0ee323262386a3f4a16283d9a592d7109da6c3137f50900fa53 WatchSource:0}: Error finding container 7e30232a1e73e0ee323262386a3f4a16283d9a592d7109da6c3137f50900fa53: Status 404 returned 
error can't find the container with id 7e30232a1e73e0ee323262386a3f4a16283d9a592d7109da6c3137f50900fa53 Mar 18 12:15:20 crc kubenswrapper[4975]: E0318 12:15:20.984528 4975 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-apiserver-startup-monitor-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-startup-monitor-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.987619 4975 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.988206 4975 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.988232 4975 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 12:15:20 crc kubenswrapper[4975]: E0318 12:15:20.988330 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.988345 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:15:20 crc kubenswrapper[4975]: E0318 12:15:20.988355 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.988361 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:15:20 crc kubenswrapper[4975]: E0318 12:15:20.988370 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.988376 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:15:20 crc kubenswrapper[4975]: E0318 12:15:20.988384 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.988390 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:15:20 crc kubenswrapper[4975]: E0318 12:15:20.988398 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.988404 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 12:15:20 crc kubenswrapper[4975]: E0318 12:15:20.988413 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.988419 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 12:15:20 crc kubenswrapper[4975]: E0318 12:15:20.988427 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.988433 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 12:15:20 crc kubenswrapper[4975]: E0318 12:15:20.988442 4975 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.988447 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 12:15:20 crc kubenswrapper[4975]: E0318 12:15:20.988457 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.988463 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.988541 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.988551 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.988558 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.988567 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.988574 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.988581 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 
12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.988590 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.988597 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.988605 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:15:20 crc kubenswrapper[4975]: E0318 12:15:20.988687 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.988694 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.990734 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.990896 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6" gracePeriod=15 Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.990960 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e" gracePeriod=15 Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.990956 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4" gracePeriod=15 Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.991009 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a" gracePeriod=15 Mar 18 12:15:20 crc kubenswrapper[4975]: I0318 12:15:20.991096 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482" gracePeriod=15 Mar 18 12:15:20 crc 
kubenswrapper[4975]: I0318 12:15:20.996658 4975 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.026658 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bd18011-0374-4806-96e3-613abb41d92e" path="/var/lib/kubelet/pods/9bd18011-0374-4806-96e3-613abb41d92e/volumes" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.027383 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2d63d0b-0d29-4614-9c3d-211f6a298d92" path="/var/lib/kubelet/pods/d2d63d0b-0d29-4614-9c3d-211f6a298d92/volumes" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.047614 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.064194 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.064236 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.064285 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.064356 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.064408 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.165297 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.165469 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.165678 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.165703 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.165720 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.165753 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.165820 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.165881 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.166035 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.166069 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.166098 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.166202 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.166206 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.223206 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.224676 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.225756 4975 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e" exitCode=0 Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.225787 4975 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482" exitCode=0 Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.225797 4975 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4" exitCode=0 Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.225806 4975 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a" exitCode=2 Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.225879 4975 scope.go:117] "RemoveContainer" containerID="44a5fd7f2d7507523bec29f3f4b5b9d54976267a81ceec7b6767f2b3918f3ffc" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.228665 4975 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" event={"ID":"fe7d49fc-3806-4427-8a86-546e47d33a8c","Type":"ContainerStarted","Data":"c138ecc100f9647d39eb21039377601ab5d1b749f46311d868c917ba574c4131"} Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.228701 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" event={"ID":"fe7d49fc-3806-4427-8a86-546e47d33a8c","Type":"ContainerStarted","Data":"7e30232a1e73e0ee323262386a3f4a16283d9a592d7109da6c3137f50900fa53"} Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.229332 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.229627 4975 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.229817 4975 status_manager.go:851] "Failed to get status for pod" podUID="fe7d49fc-3806-4427-8a86-546e47d33a8c" pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f4678566-kqp7h\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.231079 4975 generic.go:334] "Generic (PLEG): container finished" podID="975a4029-93d2-4e67-ab56-85d09a7af50b" containerID="037e9b4692bfceb2ad0c287cb8845e9c3a855199831d1071471606a14e62560c" exitCode=0 Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.231136 4975 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"975a4029-93d2-4e67-ab56-85d09a7af50b","Type":"ContainerDied","Data":"037e9b4692bfceb2ad0c287cb8845e9c3a855199831d1071471606a14e62560c"} Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.231517 4975 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.231992 4975 status_manager.go:851] "Failed to get status for pod" podUID="fe7d49fc-3806-4427-8a86-546e47d33a8c" pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f4678566-kqp7h\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.232272 4975 status_manager.go:851] "Failed to get status for pod" podUID="975a4029-93d2-4e67-ab56-85d09a7af50b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.232927 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" event={"ID":"a8cc120e-649c-47a2-96d6-b5369452c85e","Type":"ContainerStarted","Data":"989087ce75715e723198683404e0cad8b40d338d1c89522bf5c23d25e278252e"} Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.233262 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" Mar 
18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.233391 4975 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.233637 4975 status_manager.go:851] "Failed to get status for pod" podUID="fe7d49fc-3806-4427-8a86-546e47d33a8c" pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f4678566-kqp7h\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.233999 4975 status_manager.go:851] "Failed to get status for pod" podUID="975a4029-93d2-4e67-ab56-85d09a7af50b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.234409 4975 status_manager.go:851] "Failed to get status for pod" podUID="a8cc120e-649c-47a2-96d6-b5369452c85e" pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-57ffb9cc86-qc5lg\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.234845 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.235176 4975 status_manager.go:851] "Failed to get status for 
pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.235486 4975 status_manager.go:851] "Failed to get status for pod" podUID="fe7d49fc-3806-4427-8a86-546e47d33a8c" pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f4678566-kqp7h\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.235918 4975 status_manager.go:851] "Failed to get status for pod" podUID="975a4029-93d2-4e67-ab56-85d09a7af50b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.236137 4975 status_manager.go:851] "Failed to get status for pod" podUID="a8cc120e-649c-47a2-96d6-b5369452c85e" pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-57ffb9cc86-qc5lg\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.239018 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.239279 4975 status_manager.go:851] "Failed to get status for pod" podUID="fe7d49fc-3806-4427-8a86-546e47d33a8c" 
pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f4678566-kqp7h\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.239431 4975 status_manager.go:851] "Failed to get status for pod" podUID="975a4029-93d2-4e67-ab56-85d09a7af50b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.239579 4975 status_manager.go:851] "Failed to get status for pod" podUID="a8cc120e-649c-47a2-96d6-b5369452c85e" pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-57ffb9cc86-qc5lg\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.239722 4975 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.267554 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.267637 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.267658 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.267715 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.267748 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.267768 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:21 crc kubenswrapper[4975]: I0318 12:15:21.344531 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:15:21 crc kubenswrapper[4975]: W0318 12:15:21.372683 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-3b53e1faa25c796dc9e8433c9afeeb311f11995c375776f9827dd174cdad09d9 WatchSource:0}: Error finding container 3b53e1faa25c796dc9e8433c9afeeb311f11995c375776f9827dd174cdad09d9: Status 404 returned error can't find the container with id 3b53e1faa25c796dc9e8433c9afeeb311f11995c375776f9827dd174cdad09d9 Mar 18 12:15:21 crc kubenswrapper[4975]: E0318 12:15:21.379358 4975 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.9:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189dee8de085d372 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:15:21.378509682 +0000 UTC m=+307.092910261,LastTimestamp:2026-03-18 12:15:21.378509682 +0000 UTC m=+307.092910261,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:15:22 crc kubenswrapper[4975]: I0318 12:15:22.240457 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"8b7049473cef1ae114b1cb94f9ff4f8a9f55974379b35fbab41650ce5758bf30"} Mar 18 12:15:22 crc kubenswrapper[4975]: I0318 12:15:22.240853 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"3b53e1faa25c796dc9e8433c9afeeb311f11995c375776f9827dd174cdad09d9"} Mar 18 12:15:22 crc kubenswrapper[4975]: I0318 12:15:22.241582 4975 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:22 crc kubenswrapper[4975]: I0318 12:15:22.241944 4975 status_manager.go:851] "Failed to get status for pod" podUID="975a4029-93d2-4e67-ab56-85d09a7af50b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:22 crc kubenswrapper[4975]: I0318 12:15:22.242216 4975 status_manager.go:851] "Failed to get status for pod" podUID="fe7d49fc-3806-4427-8a86-546e47d33a8c" pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f4678566-kqp7h\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:22 crc kubenswrapper[4975]: I0318 12:15:22.242508 4975 status_manager.go:851] "Failed to get status for pod" podUID="a8cc120e-649c-47a2-96d6-b5369452c85e" 
pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-57ffb9cc86-qc5lg\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:22 crc kubenswrapper[4975]: I0318 12:15:22.245972 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 12:15:22 crc kubenswrapper[4975]: I0318 12:15:22.521396 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:15:22 crc kubenswrapper[4975]: I0318 12:15:22.522219 4975 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:22 crc kubenswrapper[4975]: I0318 12:15:22.522546 4975 status_manager.go:851] "Failed to get status for pod" podUID="fe7d49fc-3806-4427-8a86-546e47d33a8c" pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f4678566-kqp7h\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:22 crc kubenswrapper[4975]: I0318 12:15:22.522923 4975 status_manager.go:851] "Failed to get status for pod" podUID="975a4029-93d2-4e67-ab56-85d09a7af50b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:22 crc kubenswrapper[4975]: I0318 
12:15:22.523160 4975 status_manager.go:851] "Failed to get status for pod" podUID="a8cc120e-649c-47a2-96d6-b5369452c85e" pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-57ffb9cc86-qc5lg\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:22 crc kubenswrapper[4975]: I0318 12:15:22.685171 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/975a4029-93d2-4e67-ab56-85d09a7af50b-kubelet-dir\") pod \"975a4029-93d2-4e67-ab56-85d09a7af50b\" (UID: \"975a4029-93d2-4e67-ab56-85d09a7af50b\") " Mar 18 12:15:22 crc kubenswrapper[4975]: I0318 12:15:22.685293 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/975a4029-93d2-4e67-ab56-85d09a7af50b-var-lock\") pod \"975a4029-93d2-4e67-ab56-85d09a7af50b\" (UID: \"975a4029-93d2-4e67-ab56-85d09a7af50b\") " Mar 18 12:15:22 crc kubenswrapper[4975]: I0318 12:15:22.685350 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/975a4029-93d2-4e67-ab56-85d09a7af50b-kube-api-access\") pod \"975a4029-93d2-4e67-ab56-85d09a7af50b\" (UID: \"975a4029-93d2-4e67-ab56-85d09a7af50b\") " Mar 18 12:15:22 crc kubenswrapper[4975]: I0318 12:15:22.685618 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/975a4029-93d2-4e67-ab56-85d09a7af50b-var-lock" (OuterVolumeSpecName: "var-lock") pod "975a4029-93d2-4e67-ab56-85d09a7af50b" (UID: "975a4029-93d2-4e67-ab56-85d09a7af50b"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:15:22 crc kubenswrapper[4975]: I0318 12:15:22.685881 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/975a4029-93d2-4e67-ab56-85d09a7af50b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "975a4029-93d2-4e67-ab56-85d09a7af50b" (UID: "975a4029-93d2-4e67-ab56-85d09a7af50b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:15:22 crc kubenswrapper[4975]: I0318 12:15:22.689708 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/975a4029-93d2-4e67-ab56-85d09a7af50b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "975a4029-93d2-4e67-ab56-85d09a7af50b" (UID: "975a4029-93d2-4e67-ab56-85d09a7af50b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:15:22 crc kubenswrapper[4975]: I0318 12:15:22.786787 4975 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/975a4029-93d2-4e67-ab56-85d09a7af50b-var-lock\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:22 crc kubenswrapper[4975]: I0318 12:15:22.786833 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/975a4029-93d2-4e67-ab56-85d09a7af50b-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:22 crc kubenswrapper[4975]: I0318 12:15:22.786852 4975 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/975a4029-93d2-4e67-ab56-85d09a7af50b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:22 crc kubenswrapper[4975]: E0318 12:15:22.826291 4975 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.9:6443: connect: connection 
refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189dee8de085d372 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:15:21.378509682 +0000 UTC m=+307.092910261,LastTimestamp:2026-03-18 12:15:21.378509682 +0000 UTC m=+307.092910261,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.254679 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"975a4029-93d2-4e67-ab56-85d09a7af50b","Type":"ContainerDied","Data":"8f2dc49430fdac740e898200bfc611b95166bdb526b422e713cd2718d9f41658"} Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.254744 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f2dc49430fdac740e898200bfc611b95166bdb526b422e713cd2718d9f41658" Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.254774 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.264107 4975 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.264357 4975 status_manager.go:851] "Failed to get status for pod" podUID="fe7d49fc-3806-4427-8a86-546e47d33a8c" pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f4678566-kqp7h\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.264555 4975 status_manager.go:851] "Failed to get status for pod" podUID="975a4029-93d2-4e67-ab56-85d09a7af50b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.264762 4975 status_manager.go:851] "Failed to get status for pod" podUID="a8cc120e-649c-47a2-96d6-b5369452c85e" pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-57ffb9cc86-qc5lg\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.358144 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 
12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.358678 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f75bb" Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.359317 4975 status_manager.go:851] "Failed to get status for pod" podUID="46171d59-3549-4843-b6eb-07b9eecd2560" pod="openshift-marketplace/redhat-marketplace-f75bb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-f75bb\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.359573 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.359715 4975 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.359971 4975 status_manager.go:851] "Failed to get status for pod" podUID="fe7d49fc-3806-4427-8a86-546e47d33a8c" pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f4678566-kqp7h\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.360227 4975 status_manager.go:851] "Failed to get status for pod" podUID="975a4029-93d2-4e67-ab56-85d09a7af50b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" 
Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.360476 4975 status_manager.go:851] "Failed to get status for pod" podUID="a8cc120e-649c-47a2-96d6-b5369452c85e" pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-57ffb9cc86-qc5lg\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.360757 4975 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.361028 4975 status_manager.go:851] "Failed to get status for pod" podUID="46171d59-3549-4843-b6eb-07b9eecd2560" pod="openshift-marketplace/redhat-marketplace-f75bb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-f75bb\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.361299 4975 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.361550 4975 status_manager.go:851] "Failed to get status for pod" podUID="fe7d49fc-3806-4427-8a86-546e47d33a8c" pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f4678566-kqp7h\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.361834 4975 status_manager.go:851] "Failed to get status for pod" podUID="975a4029-93d2-4e67-ab56-85d09a7af50b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.362098 4975 status_manager.go:851] "Failed to get status for pod" podUID="a8cc120e-649c-47a2-96d6-b5369452c85e" pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-57ffb9cc86-qc5lg\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.494344 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.494385 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.494415 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.494648 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.494651 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.494727 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.595746 4975 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.595785 4975 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:23 crc kubenswrapper[4975]: I0318 12:15:23.595795 4975 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.262821 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.263581 4975 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6" exitCode=0 Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.263657 4975 scope.go:117] "RemoveContainer" containerID="0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e" Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.263705 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.283103 4975 status_manager.go:851] "Failed to get status for pod" podUID="46171d59-3549-4843-b6eb-07b9eecd2560" pod="openshift-marketplace/redhat-marketplace-f75bb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-f75bb\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.283763 4975 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.284525 4975 status_manager.go:851] "Failed to get status for pod" podUID="975a4029-93d2-4e67-ab56-85d09a7af50b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.284985 4975 status_manager.go:851] "Failed to get status for pod" podUID="fe7d49fc-3806-4427-8a86-546e47d33a8c" pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f4678566-kqp7h\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.285115 4975 scope.go:117] "RemoveContainer" containerID="fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482" Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.285167 4975 status_manager.go:851] "Failed to get status for pod" 
podUID="a8cc120e-649c-47a2-96d6-b5369452c85e" pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-57ffb9cc86-qc5lg\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.285326 4975 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.299730 4975 scope.go:117] "RemoveContainer" containerID="a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4" Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.313296 4975 scope.go:117] "RemoveContainer" containerID="819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a" Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.338077 4975 scope.go:117] "RemoveContainer" containerID="a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6" Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.355502 4975 scope.go:117] "RemoveContainer" containerID="7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e" Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.390748 4975 scope.go:117] "RemoveContainer" containerID="0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e" Mar 18 12:15:24 crc kubenswrapper[4975]: E0318 12:15:24.392906 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e\": container with ID starting with 0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e not found: ID does not exist" 
containerID="0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e" Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.392964 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e"} err="failed to get container status \"0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e\": rpc error: code = NotFound desc = could not find container \"0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e\": container with ID starting with 0771f6e34886ce7be6bc3b728431a6d6138032790a7794121581c3818876f99e not found: ID does not exist" Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.392995 4975 scope.go:117] "RemoveContainer" containerID="fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482" Mar 18 12:15:24 crc kubenswrapper[4975]: E0318 12:15:24.393429 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\": container with ID starting with fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482 not found: ID does not exist" containerID="fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482" Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.393455 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482"} err="failed to get container status \"fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\": rpc error: code = NotFound desc = could not find container \"fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482\": container with ID starting with fc3f85436d6f0df1c6ea9faf137becf01ccba4b502e4f65e9d78e53b4aabd482 not found: ID does not exist" Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.393471 4975 scope.go:117] 
"RemoveContainer" containerID="a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4" Mar 18 12:15:24 crc kubenswrapper[4975]: E0318 12:15:24.394499 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\": container with ID starting with a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4 not found: ID does not exist" containerID="a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4" Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.394564 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4"} err="failed to get container status \"a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\": rpc error: code = NotFound desc = could not find container \"a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4\": container with ID starting with a7456066659a23fb88ab1ffb515830f5cb805cd7308695d2d0435ca261c00ab4 not found: ID does not exist" Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.394582 4975 scope.go:117] "RemoveContainer" containerID="819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a" Mar 18 12:15:24 crc kubenswrapper[4975]: E0318 12:15:24.395112 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\": container with ID starting with 819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a not found: ID does not exist" containerID="819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a" Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.395180 4975 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a"} err="failed to get container status \"819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\": rpc error: code = NotFound desc = could not find container \"819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a\": container with ID starting with 819be51ed2c847fa60ec2fc73de9fff2c9ecc9938dcaa65a6006711318b1e11a not found: ID does not exist" Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.395220 4975 scope.go:117] "RemoveContainer" containerID="a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6" Mar 18 12:15:24 crc kubenswrapper[4975]: E0318 12:15:24.395734 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\": container with ID starting with a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6 not found: ID does not exist" containerID="a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6" Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.395770 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6"} err="failed to get container status \"a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\": rpc error: code = NotFound desc = could not find container \"a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6\": container with ID starting with a3a46decb453ee6954274bb4929fa444e5284da03b7515398ecdcea298cfd3a6 not found: ID does not exist" Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.395793 4975 scope.go:117] "RemoveContainer" containerID="7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e" Mar 18 12:15:24 crc kubenswrapper[4975]: E0318 12:15:24.396207 4975 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\": container with ID starting with 7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e not found: ID does not exist" containerID="7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e" Mar 18 12:15:24 crc kubenswrapper[4975]: I0318 12:15:24.396229 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e"} err="failed to get container status \"7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\": rpc error: code = NotFound desc = could not find container \"7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e\": container with ID starting with 7d683a174e6760b71e9a6deb3952a87c72641163d581c1756c5dc2aceed2173e not found: ID does not exist" Mar 18 12:15:25 crc kubenswrapper[4975]: I0318 12:15:25.018979 4975 status_manager.go:851] "Failed to get status for pod" podUID="fe7d49fc-3806-4427-8a86-546e47d33a8c" pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f4678566-kqp7h\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:25 crc kubenswrapper[4975]: I0318 12:15:25.019424 4975 status_manager.go:851] "Failed to get status for pod" podUID="975a4029-93d2-4e67-ab56-85d09a7af50b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:25 crc kubenswrapper[4975]: I0318 12:15:25.019705 4975 status_manager.go:851] "Failed to get status for pod" podUID="a8cc120e-649c-47a2-96d6-b5369452c85e" 
pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-57ffb9cc86-qc5lg\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:25 crc kubenswrapper[4975]: I0318 12:15:25.019951 4975 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:25 crc kubenswrapper[4975]: I0318 12:15:25.020116 4975 status_manager.go:851] "Failed to get status for pod" podUID="46171d59-3549-4843-b6eb-07b9eecd2560" pod="openshift-marketplace/redhat-marketplace-f75bb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-f75bb\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:25 crc kubenswrapper[4975]: I0318 12:15:25.020304 4975 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:25 crc kubenswrapper[4975]: I0318 12:15:25.022715 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 18 12:15:30 crc kubenswrapper[4975]: E0318 12:15:30.482544 4975 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.9:6443: connect: 
connection refused" Mar 18 12:15:30 crc kubenswrapper[4975]: E0318 12:15:30.484667 4975 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:30 crc kubenswrapper[4975]: E0318 12:15:30.485689 4975 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:30 crc kubenswrapper[4975]: E0318 12:15:30.486790 4975 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:30 crc kubenswrapper[4975]: E0318 12:15:30.487182 4975 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:30 crc kubenswrapper[4975]: I0318 12:15:30.487317 4975 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 18 12:15:30 crc kubenswrapper[4975]: E0318 12:15:30.487748 4975 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.9:6443: connect: connection refused" interval="200ms" Mar 18 12:15:30 crc kubenswrapper[4975]: E0318 12:15:30.689128 4975 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.129.56.9:6443: connect: connection refused" interval="400ms" Mar 18 12:15:31 crc kubenswrapper[4975]: E0318 12:15:31.089922 4975 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.9:6443: connect: connection refused" interval="800ms" Mar 18 12:15:31 crc kubenswrapper[4975]: E0318 12:15:31.891003 4975 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.9:6443: connect: connection refused" interval="1.6s" Mar 18 12:15:32 crc kubenswrapper[4975]: E0318 12:15:32.828042 4975 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.9:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189dee8de085d372 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 12:15:21.378509682 +0000 UTC m=+307.092910261,LastTimestamp:2026-03-18 12:15:21.378509682 +0000 UTC m=+307.092910261,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 12:15:33 crc kubenswrapper[4975]: E0318 12:15:33.492084 4975 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.9:6443: connect: connection refused" interval="3.2s" Mar 18 12:15:34 crc kubenswrapper[4975]: I0318 12:15:34.016379 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:34 crc kubenswrapper[4975]: I0318 12:15:34.017690 4975 status_manager.go:851] "Failed to get status for pod" podUID="46171d59-3549-4843-b6eb-07b9eecd2560" pod="openshift-marketplace/redhat-marketplace-f75bb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-f75bb\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:34 crc kubenswrapper[4975]: I0318 12:15:34.018185 4975 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:34 crc kubenswrapper[4975]: I0318 12:15:34.018454 4975 status_manager.go:851] "Failed to get status for pod" podUID="fe7d49fc-3806-4427-8a86-546e47d33a8c" pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f4678566-kqp7h\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:34 crc kubenswrapper[4975]: I0318 12:15:34.018660 4975 status_manager.go:851] "Failed to get status for pod" podUID="975a4029-93d2-4e67-ab56-85d09a7af50b" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:34 crc kubenswrapper[4975]: I0318 12:15:34.018856 4975 status_manager.go:851] "Failed to get status for pod" podUID="a8cc120e-649c-47a2-96d6-b5369452c85e" pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-57ffb9cc86-qc5lg\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:34 crc kubenswrapper[4975]: I0318 12:15:34.035951 4975 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c374ff0-a569-44d6-a341-482a4cf71b70" Mar 18 12:15:34 crc kubenswrapper[4975]: I0318 12:15:34.035987 4975 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c374ff0-a569-44d6-a341-482a4cf71b70" Mar 18 12:15:34 crc kubenswrapper[4975]: E0318 12:15:34.036514 4975 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:34 crc kubenswrapper[4975]: I0318 12:15:34.036954 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:34 crc kubenswrapper[4975]: W0318 12:15:34.058348 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-99a7f99544f03d1377abf4bf6f178ad6be7f8db0a182c8844998b517ab69fd8e WatchSource:0}: Error finding container 99a7f99544f03d1377abf4bf6f178ad6be7f8db0a182c8844998b517ab69fd8e: Status 404 returned error can't find the container with id 99a7f99544f03d1377abf4bf6f178ad6be7f8db0a182c8844998b517ab69fd8e Mar 18 12:15:34 crc kubenswrapper[4975]: I0318 12:15:34.321050 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"99a7f99544f03d1377abf4bf6f178ad6be7f8db0a182c8844998b517ab69fd8e"} Mar 18 12:15:34 crc kubenswrapper[4975]: I0318 12:15:34.323896 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 12:15:34 crc kubenswrapper[4975]: I0318 12:15:34.324677 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 18 12:15:34 crc kubenswrapper[4975]: I0318 12:15:34.324736 4975 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="798454b532a3067c649acc0c06fb5228660cfcfee74c2a47182434203623128a" exitCode=1 Mar 18 12:15:34 crc kubenswrapper[4975]: I0318 12:15:34.324776 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"798454b532a3067c649acc0c06fb5228660cfcfee74c2a47182434203623128a"} Mar 18 12:15:34 crc kubenswrapper[4975]: I0318 12:15:34.325323 4975 scope.go:117] "RemoveContainer" containerID="798454b532a3067c649acc0c06fb5228660cfcfee74c2a47182434203623128a" Mar 18 12:15:34 crc kubenswrapper[4975]: I0318 12:15:34.325630 4975 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:34 crc kubenswrapper[4975]: I0318 12:15:34.326022 4975 status_manager.go:851] "Failed to get status for pod" podUID="46171d59-3549-4843-b6eb-07b9eecd2560" pod="openshift-marketplace/redhat-marketplace-f75bb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-f75bb\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:34 crc kubenswrapper[4975]: I0318 12:15:34.326262 4975 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:34 crc kubenswrapper[4975]: I0318 12:15:34.326616 4975 status_manager.go:851] "Failed to get status for pod" podUID="fe7d49fc-3806-4427-8a86-546e47d33a8c" pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f4678566-kqp7h\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:34 
crc kubenswrapper[4975]: I0318 12:15:34.326840 4975 status_manager.go:851] "Failed to get status for pod" podUID="975a4029-93d2-4e67-ab56-85d09a7af50b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:34 crc kubenswrapper[4975]: I0318 12:15:34.327057 4975 status_manager.go:851] "Failed to get status for pod" podUID="a8cc120e-649c-47a2-96d6-b5369452c85e" pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-57ffb9cc86-qc5lg\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.021306 4975 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.021741 4975 status_manager.go:851] "Failed to get status for pod" podUID="46171d59-3549-4843-b6eb-07b9eecd2560" pod="openshift-marketplace/redhat-marketplace-f75bb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-f75bb\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.022299 4975 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial 
tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.022726 4975 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.023004 4975 status_manager.go:851] "Failed to get status for pod" podUID="fe7d49fc-3806-4427-8a86-546e47d33a8c" pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f4678566-kqp7h\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.023229 4975 status_manager.go:851] "Failed to get status for pod" podUID="975a4029-93d2-4e67-ab56-85d09a7af50b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.023436 4975 status_manager.go:851] "Failed to get status for pod" podUID="a8cc120e-649c-47a2-96d6-b5369452c85e" pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-57ffb9cc86-qc5lg\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.333065 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 12:15:35 crc kubenswrapper[4975]: 
I0318 12:15:35.333564 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.333661 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2f2d643f8298782cd444ab10bffa5c271da052fa84940cb7a4d9f6cc6e40c397"} Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.334428 4975 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.334711 4975 status_manager.go:851] "Failed to get status for pod" podUID="46171d59-3549-4843-b6eb-07b9eecd2560" pod="openshift-marketplace/redhat-marketplace-f75bb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-f75bb\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.335160 4975 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="76955a8ebfac58ba26f4aad86fb36ce7d34a90c446b11c5f00c24ef63ac2ae49" exitCode=0 Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.335200 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"76955a8ebfac58ba26f4aad86fb36ce7d34a90c446b11c5f00c24ef63ac2ae49"} Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.335218 4975 
status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.335509 4975 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c374ff0-a569-44d6-a341-482a4cf71b70" Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.335569 4975 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c374ff0-a569-44d6-a341-482a4cf71b70" Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.335621 4975 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.335914 4975 status_manager.go:851] "Failed to get status for pod" podUID="fe7d49fc-3806-4427-8a86-546e47d33a8c" pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f4678566-kqp7h\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.336273 4975 status_manager.go:851] "Failed to get status for pod" podUID="975a4029-93d2-4e67-ab56-85d09a7af50b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4975]: 
E0318 12:15:35.336412 4975 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.336508 4975 status_manager.go:851] "Failed to get status for pod" podUID="a8cc120e-649c-47a2-96d6-b5369452c85e" pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-57ffb9cc86-qc5lg\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.337137 4975 status_manager.go:851] "Failed to get status for pod" podUID="fe7d49fc-3806-4427-8a86-546e47d33a8c" pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f4678566-kqp7h\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.337380 4975 status_manager.go:851] "Failed to get status for pod" podUID="975a4029-93d2-4e67-ab56-85d09a7af50b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.337674 4975 status_manager.go:851] "Failed to get status for pod" podUID="a8cc120e-649c-47a2-96d6-b5369452c85e" pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-57ffb9cc86-qc5lg\": dial tcp 38.129.56.9:6443: 
connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.337930 4975 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.338246 4975 status_manager.go:851] "Failed to get status for pod" podUID="46171d59-3549-4843-b6eb-07b9eecd2560" pod="openshift-marketplace/redhat-marketplace-f75bb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-f75bb\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.338602 4975 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:35 crc kubenswrapper[4975]: I0318 12:15:35.338941 4975 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.9:6443: connect: connection refused" Mar 18 12:15:36 crc kubenswrapper[4975]: I0318 12:15:36.348753 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"19367779b2ee503121fa03ce5c48b7f76c8dd153d398da856ae807a84bed5451"} Mar 18 12:15:36 crc kubenswrapper[4975]: 
I0318 12:15:36.349153 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"98b8ecb925ac969f1a7206ee6afd032ae9d77b92d0f32175c84e4d386e5b406d"} Mar 18 12:15:36 crc kubenswrapper[4975]: I0318 12:15:36.349171 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9fce9486b23ea57f0bf224834035018ebfdfd2862f812cf3d7d261c0448814e3"} Mar 18 12:15:36 crc kubenswrapper[4975]: I0318 12:15:36.349181 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bd08545f3115c7981c89d401c3a769a39247da8a49989d523103c3239cbd96a5"} Mar 18 12:15:36 crc kubenswrapper[4975]: I0318 12:15:36.349191 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"98a9e6b4d0e62bfa261186410d2b110c4baf82c2289cbca9585843fd54694807"} Mar 18 12:15:36 crc kubenswrapper[4975]: I0318 12:15:36.349182 4975 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c374ff0-a569-44d6-a341-482a4cf71b70" Mar 18 12:15:36 crc kubenswrapper[4975]: I0318 12:15:36.349213 4975 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c374ff0-a569-44d6-a341-482a4cf71b70" Mar 18 12:15:36 crc kubenswrapper[4975]: I0318 12:15:36.349617 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:37 crc kubenswrapper[4975]: I0318 12:15:37.456460 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:15:39 crc kubenswrapper[4975]: I0318 12:15:39.038185 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:39 crc kubenswrapper[4975]: I0318 12:15:39.038544 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:39 crc kubenswrapper[4975]: I0318 12:15:39.046661 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:39 crc kubenswrapper[4975]: I0318 12:15:39.107928 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:15:39 crc kubenswrapper[4975]: I0318 12:15:39.111606 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:15:40 crc kubenswrapper[4975]: I0318 12:15:40.009089 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:15:40 crc kubenswrapper[4975]: I0318 12:15:40.009251 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:15:40 crc kubenswrapper[4975]: I0318 12:15:40.011295 4975 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 18 12:15:40 crc kubenswrapper[4975]: I0318 12:15:40.011369 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 18 12:15:40 crc kubenswrapper[4975]: I0318 12:15:40.021124 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:15:40 crc kubenswrapper[4975]: I0318 12:15:40.027324 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:15:40 crc kubenswrapper[4975]: I0318 12:15:40.032939 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 12:15:40 crc kubenswrapper[4975]: I0318 12:15:40.115528 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:15:40 crc kubenswrapper[4975]: I0318 12:15:40.115762 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:15:40 crc kubenswrapper[4975]: I0318 12:15:40.117466 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 18 12:15:40 crc kubenswrapper[4975]: I0318 12:15:40.128658 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 18 12:15:40 crc kubenswrapper[4975]: I0318 12:15:40.144607 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:15:40 crc kubenswrapper[4975]: I0318 12:15:40.148590 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:15:40 crc kubenswrapper[4975]: I0318 12:15:40.342499 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:15:40 crc kubenswrapper[4975]: I0318 12:15:40.348432 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 12:15:40 crc kubenswrapper[4975]: W0318 12:15:40.503176 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-ac4f13358d1615a0c0c39356fd8bca6bcaa6764676468dce0f34a19c4b4edc38 WatchSource:0}: Error finding container ac4f13358d1615a0c0c39356fd8bca6bcaa6764676468dce0f34a19c4b4edc38: Status 404 returned error can't find the container with id ac4f13358d1615a0c0c39356fd8bca6bcaa6764676468dce0f34a19c4b4edc38 Mar 18 12:15:40 crc kubenswrapper[4975]: W0318 12:15:40.780385 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-35bd1ab6ca90ffd009db54a8089462643621c214d58417f4695abf3b8a04d1b5 WatchSource:0}: Error finding container 35bd1ab6ca90ffd009db54a8089462643621c214d58417f4695abf3b8a04d1b5: Status 404 returned error can't find the container with id 35bd1ab6ca90ffd009db54a8089462643621c214d58417f4695abf3b8a04d1b5 Mar 18 12:15:40 crc kubenswrapper[4975]: W0318 12:15:40.850953 4975 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-ce436406c30fd91cb3275fe029e9cad8208d18d58299f59e49266ea9f5abd4e8 WatchSource:0}: Error finding container ce436406c30fd91cb3275fe029e9cad8208d18d58299f59e49266ea9f5abd4e8: Status 404 returned error can't find the container with id ce436406c30fd91cb3275fe029e9cad8208d18d58299f59e49266ea9f5abd4e8 Mar 18 12:15:41 crc kubenswrapper[4975]: I0318 12:15:41.383315 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"113a05385e513ec34b0e8db4b65445aa0a5752ab2650f5df59e19c317d5af647"} Mar 18 12:15:41 crc kubenswrapper[4975]: I0318 12:15:41.383371 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ce436406c30fd91cb3275fe029e9cad8208d18d58299f59e49266ea9f5abd4e8"} Mar 18 12:15:41 crc kubenswrapper[4975]: I0318 12:15:41.383521 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:15:41 crc kubenswrapper[4975]: I0318 12:15:41.386015 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"cbb67ed370a965dbec9fbb7f5a0f8fce1cf02834565757cadd1521b731653c7e"} Mar 18 12:15:41 crc kubenswrapper[4975]: I0318 12:15:41.386048 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ac4f13358d1615a0c0c39356fd8bca6bcaa6764676468dce0f34a19c4b4edc38"} Mar 18 12:15:41 crc 
kubenswrapper[4975]: I0318 12:15:41.387739 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2c3e9d67b17b91be0890bd4d88bf83a3451036e15d5bc6aa85b4fe075af55cdd"} Mar 18 12:15:41 crc kubenswrapper[4975]: I0318 12:15:41.387791 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"35bd1ab6ca90ffd009db54a8089462643621c214d58417f4695abf3b8a04d1b5"} Mar 18 12:15:41 crc kubenswrapper[4975]: I0318 12:15:41.943927 4975 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:42 crc kubenswrapper[4975]: I0318 12:15:42.078448 4975 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a5c7cd38-ff6f-4102-9b66-5970f524cac0" Mar 18 12:15:42 crc kubenswrapper[4975]: I0318 12:15:42.399234 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 18 12:15:42 crc kubenswrapper[4975]: I0318 12:15:42.399313 4975 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="2c3e9d67b17b91be0890bd4d88bf83a3451036e15d5bc6aa85b4fe075af55cdd" exitCode=255 Mar 18 12:15:42 crc kubenswrapper[4975]: I0318 12:15:42.399419 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"2c3e9d67b17b91be0890bd4d88bf83a3451036e15d5bc6aa85b4fe075af55cdd"} Mar 
18 12:15:42 crc kubenswrapper[4975]: I0318 12:15:42.399697 4975 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c374ff0-a569-44d6-a341-482a4cf71b70" Mar 18 12:15:42 crc kubenswrapper[4975]: I0318 12:15:42.399718 4975 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c374ff0-a569-44d6-a341-482a4cf71b70" Mar 18 12:15:42 crc kubenswrapper[4975]: I0318 12:15:42.400170 4975 scope.go:117] "RemoveContainer" containerID="2c3e9d67b17b91be0890bd4d88bf83a3451036e15d5bc6aa85b4fe075af55cdd" Mar 18 12:15:42 crc kubenswrapper[4975]: I0318 12:15:42.405378 4975 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a5c7cd38-ff6f-4102-9b66-5970f524cac0" Mar 18 12:15:42 crc kubenswrapper[4975]: I0318 12:15:42.407926 4975 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://98a9e6b4d0e62bfa261186410d2b110c4baf82c2289cbca9585843fd54694807" Mar 18 12:15:42 crc kubenswrapper[4975]: I0318 12:15:42.410099 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:15:43 crc kubenswrapper[4975]: I0318 12:15:43.406935 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 18 12:15:43 crc kubenswrapper[4975]: I0318 12:15:43.407275 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"11e54722965cc4c9942ec1c784a81cd9bc46c145a177bf79205b6fcd690c8875"} Mar 18 12:15:43 crc kubenswrapper[4975]: I0318 
12:15:43.407538 4975 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c374ff0-a569-44d6-a341-482a4cf71b70" Mar 18 12:15:43 crc kubenswrapper[4975]: I0318 12:15:43.407563 4975 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c374ff0-a569-44d6-a341-482a4cf71b70" Mar 18 12:15:43 crc kubenswrapper[4975]: I0318 12:15:43.411742 4975 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a5c7cd38-ff6f-4102-9b66-5970f524cac0" Mar 18 12:15:44 crc kubenswrapper[4975]: I0318 12:15:44.413571 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 18 12:15:44 crc kubenswrapper[4975]: I0318 12:15:44.414292 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 18 12:15:44 crc kubenswrapper[4975]: I0318 12:15:44.414339 4975 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="11e54722965cc4c9942ec1c784a81cd9bc46c145a177bf79205b6fcd690c8875" exitCode=255 Mar 18 12:15:44 crc kubenswrapper[4975]: I0318 12:15:44.414380 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"11e54722965cc4c9942ec1c784a81cd9bc46c145a177bf79205b6fcd690c8875"} Mar 18 12:15:44 crc kubenswrapper[4975]: I0318 12:15:44.414423 4975 scope.go:117] "RemoveContainer" containerID="2c3e9d67b17b91be0890bd4d88bf83a3451036e15d5bc6aa85b4fe075af55cdd" Mar 18 12:15:44 crc kubenswrapper[4975]: I0318 
12:15:44.414922 4975 scope.go:117] "RemoveContainer" containerID="11e54722965cc4c9942ec1c784a81cd9bc46c145a177bf79205b6fcd690c8875" Mar 18 12:15:44 crc kubenswrapper[4975]: E0318 12:15:44.415139 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.136074 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" podUID="14d667ef-8c80-42f5-b119-1bae87e39be7" containerName="oauth-openshift" containerID="cri-o://5b7e49473cceaec36d25015f5dcfbf3a8cf47274103d7196a3e4b9321efb90e9" gracePeriod=15 Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.421222 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.423079 4975 generic.go:334] "Generic (PLEG): container finished" podID="14d667ef-8c80-42f5-b119-1bae87e39be7" containerID="5b7e49473cceaec36d25015f5dcfbf3a8cf47274103d7196a3e4b9321efb90e9" exitCode=0 Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.423124 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" event={"ID":"14d667ef-8c80-42f5-b119-1bae87e39be7","Type":"ContainerDied","Data":"5b7e49473cceaec36d25015f5dcfbf3a8cf47274103d7196a3e4b9321efb90e9"} Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.641150 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.797877 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbnhx\" (UniqueName: \"kubernetes.io/projected/14d667ef-8c80-42f5-b119-1bae87e39be7-kube-api-access-vbnhx\") pod \"14d667ef-8c80-42f5-b119-1bae87e39be7\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.797932 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-user-template-login\") pod \"14d667ef-8c80-42f5-b119-1bae87e39be7\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.797959 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/14d667ef-8c80-42f5-b119-1bae87e39be7-audit-policies\") pod \"14d667ef-8c80-42f5-b119-1bae87e39be7\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.798007 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-cliconfig\") pod \"14d667ef-8c80-42f5-b119-1bae87e39be7\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.798031 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-serving-cert\") pod \"14d667ef-8c80-42f5-b119-1bae87e39be7\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " Mar 18 12:15:45 crc 
kubenswrapper[4975]: I0318 12:15:45.798070 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-user-template-provider-selection\") pod \"14d667ef-8c80-42f5-b119-1bae87e39be7\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.798109 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-session\") pod \"14d667ef-8c80-42f5-b119-1bae87e39be7\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.798134 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-trusted-ca-bundle\") pod \"14d667ef-8c80-42f5-b119-1bae87e39be7\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.798160 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-user-idp-0-file-data\") pod \"14d667ef-8c80-42f5-b119-1bae87e39be7\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.798192 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-router-certs\") pod \"14d667ef-8c80-42f5-b119-1bae87e39be7\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.798225 
4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-service-ca\") pod \"14d667ef-8c80-42f5-b119-1bae87e39be7\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.798246 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/14d667ef-8c80-42f5-b119-1bae87e39be7-audit-dir\") pod \"14d667ef-8c80-42f5-b119-1bae87e39be7\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.798277 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-user-template-error\") pod \"14d667ef-8c80-42f5-b119-1bae87e39be7\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.798303 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-ocp-branding-template\") pod \"14d667ef-8c80-42f5-b119-1bae87e39be7\" (UID: \"14d667ef-8c80-42f5-b119-1bae87e39be7\") " Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.799038 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14d667ef-8c80-42f5-b119-1bae87e39be7-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "14d667ef-8c80-42f5-b119-1bae87e39be7" (UID: "14d667ef-8c80-42f5-b119-1bae87e39be7"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.799483 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "14d667ef-8c80-42f5-b119-1bae87e39be7" (UID: "14d667ef-8c80-42f5-b119-1bae87e39be7"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.800101 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14d667ef-8c80-42f5-b119-1bae87e39be7-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "14d667ef-8c80-42f5-b119-1bae87e39be7" (UID: "14d667ef-8c80-42f5-b119-1bae87e39be7"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.800641 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "14d667ef-8c80-42f5-b119-1bae87e39be7" (UID: "14d667ef-8c80-42f5-b119-1bae87e39be7"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.801342 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "14d667ef-8c80-42f5-b119-1bae87e39be7" (UID: "14d667ef-8c80-42f5-b119-1bae87e39be7"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.803584 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "14d667ef-8c80-42f5-b119-1bae87e39be7" (UID: "14d667ef-8c80-42f5-b119-1bae87e39be7"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.803781 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "14d667ef-8c80-42f5-b119-1bae87e39be7" (UID: "14d667ef-8c80-42f5-b119-1bae87e39be7"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.804267 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "14d667ef-8c80-42f5-b119-1bae87e39be7" (UID: "14d667ef-8c80-42f5-b119-1bae87e39be7"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.805071 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14d667ef-8c80-42f5-b119-1bae87e39be7-kube-api-access-vbnhx" (OuterVolumeSpecName: "kube-api-access-vbnhx") pod "14d667ef-8c80-42f5-b119-1bae87e39be7" (UID: "14d667ef-8c80-42f5-b119-1bae87e39be7"). InnerVolumeSpecName "kube-api-access-vbnhx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.805241 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "14d667ef-8c80-42f5-b119-1bae87e39be7" (UID: "14d667ef-8c80-42f5-b119-1bae87e39be7"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.805251 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "14d667ef-8c80-42f5-b119-1bae87e39be7" (UID: "14d667ef-8c80-42f5-b119-1bae87e39be7"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.805403 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "14d667ef-8c80-42f5-b119-1bae87e39be7" (UID: "14d667ef-8c80-42f5-b119-1bae87e39be7"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.814198 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "14d667ef-8c80-42f5-b119-1bae87e39be7" (UID: "14d667ef-8c80-42f5-b119-1bae87e39be7"). 
InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.815286 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "14d667ef-8c80-42f5-b119-1bae87e39be7" (UID: "14d667ef-8c80-42f5-b119-1bae87e39be7"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.900165 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbnhx\" (UniqueName: \"kubernetes.io/projected/14d667ef-8c80-42f5-b119-1bae87e39be7-kube-api-access-vbnhx\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.900214 4975 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.900235 4975 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/14d667ef-8c80-42f5-b119-1bae87e39be7-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.900248 4975 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.900262 4975 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.900274 4975 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.900287 4975 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.900298 4975 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.900312 4975 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.900325 4975 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.900339 4975 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 
12:15:45.900352 4975 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/14d667ef-8c80-42f5-b119-1bae87e39be7-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.900363 4975 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:45 crc kubenswrapper[4975]: I0318 12:15:45.900375 4975 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/14d667ef-8c80-42f5-b119-1bae87e39be7-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 18 12:15:46 crc kubenswrapper[4975]: I0318 12:15:46.431519 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" event={"ID":"14d667ef-8c80-42f5-b119-1bae87e39be7","Type":"ContainerDied","Data":"23d23ca8beff4fc3bea616878d12c4925791e02ed9ab2c28e65b83c63e82c90c"} Mar 18 12:15:46 crc kubenswrapper[4975]: I0318 12:15:46.431613 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mg2g2" Mar 18 12:15:46 crc kubenswrapper[4975]: I0318 12:15:46.431620 4975 scope.go:117] "RemoveContainer" containerID="5b7e49473cceaec36d25015f5dcfbf3a8cf47274103d7196a3e4b9321efb90e9" Mar 18 12:15:47 crc kubenswrapper[4975]: I0318 12:15:47.459792 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 12:15:53 crc kubenswrapper[4975]: I0318 12:15:53.157126 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 18 12:15:53 crc kubenswrapper[4975]: I0318 12:15:53.465386 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 12:15:53 crc kubenswrapper[4975]: I0318 12:15:53.547521 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 18 12:15:53 crc kubenswrapper[4975]: I0318 12:15:53.573278 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 18 12:15:53 crc kubenswrapper[4975]: I0318 12:15:53.672957 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 18 12:15:53 crc kubenswrapper[4975]: I0318 12:15:53.748259 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 18 12:15:53 crc kubenswrapper[4975]: I0318 12:15:53.858340 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 12:15:53 crc kubenswrapper[4975]: I0318 12:15:53.911554 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 12:15:54 crc kubenswrapper[4975]: I0318 12:15:54.045409 4975 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 18 12:15:54 crc kubenswrapper[4975]: I0318 12:15:54.051556 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 18 12:15:54 crc kubenswrapper[4975]: I0318 12:15:54.294078 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 18 12:15:54 crc kubenswrapper[4975]: I0318 12:15:54.422853 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 18 12:15:54 crc kubenswrapper[4975]: I0318 12:15:54.561605 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 18 12:15:54 crc kubenswrapper[4975]: I0318 12:15:54.675241 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 18 12:15:54 crc kubenswrapper[4975]: I0318 12:15:54.748817 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 18 12:15:54 crc kubenswrapper[4975]: I0318 12:15:54.774776 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 18 12:15:54 crc kubenswrapper[4975]: I0318 12:15:54.775832 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 18 12:15:54 crc kubenswrapper[4975]: I0318 12:15:54.832924 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 18 12:15:54 crc kubenswrapper[4975]: I0318 12:15:54.858217 4975 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 18 12:15:54 crc kubenswrapper[4975]: I0318 12:15:54.868984 4975 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 18 12:15:54 crc kubenswrapper[4975]: I0318 12:15:54.901321 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 12:15:54 crc kubenswrapper[4975]: I0318 12:15:54.915251 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 18 12:15:55 crc kubenswrapper[4975]: I0318 12:15:55.086657 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 18 12:15:55 crc kubenswrapper[4975]: I0318 12:15:55.132738 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 18 12:15:55 crc kubenswrapper[4975]: I0318 12:15:55.132962 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 18 12:15:55 crc kubenswrapper[4975]: I0318 12:15:55.286115 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 12:15:55 crc kubenswrapper[4975]: I0318 12:15:55.353527 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 18 12:15:55 crc kubenswrapper[4975]: I0318 12:15:55.471354 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 18 12:15:55 crc kubenswrapper[4975]: I0318 12:15:55.518145 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 18 12:15:55 crc kubenswrapper[4975]: I0318 12:15:55.604669 4975 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 18 12:15:55 crc kubenswrapper[4975]: I0318 12:15:55.670220 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 18 12:15:55 crc kubenswrapper[4975]: I0318 12:15:55.822087 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 18 12:15:55 crc kubenswrapper[4975]: I0318 12:15:55.852946 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 18 12:15:55 crc kubenswrapper[4975]: I0318 12:15:55.914919 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 18 12:15:56 crc kubenswrapper[4975]: I0318 12:15:56.016046 4975 scope.go:117] "RemoveContainer" containerID="11e54722965cc4c9942ec1c784a81cd9bc46c145a177bf79205b6fcd690c8875" Mar 18 12:15:56 crc kubenswrapper[4975]: I0318 12:15:56.073204 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 12:15:56 crc kubenswrapper[4975]: I0318 12:15:56.120297 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 18 12:15:56 crc kubenswrapper[4975]: I0318 12:15:56.130338 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 18 12:15:56 crc kubenswrapper[4975]: I0318 12:15:56.206806 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 18 12:15:56 crc kubenswrapper[4975]: I0318 12:15:56.280672 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 
18 12:15:56 crc kubenswrapper[4975]: I0318 12:15:56.338753 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 18 12:15:56 crc kubenswrapper[4975]: I0318 12:15:56.378839 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 12:15:56 crc kubenswrapper[4975]: I0318 12:15:56.440677 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 18 12:15:56 crc kubenswrapper[4975]: I0318 12:15:56.473244 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 18 12:15:56 crc kubenswrapper[4975]: I0318 12:15:56.493621 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 18 12:15:56 crc kubenswrapper[4975]: I0318 12:15:56.493681 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a38e9aaea8b95e0b076fbf891040bba4c8533e1aced2872e16687882d0277be6"} Mar 18 12:15:56 crc kubenswrapper[4975]: I0318 12:15:56.629922 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 18 12:15:56 crc kubenswrapper[4975]: I0318 12:15:56.768636 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 18 12:15:56 crc kubenswrapper[4975]: I0318 12:15:56.943820 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 18 12:15:57 crc kubenswrapper[4975]: I0318 12:15:57.173469 4975 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 18 12:15:57 crc kubenswrapper[4975]: I0318 12:15:57.269400 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 18 12:15:57 crc kubenswrapper[4975]: I0318 12:15:57.370270 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 18 12:15:57 crc kubenswrapper[4975]: I0318 12:15:57.412386 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 18 12:15:57 crc kubenswrapper[4975]: I0318 12:15:57.500365 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 18 12:15:57 crc kubenswrapper[4975]: I0318 12:15:57.500954 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 18 12:15:57 crc kubenswrapper[4975]: I0318 12:15:57.500996 4975 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="a38e9aaea8b95e0b076fbf891040bba4c8533e1aced2872e16687882d0277be6" exitCode=255 Mar 18 12:15:57 crc kubenswrapper[4975]: I0318 12:15:57.501026 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"a38e9aaea8b95e0b076fbf891040bba4c8533e1aced2872e16687882d0277be6"} Mar 18 12:15:57 crc kubenswrapper[4975]: I0318 12:15:57.501059 4975 scope.go:117] "RemoveContainer" containerID="11e54722965cc4c9942ec1c784a81cd9bc46c145a177bf79205b6fcd690c8875" Mar 18 12:15:57 crc kubenswrapper[4975]: 
I0318 12:15:57.501502 4975 scope.go:117] "RemoveContainer" containerID="a38e9aaea8b95e0b076fbf891040bba4c8533e1aced2872e16687882d0277be6" Mar 18 12:15:57 crc kubenswrapper[4975]: E0318 12:15:57.501713 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:15:57 crc kubenswrapper[4975]: I0318 12:15:57.582375 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 12:15:57 crc kubenswrapper[4975]: I0318 12:15:57.782302 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 12:15:57 crc kubenswrapper[4975]: I0318 12:15:57.782977 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 18 12:15:57 crc kubenswrapper[4975]: I0318 12:15:57.789884 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 18 12:15:57 crc kubenswrapper[4975]: I0318 12:15:57.790614 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 18 12:15:57 crc kubenswrapper[4975]: I0318 12:15:57.832731 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 18 12:15:58 crc kubenswrapper[4975]: I0318 12:15:58.016761 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 12:15:58 crc kubenswrapper[4975]: I0318 12:15:58.125843 4975 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 18 12:15:58 crc kubenswrapper[4975]: I0318 12:15:58.235048 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 18 12:15:58 crc kubenswrapper[4975]: I0318 12:15:58.270467 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 18 12:15:58 crc kubenswrapper[4975]: I0318 12:15:58.271647 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 18 12:15:58 crc kubenswrapper[4975]: I0318 12:15:58.280907 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 18 12:15:58 crc kubenswrapper[4975]: I0318 12:15:58.286386 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 18 12:15:58 crc kubenswrapper[4975]: I0318 12:15:58.312241 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 18 12:15:58 crc kubenswrapper[4975]: I0318 12:15:58.318531 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 18 12:15:58 crc kubenswrapper[4975]: I0318 12:15:58.352177 4975 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 18 12:15:58 crc kubenswrapper[4975]: I0318 12:15:58.365277 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 18 12:15:58 crc kubenswrapper[4975]: I0318 12:15:58.458450 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 18 12:15:58 crc kubenswrapper[4975]: I0318 
12:15:58.486401 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 18 12:15:58 crc kubenswrapper[4975]: I0318 12:15:58.490699 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 18 12:15:58 crc kubenswrapper[4975]: I0318 12:15:58.505719 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 18 12:15:58 crc kubenswrapper[4975]: I0318 12:15:58.510554 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 18 12:15:58 crc kubenswrapper[4975]: I0318 12:15:58.654912 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 18 12:15:58 crc kubenswrapper[4975]: I0318 12:15:58.725821 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 12:15:58 crc kubenswrapper[4975]: I0318 12:15:58.800581 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 18 12:15:58 crc kubenswrapper[4975]: I0318 12:15:58.831131 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 18 12:15:58 crc kubenswrapper[4975]: I0318 12:15:58.857448 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 18 12:15:58 crc kubenswrapper[4975]: I0318 12:15:58.882457 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 18 12:15:58 crc kubenswrapper[4975]: I0318 12:15:58.970281 4975 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 18 12:15:58 crc kubenswrapper[4975]: I0318 12:15:58.999578 4975 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 12:15:59.000919 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 12:15:59.025924 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 12:15:59.054620 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 12:15:59.066656 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 12:15:59.066713 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 12:15:59.079996 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 12:15:59.121733 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 12:15:59.135057 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 12:15:59.210235 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 
12:15:59.217707 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 12:15:59.314822 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 12:15:59.330036 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 12:15:59.344015 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 12:15:59.383423 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 12:15:59.404983 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 12:15:59.418716 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 12:15:59.492573 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 12:15:59.495658 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 12:15:59.517605 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 12:15:59.521639 4975 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 12:15:59.559105 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 12:15:59.593478 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 12:15:59.603796 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 12:15:59.721747 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 12:15:59.784588 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 12:15:59.945640 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 18 12:15:59 crc kubenswrapper[4975]: I0318 12:15:59.945682 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 18 12:16:00 crc kubenswrapper[4975]: I0318 12:16:00.228154 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 18 12:16:00 crc kubenswrapper[4975]: I0318 12:16:00.377251 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 18 12:16:00 crc kubenswrapper[4975]: I0318 12:16:00.454821 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 18 12:16:00 crc kubenswrapper[4975]: I0318 12:16:00.472731 4975 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 12:16:00 crc kubenswrapper[4975]: I0318 12:16:00.577626 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 18 12:16:00 crc kubenswrapper[4975]: I0318 12:16:00.616101 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 12:16:00 crc kubenswrapper[4975]: I0318 12:16:00.627204 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 18 12:16:00 crc kubenswrapper[4975]: I0318 12:16:00.757622 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 18 12:16:00 crc kubenswrapper[4975]: I0318 12:16:00.761471 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 18 12:16:00 crc kubenswrapper[4975]: I0318 12:16:00.815836 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 18 12:16:00 crc kubenswrapper[4975]: I0318 12:16:00.823312 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 18 12:16:00 crc kubenswrapper[4975]: I0318 12:16:00.847349 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 18 12:16:00 crc kubenswrapper[4975]: I0318 12:16:00.861749 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 12:16:00 crc kubenswrapper[4975]: I0318 12:16:00.904157 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 18 
12:16:00 crc kubenswrapper[4975]: I0318 12:16:00.946806 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 18 12:16:00 crc kubenswrapper[4975]: I0318 12:16:00.987196 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 18 12:16:01 crc kubenswrapper[4975]: I0318 12:16:01.058710 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 18 12:16:01 crc kubenswrapper[4975]: I0318 12:16:01.127311 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 18 12:16:01 crc kubenswrapper[4975]: I0318 12:16:01.169425 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 18 12:16:01 crc kubenswrapper[4975]: I0318 12:16:01.403040 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 18 12:16:01 crc kubenswrapper[4975]: I0318 12:16:01.409911 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 18 12:16:01 crc kubenswrapper[4975]: I0318 12:16:01.450065 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 18 12:16:01 crc kubenswrapper[4975]: I0318 12:16:01.473245 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 12:16:01 crc kubenswrapper[4975]: I0318 12:16:01.529028 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 18 12:16:01 crc kubenswrapper[4975]: I0318 12:16:01.564612 4975 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"encryption-config-1" Mar 18 12:16:01 crc kubenswrapper[4975]: I0318 12:16:01.588929 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 18 12:16:01 crc kubenswrapper[4975]: I0318 12:16:01.627127 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 12:16:01 crc kubenswrapper[4975]: I0318 12:16:01.628503 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 18 12:16:01 crc kubenswrapper[4975]: I0318 12:16:01.683810 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 18 12:16:01 crc kubenswrapper[4975]: I0318 12:16:01.699191 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 18 12:16:01 crc kubenswrapper[4975]: I0318 12:16:01.714441 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 18 12:16:01 crc kubenswrapper[4975]: I0318 12:16:01.819195 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 18 12:16:01 crc kubenswrapper[4975]: I0318 12:16:01.846561 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 18 12:16:01 crc kubenswrapper[4975]: I0318 12:16:01.913034 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 18 12:16:01 crc kubenswrapper[4975]: I0318 12:16:01.922349 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 12:16:01 crc kubenswrapper[4975]: I0318 12:16:01.954455 4975 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 18 12:16:01 crc kubenswrapper[4975]: I0318 12:16:01.994481 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 18 12:16:02 crc kubenswrapper[4975]: I0318 12:16:02.038526 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 18 12:16:02 crc kubenswrapper[4975]: I0318 12:16:02.277120 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 18 12:16:02 crc kubenswrapper[4975]: I0318 12:16:02.395379 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 18 12:16:02 crc kubenswrapper[4975]: I0318 12:16:02.396037 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 18 12:16:02 crc kubenswrapper[4975]: I0318 12:16:02.427669 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 18 12:16:02 crc kubenswrapper[4975]: I0318 12:16:02.458411 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 18 12:16:02 crc kubenswrapper[4975]: I0318 12:16:02.489845 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 12:16:02 crc kubenswrapper[4975]: I0318 12:16:02.526558 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 12:16:02 crc kubenswrapper[4975]: I0318 12:16:02.596477 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 12:16:02 crc kubenswrapper[4975]: I0318 12:16:02.650043 4975 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 18 12:16:02 crc kubenswrapper[4975]: I0318 12:16:02.666372 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 18 12:16:02 crc kubenswrapper[4975]: I0318 12:16:02.674965 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 18 12:16:02 crc kubenswrapper[4975]: I0318 12:16:02.698763 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 18 12:16:02 crc kubenswrapper[4975]: I0318 12:16:02.736779 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 18 12:16:02 crc kubenswrapper[4975]: I0318 12:16:02.750167 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 18 12:16:02 crc kubenswrapper[4975]: I0318 12:16:02.767398 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 18 12:16:02 crc kubenswrapper[4975]: I0318 12:16:02.931344 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 12:16:02 crc kubenswrapper[4975]: I0318 12:16:02.940035 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 18 12:16:03 crc kubenswrapper[4975]: I0318 12:16:03.027907 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 18 12:16:03 crc kubenswrapper[4975]: I0318 12:16:03.285587 4975 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 18 12:16:03 crc kubenswrapper[4975]: I0318 12:16:03.297986 4975 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 18 12:16:03 crc kubenswrapper[4975]: I0318 12:16:03.366113 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 18 12:16:03 crc kubenswrapper[4975]: I0318 12:16:03.427168 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 18 12:16:03 crc kubenswrapper[4975]: I0318 12:16:03.577550 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 12:16:03 crc kubenswrapper[4975]: I0318 12:16:03.619680 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 12:16:03 crc kubenswrapper[4975]: I0318 12:16:03.676242 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 18 12:16:03 crc kubenswrapper[4975]: I0318 12:16:03.696964 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 18 12:16:03 crc kubenswrapper[4975]: I0318 12:16:03.743452 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 18 12:16:03 crc kubenswrapper[4975]: I0318 12:16:03.766617 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 18 12:16:03 crc kubenswrapper[4975]: I0318 12:16:03.784598 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 18 12:16:03 crc kubenswrapper[4975]: I0318 12:16:03.867439 4975 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 18 12:16:03 crc kubenswrapper[4975]: I0318 12:16:03.871232 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 12:16:03 crc kubenswrapper[4975]: I0318 12:16:03.905754 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 18 12:16:03 crc kubenswrapper[4975]: I0318 12:16:03.941963 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 18 12:16:03 crc kubenswrapper[4975]: I0318 12:16:03.964109 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 18 12:16:04 crc kubenswrapper[4975]: I0318 12:16:04.096704 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 18 12:16:04 crc kubenswrapper[4975]: I0318 12:16:04.155797 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 18 12:16:04 crc kubenswrapper[4975]: I0318 12:16:04.206970 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 18 12:16:04 crc kubenswrapper[4975]: I0318 12:16:04.276589 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 18 12:16:04 crc kubenswrapper[4975]: I0318 12:16:04.444167 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 18 12:16:04 crc kubenswrapper[4975]: I0318 12:16:04.469068 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 18 12:16:04 crc kubenswrapper[4975]: I0318 12:16:04.499392 4975 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 18 12:16:04 crc kubenswrapper[4975]: I0318 12:16:04.636886 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 18 12:16:04 crc kubenswrapper[4975]: I0318 12:16:04.644369 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 18 12:16:04 crc kubenswrapper[4975]: I0318 12:16:04.677684 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 18 12:16:04 crc kubenswrapper[4975]: I0318 12:16:04.701542 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 18 12:16:04 crc kubenswrapper[4975]: I0318 12:16:04.826042 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 12:16:04 crc kubenswrapper[4975]: I0318 12:16:04.984299 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 18 12:16:05 crc kubenswrapper[4975]: I0318 12:16:05.052023 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 18 12:16:05 crc kubenswrapper[4975]: I0318 12:16:05.179421 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 12:16:05 crc kubenswrapper[4975]: I0318 12:16:05.199394 4975 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 18 12:16:05 crc kubenswrapper[4975]: I0318 12:16:05.202898 4975 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 18 12:16:05 crc kubenswrapper[4975]: I0318 12:16:05.468911 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 18 12:16:05 crc kubenswrapper[4975]: I0318 12:16:05.540076 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 18 12:16:05 crc kubenswrapper[4975]: I0318 12:16:05.619924 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 18 12:16:05 crc kubenswrapper[4975]: I0318 12:16:05.631049 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 18 12:16:05 crc kubenswrapper[4975]: I0318 12:16:05.732967 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 18 12:16:05 crc kubenswrapper[4975]: I0318 12:16:05.942444 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 18 12:16:06 crc kubenswrapper[4975]: I0318 12:16:06.250967 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 12:16:06 crc kubenswrapper[4975]: I0318 12:16:06.252323 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 18 12:16:06 crc kubenswrapper[4975]: I0318 12:16:06.253326 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 18 12:16:06 crc kubenswrapper[4975]: I0318 12:16:06.331533 4975 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 18 12:16:06 crc kubenswrapper[4975]: I0318 12:16:06.375239 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 18 12:16:06 crc kubenswrapper[4975]: I0318 12:16:06.397317 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 12:16:06 crc kubenswrapper[4975]: I0318 12:16:06.397667 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 18 12:16:06 crc kubenswrapper[4975]: I0318 12:16:06.407363 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 18 12:16:06 crc kubenswrapper[4975]: I0318 12:16:06.433585 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 18 12:16:06 crc kubenswrapper[4975]: I0318 12:16:06.448067 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 18 12:16:06 crc kubenswrapper[4975]: I0318 12:16:06.510246 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 18 12:16:06 crc kubenswrapper[4975]: I0318 12:16:06.742004 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 18 12:16:06 crc kubenswrapper[4975]: I0318 12:16:06.816424 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 18 12:16:06 crc kubenswrapper[4975]: I0318 12:16:06.962851 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 12:16:07 crc kubenswrapper[4975]: I0318 12:16:07.042895 4975 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 12:16:07 crc kubenswrapper[4975]: I0318 12:16:07.084768 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 18 12:16:07 crc kubenswrapper[4975]: I0318 12:16:07.193928 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 18 12:16:07 crc kubenswrapper[4975]: I0318 12:16:07.224938 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 18 12:16:07 crc kubenswrapper[4975]: I0318 12:16:07.238763 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 18 12:16:07 crc kubenswrapper[4975]: I0318 12:16:07.380194 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 18 12:16:07 crc kubenswrapper[4975]: I0318 12:16:07.632672 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 18 12:16:07 crc kubenswrapper[4975]: I0318 12:16:07.725258 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 18 12:16:07 crc kubenswrapper[4975]: I0318 12:16:07.903300 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 18 12:16:07 crc kubenswrapper[4975]: I0318 12:16:07.904470 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 18 12:16:07 crc kubenswrapper[4975]: I0318 12:16:07.947751 4975 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 18 12:16:08 crc kubenswrapper[4975]: I0318 12:16:08.166006 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 12:16:09 crc kubenswrapper[4975]: I0318 12:16:09.016002 4975 scope.go:117] "RemoveContainer" containerID="a38e9aaea8b95e0b076fbf891040bba4c8533e1aced2872e16687882d0277be6" Mar 18 12:16:09 crc kubenswrapper[4975]: E0318 12:16:09.016187 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 12:16:09 crc kubenswrapper[4975]: I0318 12:16:09.096714 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 18 12:16:09 crc kubenswrapper[4975]: I0318 12:16:09.104908 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 12:16:09 crc kubenswrapper[4975]: I0318 12:16:09.315428 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 18 12:16:09 crc kubenswrapper[4975]: I0318 12:16:09.523656 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 18 12:16:09 crc kubenswrapper[4975]: I0318 12:16:09.867699 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.197473 4975 
reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.201427 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6f4678566-kqp7h" podStartSLOduration=52.201403806 podStartE2EDuration="52.201403806s" podCreationTimestamp="2026-03-18 12:15:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:15:42.092147204 +0000 UTC m=+327.806547783" watchObservedRunningTime="2026-03-18 12:16:10.201403806 +0000 UTC m=+355.915804415" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.202828 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-57ffb9cc86-qc5lg" podStartSLOduration=52.202817924 podStartE2EDuration="52.202817924s" podCreationTimestamp="2026-03-18 12:15:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:15:42.113780088 +0000 UTC m=+327.828180667" watchObservedRunningTime="2026-03-18 12:16:10.202817924 +0000 UTC m=+355.917218533" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.204559 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=49.204552491 podStartE2EDuration="49.204552491s" podCreationTimestamp="2026-03-18 12:15:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:15:42.07680841 +0000 UTC m=+327.791208989" watchObservedRunningTime="2026-03-18 12:16:10.204552491 +0000 UTC m=+355.918953110" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.205161 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-mg2g2"] Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.205231 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9","openshift-infra/auto-csr-approver-29563936-4kt6x","openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 12:16:10 crc kubenswrapper[4975]: E0318 12:16:10.205527 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14d667ef-8c80-42f5-b119-1bae87e39be7" containerName="oauth-openshift" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.205555 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="14d667ef-8c80-42f5-b119-1bae87e39be7" containerName="oauth-openshift" Mar 18 12:16:10 crc kubenswrapper[4975]: E0318 12:16:10.205577 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="975a4029-93d2-4e67-ab56-85d09a7af50b" containerName="installer" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.205590 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="975a4029-93d2-4e67-ab56-85d09a7af50b" containerName="installer" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.205635 4975 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c374ff0-a569-44d6-a341-482a4cf71b70" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.205660 4975 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c374ff0-a569-44d6-a341-482a4cf71b70" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.205764 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="14d667ef-8c80-42f5-b119-1bae87e39be7" containerName="oauth-openshift" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.205791 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="975a4029-93d2-4e67-ab56-85d09a7af50b" containerName="installer" Mar 18 
12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.206483 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.207040 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563936-4kt6x" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.209683 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.209781 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.209905 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.210174 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.211169 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.211271 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.211470 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.211538 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.211604 4975 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.211799 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.212339 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.212672 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.212847 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.213346 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.214511 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.214730 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.220046 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.227245 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.233257 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.233320 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.233385 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.233440 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-user-template-error\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.233499 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.233536 4975 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-user-template-login\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.233585 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlm7b\" (UniqueName: \"kubernetes.io/projected/649231fb-eaf6-45ff-9393-3df37c421381-kube-api-access-mlm7b\") pod \"auto-csr-approver-29563936-4kt6x\" (UID: \"649231fb-eaf6-45ff-9393-3df37c421381\") " pod="openshift-infra/auto-csr-approver-29563936-4kt6x" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.233646 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bf732f8d-3072-4e85-a294-01d713b3588d-audit-policies\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.233670 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bf732f8d-3072-4e85-a294-01d713b3588d-audit-dir\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.233713 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkdh7\" (UniqueName: \"kubernetes.io/projected/bf732f8d-3072-4e85-a294-01d713b3588d-kube-api-access-mkdh7\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: 
\"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.233742 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-system-router-certs\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.233761 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.233776 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.233811 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-system-service-ca\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc 
kubenswrapper[4975]: I0318 12:16:10.233839 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-system-session\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.233882 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.238668 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=29.238646902 podStartE2EDuration="29.238646902s" podCreationTimestamp="2026-03-18 12:15:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:16:10.230649926 +0000 UTC m=+355.945050515" watchObservedRunningTime="2026-03-18 12:16:10.238646902 +0000 UTC m=+355.953047481" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.334502 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-system-session\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.334555 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.334580 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.334604 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.334631 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-user-template-error\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.334647 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.334664 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-user-template-login\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.334685 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlm7b\" (UniqueName: \"kubernetes.io/projected/649231fb-eaf6-45ff-9393-3df37c421381-kube-api-access-mlm7b\") pod \"auto-csr-approver-29563936-4kt6x\" (UID: \"649231fb-eaf6-45ff-9393-3df37c421381\") " pod="openshift-infra/auto-csr-approver-29563936-4kt6x" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.334707 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bf732f8d-3072-4e85-a294-01d713b3588d-audit-policies\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.334723 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bf732f8d-3072-4e85-a294-01d713b3588d-audit-dir\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.334746 
4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkdh7\" (UniqueName: \"kubernetes.io/projected/bf732f8d-3072-4e85-a294-01d713b3588d-kube-api-access-mkdh7\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.334764 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-system-router-certs\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.334778 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.334794 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.334808 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-system-service-ca\") 
pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.335489 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-system-service-ca\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.336034 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bf732f8d-3072-4e85-a294-01d713b3588d-audit-policies\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.336330 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bf732f8d-3072-4e85-a294-01d713b3588d-audit-dir\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.336572 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.340634 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" 
(UniqueName: \"kubernetes.io/secret/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-system-router-certs\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.340686 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.341137 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.342311 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.342720 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-user-template-error\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " 
pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.343433 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-system-session\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.344540 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.345529 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.346006 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bf732f8d-3072-4e85-a294-01d713b3588d-v4-0-config-user-template-login\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.352537 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlm7b\" (UniqueName: 
\"kubernetes.io/projected/649231fb-eaf6-45ff-9393-3df37c421381-kube-api-access-mlm7b\") pod \"auto-csr-approver-29563936-4kt6x\" (UID: \"649231fb-eaf6-45ff-9393-3df37c421381\") " pod="openshift-infra/auto-csr-approver-29563936-4kt6x" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.352952 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkdh7\" (UniqueName: \"kubernetes.io/projected/bf732f8d-3072-4e85-a294-01d713b3588d-kube-api-access-mkdh7\") pod \"oauth-openshift-5584c6b7fb-7fpf9\" (UID: \"bf732f8d-3072-4e85-a294-01d713b3588d\") " pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.527032 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.538147 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563936-4kt6x" Mar 18 12:16:10 crc kubenswrapper[4975]: I0318 12:16:10.945095 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563936-4kt6x"] Mar 18 12:16:11 crc kubenswrapper[4975]: I0318 12:16:11.024445 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14d667ef-8c80-42f5-b119-1bae87e39be7" path="/var/lib/kubelet/pods/14d667ef-8c80-42f5-b119-1bae87e39be7/volumes" Mar 18 12:16:11 crc kubenswrapper[4975]: I0318 12:16:11.025118 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9"] Mar 18 12:16:11 crc kubenswrapper[4975]: W0318 12:16:11.026291 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf732f8d_3072_4e85_a294_01d713b3588d.slice/crio-557d1e18c737e8c1b963f594742ccefadbf63e04b6737ff979c1efc68867b906 WatchSource:0}: Error finding 
container 557d1e18c737e8c1b963f594742ccefadbf63e04b6737ff979c1efc68867b906: Status 404 returned error can't find the container with id 557d1e18c737e8c1b963f594742ccefadbf63e04b6737ff979c1efc68867b906 Mar 18 12:16:11 crc kubenswrapper[4975]: I0318 12:16:11.580014 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" event={"ID":"bf732f8d-3072-4e85-a294-01d713b3588d","Type":"ContainerStarted","Data":"b7663b8a1642ca19829587f2e35ce51e3a71b6560fd084998c06ad57a8361e34"} Mar 18 12:16:11 crc kubenswrapper[4975]: I0318 12:16:11.580094 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" event={"ID":"bf732f8d-3072-4e85-a294-01d713b3588d","Type":"ContainerStarted","Data":"557d1e18c737e8c1b963f594742ccefadbf63e04b6737ff979c1efc68867b906"} Mar 18 12:16:11 crc kubenswrapper[4975]: I0318 12:16:11.580121 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:11 crc kubenswrapper[4975]: I0318 12:16:11.581261 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563936-4kt6x" event={"ID":"649231fb-eaf6-45ff-9393-3df37c421381","Type":"ContainerStarted","Data":"c698c41d255ec1ee93f6f308a7ebec496271edf77951406eeda0f2eb77a27ae7"} Mar 18 12:16:11 crc kubenswrapper[4975]: I0318 12:16:11.599420 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" podStartSLOduration=51.599403542 podStartE2EDuration="51.599403542s" podCreationTimestamp="2026-03-18 12:15:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:16:11.599122325 +0000 UTC m=+357.313522924" watchObservedRunningTime="2026-03-18 12:16:11.599403542 +0000 UTC m=+357.313804121" Mar 18 12:16:11 crc 
kubenswrapper[4975]: I0318 12:16:11.898325 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5584c6b7fb-7fpf9" Mar 18 12:16:14 crc kubenswrapper[4975]: I0318 12:16:14.600348 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563936-4kt6x" event={"ID":"649231fb-eaf6-45ff-9393-3df37c421381","Type":"ContainerStarted","Data":"20a35710d4e8f5e8d3b3a617c84ab866426452116db57c4b93153314d70f1d2a"} Mar 18 12:16:14 crc kubenswrapper[4975]: I0318 12:16:14.613897 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563936-4kt6x" podStartSLOduration=11.309615774 podStartE2EDuration="14.61387534s" podCreationTimestamp="2026-03-18 12:16:00 +0000 UTC" firstStartedPulling="2026-03-18 12:16:10.952577189 +0000 UTC m=+356.666977768" lastFinishedPulling="2026-03-18 12:16:14.256836735 +0000 UTC m=+359.971237334" observedRunningTime="2026-03-18 12:16:14.6131188 +0000 UTC m=+360.327519379" watchObservedRunningTime="2026-03-18 12:16:14.61387534 +0000 UTC m=+360.328275929" Mar 18 12:16:15 crc kubenswrapper[4975]: I0318 12:16:15.608899 4975 generic.go:334] "Generic (PLEG): container finished" podID="649231fb-eaf6-45ff-9393-3df37c421381" containerID="20a35710d4e8f5e8d3b3a617c84ab866426452116db57c4b93153314d70f1d2a" exitCode=0 Mar 18 12:16:15 crc kubenswrapper[4975]: I0318 12:16:15.608953 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563936-4kt6x" event={"ID":"649231fb-eaf6-45ff-9393-3df37c421381","Type":"ContainerDied","Data":"20a35710d4e8f5e8d3b3a617c84ab866426452116db57c4b93153314d70f1d2a"} Mar 18 12:16:15 crc kubenswrapper[4975]: I0318 12:16:15.972104 4975 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 12:16:15 crc kubenswrapper[4975]: I0318 12:16:15.972356 4975 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://8b7049473cef1ae114b1cb94f9ff4f8a9f55974379b35fbab41650ce5758bf30" gracePeriod=5 Mar 18 12:16:16 crc kubenswrapper[4975]: I0318 12:16:16.958624 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563936-4kt6x" Mar 18 12:16:17 crc kubenswrapper[4975]: I0318 12:16:17.031535 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlm7b\" (UniqueName: \"kubernetes.io/projected/649231fb-eaf6-45ff-9393-3df37c421381-kube-api-access-mlm7b\") pod \"649231fb-eaf6-45ff-9393-3df37c421381\" (UID: \"649231fb-eaf6-45ff-9393-3df37c421381\") " Mar 18 12:16:17 crc kubenswrapper[4975]: I0318 12:16:17.037907 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/649231fb-eaf6-45ff-9393-3df37c421381-kube-api-access-mlm7b" (OuterVolumeSpecName: "kube-api-access-mlm7b") pod "649231fb-eaf6-45ff-9393-3df37c421381" (UID: "649231fb-eaf6-45ff-9393-3df37c421381"). InnerVolumeSpecName "kube-api-access-mlm7b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:16:17 crc kubenswrapper[4975]: I0318 12:16:17.133419 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlm7b\" (UniqueName: \"kubernetes.io/projected/649231fb-eaf6-45ff-9393-3df37c421381-kube-api-access-mlm7b\") on node \"crc\" DevicePath \"\"" Mar 18 12:16:17 crc kubenswrapper[4975]: I0318 12:16:17.620676 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563936-4kt6x" event={"ID":"649231fb-eaf6-45ff-9393-3df37c421381","Type":"ContainerDied","Data":"c698c41d255ec1ee93f6f308a7ebec496271edf77951406eeda0f2eb77a27ae7"} Mar 18 12:16:17 crc kubenswrapper[4975]: I0318 12:16:17.620715 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c698c41d255ec1ee93f6f308a7ebec496271edf77951406eeda0f2eb77a27ae7" Mar 18 12:16:17 crc kubenswrapper[4975]: I0318 12:16:17.620766 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563936-4kt6x" Mar 18 12:16:20 crc kubenswrapper[4975]: I0318 12:16:20.367991 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 12:16:21 crc kubenswrapper[4975]: I0318 12:16:21.569762 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 12:16:21 crc kubenswrapper[4975]: I0318 12:16:21.570095 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:16:21 crc kubenswrapper[4975]: I0318 12:16:21.583428 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 12:16:21 crc kubenswrapper[4975]: I0318 12:16:21.583470 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 12:16:21 crc kubenswrapper[4975]: I0318 12:16:21.583493 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 12:16:21 crc kubenswrapper[4975]: I0318 12:16:21.583543 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 12:16:21 crc kubenswrapper[4975]: I0318 12:16:21.583573 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 12:16:21 crc kubenswrapper[4975]: I0318 12:16:21.583811 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: 
"manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:16:21 crc kubenswrapper[4975]: I0318 12:16:21.584300 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:16:21 crc kubenswrapper[4975]: I0318 12:16:21.584371 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:16:21 crc kubenswrapper[4975]: I0318 12:16:21.584391 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:16:21 crc kubenswrapper[4975]: I0318 12:16:21.590984 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:16:21 crc kubenswrapper[4975]: I0318 12:16:21.641531 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 12:16:21 crc kubenswrapper[4975]: I0318 12:16:21.641575 4975 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="8b7049473cef1ae114b1cb94f9ff4f8a9f55974379b35fbab41650ce5758bf30" exitCode=137 Mar 18 12:16:21 crc kubenswrapper[4975]: I0318 12:16:21.641615 4975 scope.go:117] "RemoveContainer" containerID="8b7049473cef1ae114b1cb94f9ff4f8a9f55974379b35fbab41650ce5758bf30" Mar 18 12:16:21 crc kubenswrapper[4975]: I0318 12:16:21.641707 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 12:16:21 crc kubenswrapper[4975]: I0318 12:16:21.659785 4975 scope.go:117] "RemoveContainer" containerID="8b7049473cef1ae114b1cb94f9ff4f8a9f55974379b35fbab41650ce5758bf30" Mar 18 12:16:21 crc kubenswrapper[4975]: E0318 12:16:21.660194 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b7049473cef1ae114b1cb94f9ff4f8a9f55974379b35fbab41650ce5758bf30\": container with ID starting with 8b7049473cef1ae114b1cb94f9ff4f8a9f55974379b35fbab41650ce5758bf30 not found: ID does not exist" containerID="8b7049473cef1ae114b1cb94f9ff4f8a9f55974379b35fbab41650ce5758bf30" Mar 18 12:16:21 crc kubenswrapper[4975]: I0318 12:16:21.660225 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b7049473cef1ae114b1cb94f9ff4f8a9f55974379b35fbab41650ce5758bf30"} err="failed to get container status \"8b7049473cef1ae114b1cb94f9ff4f8a9f55974379b35fbab41650ce5758bf30\": rpc error: code = NotFound desc = could not find container 
\"8b7049473cef1ae114b1cb94f9ff4f8a9f55974379b35fbab41650ce5758bf30\": container with ID starting with 8b7049473cef1ae114b1cb94f9ff4f8a9f55974379b35fbab41650ce5758bf30 not found: ID does not exist" Mar 18 12:16:21 crc kubenswrapper[4975]: I0318 12:16:21.685014 4975 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 18 12:16:21 crc kubenswrapper[4975]: I0318 12:16:21.685048 4975 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 12:16:21 crc kubenswrapper[4975]: I0318 12:16:21.685058 4975 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 18 12:16:21 crc kubenswrapper[4975]: I0318 12:16:21.685067 4975 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 12:16:21 crc kubenswrapper[4975]: I0318 12:16:21.685076 4975 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 18 12:16:23 crc kubenswrapper[4975]: I0318 12:16:23.017221 4975 scope.go:117] "RemoveContainer" containerID="a38e9aaea8b95e0b076fbf891040bba4c8533e1aced2872e16687882d0277be6" Mar 18 12:16:23 crc kubenswrapper[4975]: I0318 12:16:23.022528 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 18 12:16:23 crc kubenswrapper[4975]: I0318 12:16:23.022749 4975 mirror_client.go:130] 
"Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 18 12:16:23 crc kubenswrapper[4975]: I0318 12:16:23.032797 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 12:16:23 crc kubenswrapper[4975]: I0318 12:16:23.033118 4975 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="04cd3f2c-54b3-46e6-9c47-e21011effc2c" Mar 18 12:16:23 crc kubenswrapper[4975]: I0318 12:16:23.036494 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 12:16:23 crc kubenswrapper[4975]: I0318 12:16:23.036541 4975 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="04cd3f2c-54b3-46e6-9c47-e21011effc2c" Mar 18 12:16:23 crc kubenswrapper[4975]: I0318 12:16:23.660438 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 18 12:16:23 crc kubenswrapper[4975]: I0318 12:16:23.660495 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b0343b8508c3e6050a3c1bbd0fcf0bf8debc712f2262186e4d895edda29f6e2d"} Mar 18 12:16:30 crc kubenswrapper[4975]: I0318 12:16:30.696155 4975 generic.go:334] "Generic (PLEG): container finished" podID="d989095d-7ce2-4dd7-ac9e-5c747e900a61" containerID="3f2f31a4621d02dcf84343d3adb32877b55252d630db0a176c7cdfad689d74a1" exitCode=0 Mar 18 12:16:30 crc kubenswrapper[4975]: I0318 12:16:30.696235 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" event={"ID":"d989095d-7ce2-4dd7-ac9e-5c747e900a61","Type":"ContainerDied","Data":"3f2f31a4621d02dcf84343d3adb32877b55252d630db0a176c7cdfad689d74a1"} Mar 18 12:16:30 crc kubenswrapper[4975]: I0318 12:16:30.697306 4975 scope.go:117] "RemoveContainer" containerID="3f2f31a4621d02dcf84343d3adb32877b55252d630db0a176c7cdfad689d74a1" Mar 18 12:16:30 crc kubenswrapper[4975]: I0318 12:16:30.892155 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" Mar 18 12:16:30 crc kubenswrapper[4975]: I0318 12:16:30.892528 4975 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" Mar 18 12:16:31 crc kubenswrapper[4975]: I0318 12:16:31.702929 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" event={"ID":"d989095d-7ce2-4dd7-ac9e-5c747e900a61","Type":"ContainerStarted","Data":"ef6a0860d7ed55f90458e047ef4d9682a5a26de6b578d07e92ea3e43e4506dd9"} Mar 18 12:16:31 crc kubenswrapper[4975]: I0318 12:16:31.703428 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" Mar 18 12:16:31 crc kubenswrapper[4975]: I0318 12:16:31.711815 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" Mar 18 12:16:55 crc kubenswrapper[4975]: I0318 12:16:55.538759 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:16:55 crc kubenswrapper[4975]: I0318 12:16:55.539489 4975 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:17:25 crc kubenswrapper[4975]: I0318 12:17:25.539196 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:17:25 crc kubenswrapper[4975]: I0318 12:17:25.539848 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:17:29 crc kubenswrapper[4975]: I0318 12:17:29.757668 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4c2zb"] Mar 18 12:17:29 crc kubenswrapper[4975]: E0318 12:17:29.758202 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="649231fb-eaf6-45ff-9393-3df37c421381" containerName="oc" Mar 18 12:17:29 crc kubenswrapper[4975]: I0318 12:17:29.758215 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="649231fb-eaf6-45ff-9393-3df37c421381" containerName="oc" Mar 18 12:17:29 crc kubenswrapper[4975]: E0318 12:17:29.758236 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 12:17:29 crc kubenswrapper[4975]: I0318 12:17:29.758242 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 12:17:29 
crc kubenswrapper[4975]: I0318 12:17:29.758337 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 12:17:29 crc kubenswrapper[4975]: I0318 12:17:29.758354 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="649231fb-eaf6-45ff-9393-3df37c421381" containerName="oc" Mar 18 12:17:29 crc kubenswrapper[4975]: I0318 12:17:29.758781 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" Mar 18 12:17:29 crc kubenswrapper[4975]: I0318 12:17:29.887274 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4c2zb"] Mar 18 12:17:29 crc kubenswrapper[4975]: I0318 12:17:29.944558 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv2ph\" (UniqueName: \"kubernetes.io/projected/c9586043-2b95-4d8b-9213-13fe01a74989-kube-api-access-zv2ph\") pod \"image-registry-66df7c8f76-4c2zb\" (UID: \"c9586043-2b95-4d8b-9213-13fe01a74989\") " pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" Mar 18 12:17:29 crc kubenswrapper[4975]: I0318 12:17:29.944610 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9586043-2b95-4d8b-9213-13fe01a74989-bound-sa-token\") pod \"image-registry-66df7c8f76-4c2zb\" (UID: \"c9586043-2b95-4d8b-9213-13fe01a74989\") " pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" Mar 18 12:17:29 crc kubenswrapper[4975]: I0318 12:17:29.944644 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c9586043-2b95-4d8b-9213-13fe01a74989-installation-pull-secrets\") pod \"image-registry-66df7c8f76-4c2zb\" (UID: 
\"c9586043-2b95-4d8b-9213-13fe01a74989\") " pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" Mar 18 12:17:29 crc kubenswrapper[4975]: I0318 12:17:29.944695 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-4c2zb\" (UID: \"c9586043-2b95-4d8b-9213-13fe01a74989\") " pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" Mar 18 12:17:29 crc kubenswrapper[4975]: I0318 12:17:29.944738 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9586043-2b95-4d8b-9213-13fe01a74989-trusted-ca\") pod \"image-registry-66df7c8f76-4c2zb\" (UID: \"c9586043-2b95-4d8b-9213-13fe01a74989\") " pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" Mar 18 12:17:29 crc kubenswrapper[4975]: I0318 12:17:29.944892 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c9586043-2b95-4d8b-9213-13fe01a74989-registry-tls\") pod \"image-registry-66df7c8f76-4c2zb\" (UID: \"c9586043-2b95-4d8b-9213-13fe01a74989\") " pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" Mar 18 12:17:29 crc kubenswrapper[4975]: I0318 12:17:29.945038 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c9586043-2b95-4d8b-9213-13fe01a74989-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4c2zb\" (UID: \"c9586043-2b95-4d8b-9213-13fe01a74989\") " pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" Mar 18 12:17:29 crc kubenswrapper[4975]: I0318 12:17:29.945068 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c9586043-2b95-4d8b-9213-13fe01a74989-registry-certificates\") pod \"image-registry-66df7c8f76-4c2zb\" (UID: \"c9586043-2b95-4d8b-9213-13fe01a74989\") " pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" Mar 18 12:17:29 crc kubenswrapper[4975]: I0318 12:17:29.966353 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-4c2zb\" (UID: \"c9586043-2b95-4d8b-9213-13fe01a74989\") " pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" Mar 18 12:17:30 crc kubenswrapper[4975]: I0318 12:17:30.045995 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c9586043-2b95-4d8b-9213-13fe01a74989-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4c2zb\" (UID: \"c9586043-2b95-4d8b-9213-13fe01a74989\") " pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" Mar 18 12:17:30 crc kubenswrapper[4975]: I0318 12:17:30.046062 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c9586043-2b95-4d8b-9213-13fe01a74989-registry-certificates\") pod \"image-registry-66df7c8f76-4c2zb\" (UID: \"c9586043-2b95-4d8b-9213-13fe01a74989\") " pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" Mar 18 12:17:30 crc kubenswrapper[4975]: I0318 12:17:30.046101 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv2ph\" (UniqueName: \"kubernetes.io/projected/c9586043-2b95-4d8b-9213-13fe01a74989-kube-api-access-zv2ph\") pod \"image-registry-66df7c8f76-4c2zb\" (UID: \"c9586043-2b95-4d8b-9213-13fe01a74989\") " pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" 
Mar 18 12:17:30 crc kubenswrapper[4975]: I0318 12:17:30.046133 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9586043-2b95-4d8b-9213-13fe01a74989-bound-sa-token\") pod \"image-registry-66df7c8f76-4c2zb\" (UID: \"c9586043-2b95-4d8b-9213-13fe01a74989\") " pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" Mar 18 12:17:30 crc kubenswrapper[4975]: I0318 12:17:30.046156 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c9586043-2b95-4d8b-9213-13fe01a74989-installation-pull-secrets\") pod \"image-registry-66df7c8f76-4c2zb\" (UID: \"c9586043-2b95-4d8b-9213-13fe01a74989\") " pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" Mar 18 12:17:30 crc kubenswrapper[4975]: I0318 12:17:30.046213 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9586043-2b95-4d8b-9213-13fe01a74989-trusted-ca\") pod \"image-registry-66df7c8f76-4c2zb\" (UID: \"c9586043-2b95-4d8b-9213-13fe01a74989\") " pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" Mar 18 12:17:30 crc kubenswrapper[4975]: I0318 12:17:30.046239 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c9586043-2b95-4d8b-9213-13fe01a74989-registry-tls\") pod \"image-registry-66df7c8f76-4c2zb\" (UID: \"c9586043-2b95-4d8b-9213-13fe01a74989\") " pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" Mar 18 12:17:30 crc kubenswrapper[4975]: I0318 12:17:30.047569 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c9586043-2b95-4d8b-9213-13fe01a74989-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4c2zb\" (UID: \"c9586043-2b95-4d8b-9213-13fe01a74989\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" Mar 18 12:17:30 crc kubenswrapper[4975]: I0318 12:17:30.047584 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c9586043-2b95-4d8b-9213-13fe01a74989-registry-certificates\") pod \"image-registry-66df7c8f76-4c2zb\" (UID: \"c9586043-2b95-4d8b-9213-13fe01a74989\") " pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" Mar 18 12:17:30 crc kubenswrapper[4975]: I0318 12:17:30.047695 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9586043-2b95-4d8b-9213-13fe01a74989-trusted-ca\") pod \"image-registry-66df7c8f76-4c2zb\" (UID: \"c9586043-2b95-4d8b-9213-13fe01a74989\") " pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" Mar 18 12:17:30 crc kubenswrapper[4975]: I0318 12:17:30.053558 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c9586043-2b95-4d8b-9213-13fe01a74989-registry-tls\") pod \"image-registry-66df7c8f76-4c2zb\" (UID: \"c9586043-2b95-4d8b-9213-13fe01a74989\") " pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" Mar 18 12:17:30 crc kubenswrapper[4975]: I0318 12:17:30.054084 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c9586043-2b95-4d8b-9213-13fe01a74989-installation-pull-secrets\") pod \"image-registry-66df7c8f76-4c2zb\" (UID: \"c9586043-2b95-4d8b-9213-13fe01a74989\") " pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" Mar 18 12:17:30 crc kubenswrapper[4975]: I0318 12:17:30.062055 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9586043-2b95-4d8b-9213-13fe01a74989-bound-sa-token\") pod \"image-registry-66df7c8f76-4c2zb\" (UID: 
\"c9586043-2b95-4d8b-9213-13fe01a74989\") " pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" Mar 18 12:17:30 crc kubenswrapper[4975]: I0318 12:17:30.064038 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv2ph\" (UniqueName: \"kubernetes.io/projected/c9586043-2b95-4d8b-9213-13fe01a74989-kube-api-access-zv2ph\") pod \"image-registry-66df7c8f76-4c2zb\" (UID: \"c9586043-2b95-4d8b-9213-13fe01a74989\") " pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" Mar 18 12:17:30 crc kubenswrapper[4975]: I0318 12:17:30.074628 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" Mar 18 12:17:30 crc kubenswrapper[4975]: I0318 12:17:30.542133 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4c2zb"] Mar 18 12:17:31 crc kubenswrapper[4975]: I0318 12:17:31.044896 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" event={"ID":"c9586043-2b95-4d8b-9213-13fe01a74989","Type":"ContainerStarted","Data":"212b9728727fe502d595496f5bdad40015bb8432029d999cf0646790ec91727c"} Mar 18 12:17:31 crc kubenswrapper[4975]: I0318 12:17:31.045257 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" Mar 18 12:17:31 crc kubenswrapper[4975]: I0318 12:17:31.045274 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" event={"ID":"c9586043-2b95-4d8b-9213-13fe01a74989","Type":"ContainerStarted","Data":"35067ad21d09c28badbfd463f5a16059630dd1f0f0077529f1be8fc874138f59"} Mar 18 12:17:31 crc kubenswrapper[4975]: I0318 12:17:31.064850 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f75bb"] Mar 18 12:17:31 crc kubenswrapper[4975]: I0318 
12:17:31.065181 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f75bb" podUID="46171d59-3549-4843-b6eb-07b9eecd2560" containerName="registry-server" containerID="cri-o://625745afe2c70e7fbcbeb00971e7c15ae66a9f676e8a646d9caf41d1563d91cc" gracePeriod=2 Mar 18 12:17:31 crc kubenswrapper[4975]: I0318 12:17:31.082237 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" podStartSLOduration=2.082210364 podStartE2EDuration="2.082210364s" podCreationTimestamp="2026-03-18 12:17:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:17:31.077673633 +0000 UTC m=+436.792074222" watchObservedRunningTime="2026-03-18 12:17:31.082210364 +0000 UTC m=+436.796610943" Mar 18 12:17:31 crc kubenswrapper[4975]: I0318 12:17:31.377223 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f75bb" Mar 18 12:17:31 crc kubenswrapper[4975]: I0318 12:17:31.466733 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66pqw\" (UniqueName: \"kubernetes.io/projected/46171d59-3549-4843-b6eb-07b9eecd2560-kube-api-access-66pqw\") pod \"46171d59-3549-4843-b6eb-07b9eecd2560\" (UID: \"46171d59-3549-4843-b6eb-07b9eecd2560\") " Mar 18 12:17:31 crc kubenswrapper[4975]: I0318 12:17:31.466896 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46171d59-3549-4843-b6eb-07b9eecd2560-catalog-content\") pod \"46171d59-3549-4843-b6eb-07b9eecd2560\" (UID: \"46171d59-3549-4843-b6eb-07b9eecd2560\") " Mar 18 12:17:31 crc kubenswrapper[4975]: I0318 12:17:31.466968 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46171d59-3549-4843-b6eb-07b9eecd2560-utilities\") pod \"46171d59-3549-4843-b6eb-07b9eecd2560\" (UID: \"46171d59-3549-4843-b6eb-07b9eecd2560\") " Mar 18 12:17:31 crc kubenswrapper[4975]: I0318 12:17:31.467836 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46171d59-3549-4843-b6eb-07b9eecd2560-utilities" (OuterVolumeSpecName: "utilities") pod "46171d59-3549-4843-b6eb-07b9eecd2560" (UID: "46171d59-3549-4843-b6eb-07b9eecd2560"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:17:31 crc kubenswrapper[4975]: I0318 12:17:31.478403 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46171d59-3549-4843-b6eb-07b9eecd2560-kube-api-access-66pqw" (OuterVolumeSpecName: "kube-api-access-66pqw") pod "46171d59-3549-4843-b6eb-07b9eecd2560" (UID: "46171d59-3549-4843-b6eb-07b9eecd2560"). InnerVolumeSpecName "kube-api-access-66pqw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:17:31 crc kubenswrapper[4975]: I0318 12:17:31.494689 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46171d59-3549-4843-b6eb-07b9eecd2560-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "46171d59-3549-4843-b6eb-07b9eecd2560" (UID: "46171d59-3549-4843-b6eb-07b9eecd2560"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:17:31 crc kubenswrapper[4975]: I0318 12:17:31.567945 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46171d59-3549-4843-b6eb-07b9eecd2560-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:17:31 crc kubenswrapper[4975]: I0318 12:17:31.567987 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66pqw\" (UniqueName: \"kubernetes.io/projected/46171d59-3549-4843-b6eb-07b9eecd2560-kube-api-access-66pqw\") on node \"crc\" DevicePath \"\"" Mar 18 12:17:31 crc kubenswrapper[4975]: I0318 12:17:31.567998 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46171d59-3549-4843-b6eb-07b9eecd2560-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:17:32 crc kubenswrapper[4975]: I0318 12:17:32.051985 4975 generic.go:334] "Generic (PLEG): container finished" podID="46171d59-3549-4843-b6eb-07b9eecd2560" containerID="625745afe2c70e7fbcbeb00971e7c15ae66a9f676e8a646d9caf41d1563d91cc" exitCode=0 Mar 18 12:17:32 crc kubenswrapper[4975]: I0318 12:17:32.053118 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f75bb" Mar 18 12:17:32 crc kubenswrapper[4975]: I0318 12:17:32.053967 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f75bb" event={"ID":"46171d59-3549-4843-b6eb-07b9eecd2560","Type":"ContainerDied","Data":"625745afe2c70e7fbcbeb00971e7c15ae66a9f676e8a646d9caf41d1563d91cc"} Mar 18 12:17:32 crc kubenswrapper[4975]: I0318 12:17:32.054028 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f75bb" event={"ID":"46171d59-3549-4843-b6eb-07b9eecd2560","Type":"ContainerDied","Data":"4f447c6a072ea8793c553ead6eff49fcccb74d794cdc5d019bd35d7d6d90027c"} Mar 18 12:17:32 crc kubenswrapper[4975]: I0318 12:17:32.054046 4975 scope.go:117] "RemoveContainer" containerID="625745afe2c70e7fbcbeb00971e7c15ae66a9f676e8a646d9caf41d1563d91cc" Mar 18 12:17:32 crc kubenswrapper[4975]: I0318 12:17:32.072525 4975 scope.go:117] "RemoveContainer" containerID="b890c6e23112f402b556e7d927e2abe518eaf032d4e2de0a0fdbc4168240ce98" Mar 18 12:17:32 crc kubenswrapper[4975]: I0318 12:17:32.092383 4975 scope.go:117] "RemoveContainer" containerID="bb8e2327d4e9a55195ce477ceb4b071a32bcdbf68f7f687fa160837f468b5e0f" Mar 18 12:17:32 crc kubenswrapper[4975]: I0318 12:17:32.094605 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f75bb"] Mar 18 12:17:32 crc kubenswrapper[4975]: I0318 12:17:32.098599 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f75bb"] Mar 18 12:17:32 crc kubenswrapper[4975]: I0318 12:17:32.107293 4975 scope.go:117] "RemoveContainer" containerID="625745afe2c70e7fbcbeb00971e7c15ae66a9f676e8a646d9caf41d1563d91cc" Mar 18 12:17:32 crc kubenswrapper[4975]: E0318 12:17:32.107682 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"625745afe2c70e7fbcbeb00971e7c15ae66a9f676e8a646d9caf41d1563d91cc\": container with ID starting with 625745afe2c70e7fbcbeb00971e7c15ae66a9f676e8a646d9caf41d1563d91cc not found: ID does not exist" containerID="625745afe2c70e7fbcbeb00971e7c15ae66a9f676e8a646d9caf41d1563d91cc" Mar 18 12:17:32 crc kubenswrapper[4975]: I0318 12:17:32.107744 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625745afe2c70e7fbcbeb00971e7c15ae66a9f676e8a646d9caf41d1563d91cc"} err="failed to get container status \"625745afe2c70e7fbcbeb00971e7c15ae66a9f676e8a646d9caf41d1563d91cc\": rpc error: code = NotFound desc = could not find container \"625745afe2c70e7fbcbeb00971e7c15ae66a9f676e8a646d9caf41d1563d91cc\": container with ID starting with 625745afe2c70e7fbcbeb00971e7c15ae66a9f676e8a646d9caf41d1563d91cc not found: ID does not exist" Mar 18 12:17:32 crc kubenswrapper[4975]: I0318 12:17:32.107772 4975 scope.go:117] "RemoveContainer" containerID="b890c6e23112f402b556e7d927e2abe518eaf032d4e2de0a0fdbc4168240ce98" Mar 18 12:17:32 crc kubenswrapper[4975]: E0318 12:17:32.108194 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b890c6e23112f402b556e7d927e2abe518eaf032d4e2de0a0fdbc4168240ce98\": container with ID starting with b890c6e23112f402b556e7d927e2abe518eaf032d4e2de0a0fdbc4168240ce98 not found: ID does not exist" containerID="b890c6e23112f402b556e7d927e2abe518eaf032d4e2de0a0fdbc4168240ce98" Mar 18 12:17:32 crc kubenswrapper[4975]: I0318 12:17:32.108232 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b890c6e23112f402b556e7d927e2abe518eaf032d4e2de0a0fdbc4168240ce98"} err="failed to get container status \"b890c6e23112f402b556e7d927e2abe518eaf032d4e2de0a0fdbc4168240ce98\": rpc error: code = NotFound desc = could not find container \"b890c6e23112f402b556e7d927e2abe518eaf032d4e2de0a0fdbc4168240ce98\": container with ID 
starting with b890c6e23112f402b556e7d927e2abe518eaf032d4e2de0a0fdbc4168240ce98 not found: ID does not exist" Mar 18 12:17:32 crc kubenswrapper[4975]: I0318 12:17:32.108270 4975 scope.go:117] "RemoveContainer" containerID="bb8e2327d4e9a55195ce477ceb4b071a32bcdbf68f7f687fa160837f468b5e0f" Mar 18 12:17:32 crc kubenswrapper[4975]: E0318 12:17:32.108596 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb8e2327d4e9a55195ce477ceb4b071a32bcdbf68f7f687fa160837f468b5e0f\": container with ID starting with bb8e2327d4e9a55195ce477ceb4b071a32bcdbf68f7f687fa160837f468b5e0f not found: ID does not exist" containerID="bb8e2327d4e9a55195ce477ceb4b071a32bcdbf68f7f687fa160837f468b5e0f" Mar 18 12:17:32 crc kubenswrapper[4975]: I0318 12:17:32.108620 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb8e2327d4e9a55195ce477ceb4b071a32bcdbf68f7f687fa160837f468b5e0f"} err="failed to get container status \"bb8e2327d4e9a55195ce477ceb4b071a32bcdbf68f7f687fa160837f468b5e0f\": rpc error: code = NotFound desc = could not find container \"bb8e2327d4e9a55195ce477ceb4b071a32bcdbf68f7f687fa160837f468b5e0f\": container with ID starting with bb8e2327d4e9a55195ce477ceb4b071a32bcdbf68f7f687fa160837f468b5e0f not found: ID does not exist" Mar 18 12:17:33 crc kubenswrapper[4975]: I0318 12:17:33.023015 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46171d59-3549-4843-b6eb-07b9eecd2560" path="/var/lib/kubelet/pods/46171d59-3549-4843-b6eb-07b9eecd2560/volumes" Mar 18 12:17:47 crc kubenswrapper[4975]: I0318 12:17:47.561245 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hprxg"] Mar 18 12:17:47 crc kubenswrapper[4975]: I0318 12:17:47.562689 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hprxg" 
podUID="a7a76930-86ba-4055-85e0-6053832da1aa" containerName="registry-server" containerID="cri-o://9995283fb4e2f5f55b62e97adff14daa9e743b017358e07549a8bd2e6ab91a84" gracePeriod=30 Mar 18 12:17:47 crc kubenswrapper[4975]: I0318 12:17:47.578258 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xnfcw"] Mar 18 12:17:47 crc kubenswrapper[4975]: I0318 12:17:47.579105 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xnfcw" podUID="3b24c4ea-1b55-429c-97f5-376523ea1a52" containerName="registry-server" containerID="cri-o://78e09328c9f62714bfcb9b955585ff4ba30be3e8bed3618b80b08618937b9c3e" gracePeriod=30 Mar 18 12:17:47 crc kubenswrapper[4975]: I0318 12:17:47.588673 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q2bgt"] Mar 18 12:17:47 crc kubenswrapper[4975]: I0318 12:17:47.588907 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" podUID="d989095d-7ce2-4dd7-ac9e-5c747e900a61" containerName="marketplace-operator" containerID="cri-o://ef6a0860d7ed55f90458e047ef4d9682a5a26de6b578d07e92ea3e43e4506dd9" gracePeriod=30 Mar 18 12:17:47 crc kubenswrapper[4975]: I0318 12:17:47.603913 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-whr68"] Mar 18 12:17:47 crc kubenswrapper[4975]: I0318 12:17:47.604167 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-whr68" podUID="21b9dc77-7653-4684-ba67-cece256c42e2" containerName="registry-server" containerID="cri-o://5dbf64d104d282f2010bd4d5c1d6483884c084c09ed27b2cb7589c1b194fccc1" gracePeriod=30 Mar 18 12:17:47 crc kubenswrapper[4975]: I0318 12:17:47.608886 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-dswgh"] Mar 18 12:17:47 crc kubenswrapper[4975]: I0318 12:17:47.609306 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5s8vm"] Mar 18 12:17:47 crc kubenswrapper[4975]: I0318 12:17:47.609443 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dswgh" podUID="698cd02e-0279-4ae7-be21-bd479b2dfe49" containerName="registry-server" containerID="cri-o://1a6b486fc13cc47731c997aa3a78910242ce76adbc3add70cf76274cb96308d0" gracePeriod=30 Mar 18 12:17:47 crc kubenswrapper[4975]: E0318 12:17:47.609497 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46171d59-3549-4843-b6eb-07b9eecd2560" containerName="extract-content" Mar 18 12:17:47 crc kubenswrapper[4975]: I0318 12:17:47.609612 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="46171d59-3549-4843-b6eb-07b9eecd2560" containerName="extract-content" Mar 18 12:17:47 crc kubenswrapper[4975]: E0318 12:17:47.609622 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46171d59-3549-4843-b6eb-07b9eecd2560" containerName="registry-server" Mar 18 12:17:47 crc kubenswrapper[4975]: I0318 12:17:47.609628 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="46171d59-3549-4843-b6eb-07b9eecd2560" containerName="registry-server" Mar 18 12:17:47 crc kubenswrapper[4975]: E0318 12:17:47.609645 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46171d59-3549-4843-b6eb-07b9eecd2560" containerName="extract-utilities" Mar 18 12:17:47 crc kubenswrapper[4975]: I0318 12:17:47.609653 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="46171d59-3549-4843-b6eb-07b9eecd2560" containerName="extract-utilities" Mar 18 12:17:47 crc kubenswrapper[4975]: I0318 12:17:47.609741 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="46171d59-3549-4843-b6eb-07b9eecd2560" 
containerName="registry-server" Mar 18 12:17:47 crc kubenswrapper[4975]: I0318 12:17:47.610217 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5s8vm" Mar 18 12:17:47 crc kubenswrapper[4975]: I0318 12:17:47.630210 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5s8vm"] Mar 18 12:17:47 crc kubenswrapper[4975]: I0318 12:17:47.688663 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1a1dafc0-c705-4110-b5f5-d622a2097f64-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5s8vm\" (UID: \"1a1dafc0-c705-4110-b5f5-d622a2097f64\") " pod="openshift-marketplace/marketplace-operator-79b997595-5s8vm" Mar 18 12:17:47 crc kubenswrapper[4975]: I0318 12:17:47.688763 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1a1dafc0-c705-4110-b5f5-d622a2097f64-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5s8vm\" (UID: \"1a1dafc0-c705-4110-b5f5-d622a2097f64\") " pod="openshift-marketplace/marketplace-operator-79b997595-5s8vm" Mar 18 12:17:47 crc kubenswrapper[4975]: I0318 12:17:47.688786 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sxhx\" (UniqueName: \"kubernetes.io/projected/1a1dafc0-c705-4110-b5f5-d622a2097f64-kube-api-access-9sxhx\") pod \"marketplace-operator-79b997595-5s8vm\" (UID: \"1a1dafc0-c705-4110-b5f5-d622a2097f64\") " pod="openshift-marketplace/marketplace-operator-79b997595-5s8vm" Mar 18 12:17:47 crc kubenswrapper[4975]: I0318 12:17:47.789858 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/1a1dafc0-c705-4110-b5f5-d622a2097f64-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5s8vm\" (UID: \"1a1dafc0-c705-4110-b5f5-d622a2097f64\") " pod="openshift-marketplace/marketplace-operator-79b997595-5s8vm" Mar 18 12:17:47 crc kubenswrapper[4975]: I0318 12:17:47.790201 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sxhx\" (UniqueName: \"kubernetes.io/projected/1a1dafc0-c705-4110-b5f5-d622a2097f64-kube-api-access-9sxhx\") pod \"marketplace-operator-79b997595-5s8vm\" (UID: \"1a1dafc0-c705-4110-b5f5-d622a2097f64\") " pod="openshift-marketplace/marketplace-operator-79b997595-5s8vm" Mar 18 12:17:47 crc kubenswrapper[4975]: I0318 12:17:47.790337 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1a1dafc0-c705-4110-b5f5-d622a2097f64-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5s8vm\" (UID: \"1a1dafc0-c705-4110-b5f5-d622a2097f64\") " pod="openshift-marketplace/marketplace-operator-79b997595-5s8vm" Mar 18 12:17:47 crc kubenswrapper[4975]: I0318 12:17:47.791327 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1a1dafc0-c705-4110-b5f5-d622a2097f64-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5s8vm\" (UID: \"1a1dafc0-c705-4110-b5f5-d622a2097f64\") " pod="openshift-marketplace/marketplace-operator-79b997595-5s8vm" Mar 18 12:17:47 crc kubenswrapper[4975]: I0318 12:17:47.795677 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1a1dafc0-c705-4110-b5f5-d622a2097f64-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5s8vm\" (UID: \"1a1dafc0-c705-4110-b5f5-d622a2097f64\") " pod="openshift-marketplace/marketplace-operator-79b997595-5s8vm" Mar 18 12:17:47 crc 
kubenswrapper[4975]: I0318 12:17:47.807524 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sxhx\" (UniqueName: \"kubernetes.io/projected/1a1dafc0-c705-4110-b5f5-d622a2097f64-kube-api-access-9sxhx\") pod \"marketplace-operator-79b997595-5s8vm\" (UID: \"1a1dafc0-c705-4110-b5f5-d622a2097f64\") " pod="openshift-marketplace/marketplace-operator-79b997595-5s8vm" Mar 18 12:17:47 crc kubenswrapper[4975]: I0318 12:17:47.929704 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5s8vm" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.111772 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5s8vm"] Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.128886 4975 generic.go:334] "Generic (PLEG): container finished" podID="a7a76930-86ba-4055-85e0-6053832da1aa" containerID="9995283fb4e2f5f55b62e97adff14daa9e743b017358e07549a8bd2e6ab91a84" exitCode=0 Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.128961 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hprxg" event={"ID":"a7a76930-86ba-4055-85e0-6053832da1aa","Type":"ContainerDied","Data":"9995283fb4e2f5f55b62e97adff14daa9e743b017358e07549a8bd2e6ab91a84"} Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.130359 4975 generic.go:334] "Generic (PLEG): container finished" podID="d989095d-7ce2-4dd7-ac9e-5c747e900a61" containerID="ef6a0860d7ed55f90458e047ef4d9682a5a26de6b578d07e92ea3e43e4506dd9" exitCode=0 Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.130409 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" event={"ID":"d989095d-7ce2-4dd7-ac9e-5c747e900a61","Type":"ContainerDied","Data":"ef6a0860d7ed55f90458e047ef4d9682a5a26de6b578d07e92ea3e43e4506dd9"} Mar 18 12:17:48 crc kubenswrapper[4975]: 
I0318 12:17:48.130438 4975 scope.go:117] "RemoveContainer" containerID="3f2f31a4621d02dcf84343d3adb32877b55252d630db0a176c7cdfad689d74a1" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.133385 4975 generic.go:334] "Generic (PLEG): container finished" podID="698cd02e-0279-4ae7-be21-bd479b2dfe49" containerID="1a6b486fc13cc47731c997aa3a78910242ce76adbc3add70cf76274cb96308d0" exitCode=0 Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.133483 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dswgh" event={"ID":"698cd02e-0279-4ae7-be21-bd479b2dfe49","Type":"ContainerDied","Data":"1a6b486fc13cc47731c997aa3a78910242ce76adbc3add70cf76274cb96308d0"} Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.135465 4975 generic.go:334] "Generic (PLEG): container finished" podID="21b9dc77-7653-4684-ba67-cece256c42e2" containerID="5dbf64d104d282f2010bd4d5c1d6483884c084c09ed27b2cb7589c1b194fccc1" exitCode=0 Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.135513 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whr68" event={"ID":"21b9dc77-7653-4684-ba67-cece256c42e2","Type":"ContainerDied","Data":"5dbf64d104d282f2010bd4d5c1d6483884c084c09ed27b2cb7589c1b194fccc1"} Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.137230 4975 generic.go:334] "Generic (PLEG): container finished" podID="3b24c4ea-1b55-429c-97f5-376523ea1a52" containerID="78e09328c9f62714bfcb9b955585ff4ba30be3e8bed3618b80b08618937b9c3e" exitCode=0 Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.137254 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnfcw" event={"ID":"3b24c4ea-1b55-429c-97f5-376523ea1a52","Type":"ContainerDied","Data":"78e09328c9f62714bfcb9b955585ff4ba30be3e8bed3618b80b08618937b9c3e"} Mar 18 12:17:48 crc kubenswrapper[4975]: W0318 12:17:48.184231 4975 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a1dafc0_c705_4110_b5f5_d622a2097f64.slice/crio-1413d5a076f786ccac6c3122252a5b9cbd3f5652228b4d40347e9dcc6ae08fce WatchSource:0}: Error finding container 1413d5a076f786ccac6c3122252a5b9cbd3f5652228b4d40347e9dcc6ae08fce: Status 404 returned error can't find the container with id 1413d5a076f786ccac6c3122252a5b9cbd3f5652228b4d40347e9dcc6ae08fce Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.448339 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xnfcw" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.509463 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hprxg" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.603117 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b24c4ea-1b55-429c-97f5-376523ea1a52-utilities\") pod \"3b24c4ea-1b55-429c-97f5-376523ea1a52\" (UID: \"3b24c4ea-1b55-429c-97f5-376523ea1a52\") " Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.603254 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk82v\" (UniqueName: \"kubernetes.io/projected/a7a76930-86ba-4055-85e0-6053832da1aa-kube-api-access-bk82v\") pod \"a7a76930-86ba-4055-85e0-6053832da1aa\" (UID: \"a7a76930-86ba-4055-85e0-6053832da1aa\") " Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.603290 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sk2c\" (UniqueName: \"kubernetes.io/projected/3b24c4ea-1b55-429c-97f5-376523ea1a52-kube-api-access-6sk2c\") pod \"3b24c4ea-1b55-429c-97f5-376523ea1a52\" (UID: \"3b24c4ea-1b55-429c-97f5-376523ea1a52\") " Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.603312 4975 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7a76930-86ba-4055-85e0-6053832da1aa-catalog-content\") pod \"a7a76930-86ba-4055-85e0-6053832da1aa\" (UID: \"a7a76930-86ba-4055-85e0-6053832da1aa\") " Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.603330 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b24c4ea-1b55-429c-97f5-376523ea1a52-catalog-content\") pod \"3b24c4ea-1b55-429c-97f5-376523ea1a52\" (UID: \"3b24c4ea-1b55-429c-97f5-376523ea1a52\") " Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.603483 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7a76930-86ba-4055-85e0-6053832da1aa-utilities\") pod \"a7a76930-86ba-4055-85e0-6053832da1aa\" (UID: \"a7a76930-86ba-4055-85e0-6053832da1aa\") " Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.604137 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b24c4ea-1b55-429c-97f5-376523ea1a52-utilities" (OuterVolumeSpecName: "utilities") pod "3b24c4ea-1b55-429c-97f5-376523ea1a52" (UID: "3b24c4ea-1b55-429c-97f5-376523ea1a52"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.604791 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7a76930-86ba-4055-85e0-6053832da1aa-utilities" (OuterVolumeSpecName: "utilities") pod "a7a76930-86ba-4055-85e0-6053832da1aa" (UID: "a7a76930-86ba-4055-85e0-6053832da1aa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.610700 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7a76930-86ba-4055-85e0-6053832da1aa-kube-api-access-bk82v" (OuterVolumeSpecName: "kube-api-access-bk82v") pod "a7a76930-86ba-4055-85e0-6053832da1aa" (UID: "a7a76930-86ba-4055-85e0-6053832da1aa"). InnerVolumeSpecName "kube-api-access-bk82v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.611647 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.615138 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b24c4ea-1b55-429c-97f5-376523ea1a52-kube-api-access-6sk2c" (OuterVolumeSpecName: "kube-api-access-6sk2c") pod "3b24c4ea-1b55-429c-97f5-376523ea1a52" (UID: "3b24c4ea-1b55-429c-97f5-376523ea1a52"). InnerVolumeSpecName "kube-api-access-6sk2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.621382 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-whr68" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.638907 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dswgh" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.681034 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b24c4ea-1b55-429c-97f5-376523ea1a52-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b24c4ea-1b55-429c-97f5-376523ea1a52" (UID: "3b24c4ea-1b55-429c-97f5-376523ea1a52"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.682875 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7a76930-86ba-4055-85e0-6053832da1aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7a76930-86ba-4055-85e0-6053832da1aa" (UID: "a7a76930-86ba-4055-85e0-6053832da1aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.705172 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d989095d-7ce2-4dd7-ac9e-5c747e900a61-marketplace-operator-metrics\") pod \"d989095d-7ce2-4dd7-ac9e-5c747e900a61\" (UID: \"d989095d-7ce2-4dd7-ac9e-5c747e900a61\") " Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.705345 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21b9dc77-7653-4684-ba67-cece256c42e2-utilities\") pod \"21b9dc77-7653-4684-ba67-cece256c42e2\" (UID: \"21b9dc77-7653-4684-ba67-cece256c42e2\") " Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.705538 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d989095d-7ce2-4dd7-ac9e-5c747e900a61-marketplace-trusted-ca\") pod \"d989095d-7ce2-4dd7-ac9e-5c747e900a61\" (UID: \"d989095d-7ce2-4dd7-ac9e-5c747e900a61\") " Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.705856 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffcng\" (UniqueName: \"kubernetes.io/projected/d989095d-7ce2-4dd7-ac9e-5c747e900a61-kube-api-access-ffcng\") pod \"d989095d-7ce2-4dd7-ac9e-5c747e900a61\" (UID: \"d989095d-7ce2-4dd7-ac9e-5c747e900a61\") " Mar 18 12:17:48 crc 
kubenswrapper[4975]: I0318 12:17:48.705935 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21b9dc77-7653-4684-ba67-cece256c42e2-catalog-content\") pod \"21b9dc77-7653-4684-ba67-cece256c42e2\" (UID: \"21b9dc77-7653-4684-ba67-cece256c42e2\") " Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.705995 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hpnd\" (UniqueName: \"kubernetes.io/projected/21b9dc77-7653-4684-ba67-cece256c42e2-kube-api-access-8hpnd\") pod \"21b9dc77-7653-4684-ba67-cece256c42e2\" (UID: \"21b9dc77-7653-4684-ba67-cece256c42e2\") " Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.706511 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d989095d-7ce2-4dd7-ac9e-5c747e900a61-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "d989095d-7ce2-4dd7-ac9e-5c747e900a61" (UID: "d989095d-7ce2-4dd7-ac9e-5c747e900a61"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.706653 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21b9dc77-7653-4684-ba67-cece256c42e2-utilities" (OuterVolumeSpecName: "utilities") pod "21b9dc77-7653-4684-ba67-cece256c42e2" (UID: "21b9dc77-7653-4684-ba67-cece256c42e2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.707210 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7a76930-86ba-4055-85e0-6053832da1aa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.707230 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b24c4ea-1b55-429c-97f5-376523ea1a52-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.707242 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sk2c\" (UniqueName: \"kubernetes.io/projected/3b24c4ea-1b55-429c-97f5-376523ea1a52-kube-api-access-6sk2c\") on node \"crc\" DevicePath \"\"" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.707256 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7a76930-86ba-4055-85e0-6053832da1aa-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.707268 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21b9dc77-7653-4684-ba67-cece256c42e2-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.707278 4975 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d989095d-7ce2-4dd7-ac9e-5c747e900a61-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.707289 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b24c4ea-1b55-429c-97f5-376523ea1a52-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.707300 
4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk82v\" (UniqueName: \"kubernetes.io/projected/a7a76930-86ba-4055-85e0-6053832da1aa-kube-api-access-bk82v\") on node \"crc\" DevicePath \"\"" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.713518 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d989095d-7ce2-4dd7-ac9e-5c747e900a61-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "d989095d-7ce2-4dd7-ac9e-5c747e900a61" (UID: "d989095d-7ce2-4dd7-ac9e-5c747e900a61"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.713814 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d989095d-7ce2-4dd7-ac9e-5c747e900a61-kube-api-access-ffcng" (OuterVolumeSpecName: "kube-api-access-ffcng") pod "d989095d-7ce2-4dd7-ac9e-5c747e900a61" (UID: "d989095d-7ce2-4dd7-ac9e-5c747e900a61"). InnerVolumeSpecName "kube-api-access-ffcng". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.715070 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21b9dc77-7653-4684-ba67-cece256c42e2-kube-api-access-8hpnd" (OuterVolumeSpecName: "kube-api-access-8hpnd") pod "21b9dc77-7653-4684-ba67-cece256c42e2" (UID: "21b9dc77-7653-4684-ba67-cece256c42e2"). InnerVolumeSpecName "kube-api-access-8hpnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.752931 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21b9dc77-7653-4684-ba67-cece256c42e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21b9dc77-7653-4684-ba67-cece256c42e2" (UID: "21b9dc77-7653-4684-ba67-cece256c42e2"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.808562 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/698cd02e-0279-4ae7-be21-bd479b2dfe49-catalog-content\") pod \"698cd02e-0279-4ae7-be21-bd479b2dfe49\" (UID: \"698cd02e-0279-4ae7-be21-bd479b2dfe49\") " Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.808630 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db4fl\" (UniqueName: \"kubernetes.io/projected/698cd02e-0279-4ae7-be21-bd479b2dfe49-kube-api-access-db4fl\") pod \"698cd02e-0279-4ae7-be21-bd479b2dfe49\" (UID: \"698cd02e-0279-4ae7-be21-bd479b2dfe49\") " Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.808657 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/698cd02e-0279-4ae7-be21-bd479b2dfe49-utilities\") pod \"698cd02e-0279-4ae7-be21-bd479b2dfe49\" (UID: \"698cd02e-0279-4ae7-be21-bd479b2dfe49\") " Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.808996 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffcng\" (UniqueName: \"kubernetes.io/projected/d989095d-7ce2-4dd7-ac9e-5c747e900a61-kube-api-access-ffcng\") on node \"crc\" DevicePath \"\"" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.809023 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21b9dc77-7653-4684-ba67-cece256c42e2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.809035 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hpnd\" (UniqueName: \"kubernetes.io/projected/21b9dc77-7653-4684-ba67-cece256c42e2-kube-api-access-8hpnd\") on node \"crc\" DevicePath \"\"" 
Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.809048 4975 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d989095d-7ce2-4dd7-ac9e-5c747e900a61-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.809551 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/698cd02e-0279-4ae7-be21-bd479b2dfe49-utilities" (OuterVolumeSpecName: "utilities") pod "698cd02e-0279-4ae7-be21-bd479b2dfe49" (UID: "698cd02e-0279-4ae7-be21-bd479b2dfe49"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.814010 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/698cd02e-0279-4ae7-be21-bd479b2dfe49-kube-api-access-db4fl" (OuterVolumeSpecName: "kube-api-access-db4fl") pod "698cd02e-0279-4ae7-be21-bd479b2dfe49" (UID: "698cd02e-0279-4ae7-be21-bd479b2dfe49"). InnerVolumeSpecName "kube-api-access-db4fl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.910194 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db4fl\" (UniqueName: \"kubernetes.io/projected/698cd02e-0279-4ae7-be21-bd479b2dfe49-kube-api-access-db4fl\") on node \"crc\" DevicePath \"\"" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.910234 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/698cd02e-0279-4ae7-be21-bd479b2dfe49-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:17:48 crc kubenswrapper[4975]: I0318 12:17:48.923733 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/698cd02e-0279-4ae7-be21-bd479b2dfe49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "698cd02e-0279-4ae7-be21-bd479b2dfe49" (UID: "698cd02e-0279-4ae7-be21-bd479b2dfe49"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.011261 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/698cd02e-0279-4ae7-be21-bd479b2dfe49-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.143905 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whr68" event={"ID":"21b9dc77-7653-4684-ba67-cece256c42e2","Type":"ContainerDied","Data":"c162369492277f2e0c64f8c6d6ef95fa0e4913e4286ac519e8b34dde287e44a0"} Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.143964 4975 scope.go:117] "RemoveContainer" containerID="5dbf64d104d282f2010bd4d5c1d6483884c084c09ed27b2cb7589c1b194fccc1" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.143973 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-whr68" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.146288 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnfcw" event={"ID":"3b24c4ea-1b55-429c-97f5-376523ea1a52","Type":"ContainerDied","Data":"42130e79785555e3233872cfc95b694c4787bcc19e031079f15b0f4f9835a4be"} Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.146404 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xnfcw" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.148886 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5s8vm" event={"ID":"1a1dafc0-c705-4110-b5f5-d622a2097f64","Type":"ContainerStarted","Data":"fb370057f93d81ed38d884ace9e48bf6f1bfdb66d8924ad0ea7ed113174c6c5d"} Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.148919 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5s8vm" event={"ID":"1a1dafc0-c705-4110-b5f5-d622a2097f64","Type":"ContainerStarted","Data":"1413d5a076f786ccac6c3122252a5b9cbd3f5652228b4d40347e9dcc6ae08fce"} Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.149200 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-5s8vm" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.150969 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hprxg" event={"ID":"a7a76930-86ba-4055-85e0-6053832da1aa","Type":"ContainerDied","Data":"5da695589575d82110ad0e83ba18f6b3798510831f8a300f3ba9184cab96f7c6"} Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.151058 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hprxg" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.153657 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-5s8vm" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.155259 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" event={"ID":"d989095d-7ce2-4dd7-ac9e-5c747e900a61","Type":"ContainerDied","Data":"4d91673f2376af26ae7a909db29444498626cda1fe6e2147e7659906725c9f9a"} Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.155385 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-q2bgt" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.158293 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dswgh" event={"ID":"698cd02e-0279-4ae7-be21-bd479b2dfe49","Type":"ContainerDied","Data":"c37625ee2b3a332206d1d947dfd1158396412d520d3a9e7b4ed769228168e99a"} Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.158333 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dswgh" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.161595 4975 scope.go:117] "RemoveContainer" containerID="16a790b0f67e55d709c51fbff50242a8113fadf1d483388a139c51971f249423" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.171553 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-5s8vm" podStartSLOduration=2.171534834 podStartE2EDuration="2.171534834s" podCreationTimestamp="2026-03-18 12:17:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:17:49.168944145 +0000 UTC m=+454.883344744" watchObservedRunningTime="2026-03-18 12:17:49.171534834 +0000 UTC m=+454.885935413" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.184016 4975 scope.go:117] "RemoveContainer" containerID="cbcc2ffbdc5b00929d64ffab41e848866ccf45f9c53a109590a05be3aacb052e" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.187983 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-whr68"] Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.193281 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-whr68"] Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.200661 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hprxg"] Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.204067 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hprxg"] Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.205246 4975 scope.go:117] "RemoveContainer" containerID="78e09328c9f62714bfcb9b955585ff4ba30be3e8bed3618b80b08618937b9c3e" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.212144 4975 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/community-operators-xnfcw"] Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.219378 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xnfcw"] Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.223348 4975 scope.go:117] "RemoveContainer" containerID="70a24895c4068290f45c505e21a1f9e819b9750e1a3cd6c3b5f21979538a89b8" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.242923 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q2bgt"] Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.251952 4975 scope.go:117] "RemoveContainer" containerID="2225529e0d8c02842364d0752e7aed1b13ebb30a58e16aa9d261b637a2fa1aae" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.253257 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q2bgt"] Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.258565 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dswgh"] Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.266308 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dswgh"] Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.272478 4975 scope.go:117] "RemoveContainer" containerID="9995283fb4e2f5f55b62e97adff14daa9e743b017358e07549a8bd2e6ab91a84" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.297067 4975 scope.go:117] "RemoveContainer" containerID="59c936ff2e20ac4a1736da8a313066426e7ebc240edcfc1d4f63bca6927f1cfd" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.310105 4975 scope.go:117] "RemoveContainer" containerID="6ea0e79bd7cb8286af22f95f08a33d7cde8ae8b2305782d45327129030874fbe" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.323289 4975 scope.go:117] "RemoveContainer" 
containerID="ef6a0860d7ed55f90458e047ef4d9682a5a26de6b578d07e92ea3e43e4506dd9" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.334280 4975 scope.go:117] "RemoveContainer" containerID="1a6b486fc13cc47731c997aa3a78910242ce76adbc3add70cf76274cb96308d0" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.347303 4975 scope.go:117] "RemoveContainer" containerID="65739e4ceb2b7b61050d376d6ee277b0f2692f1d7f769c118219e02df86c2127" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.361180 4975 scope.go:117] "RemoveContainer" containerID="66882b365489661784acf85a1b572a55c82cb741d29067dc33e6e382120722a8" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.776376 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bcsvd"] Mar 18 12:17:49 crc kubenswrapper[4975]: E0318 12:17:49.776619 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7a76930-86ba-4055-85e0-6053832da1aa" containerName="extract-utilities" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.776634 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7a76930-86ba-4055-85e0-6053832da1aa" containerName="extract-utilities" Mar 18 12:17:49 crc kubenswrapper[4975]: E0318 12:17:49.776651 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="698cd02e-0279-4ae7-be21-bd479b2dfe49" containerName="registry-server" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.776658 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="698cd02e-0279-4ae7-be21-bd479b2dfe49" containerName="registry-server" Mar 18 12:17:49 crc kubenswrapper[4975]: E0318 12:17:49.776667 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="698cd02e-0279-4ae7-be21-bd479b2dfe49" containerName="extract-utilities" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.776674 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="698cd02e-0279-4ae7-be21-bd479b2dfe49" containerName="extract-utilities" Mar 18 
12:17:49 crc kubenswrapper[4975]: E0318 12:17:49.776685 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b24c4ea-1b55-429c-97f5-376523ea1a52" containerName="extract-content" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.776691 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b24c4ea-1b55-429c-97f5-376523ea1a52" containerName="extract-content" Mar 18 12:17:49 crc kubenswrapper[4975]: E0318 12:17:49.776700 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7a76930-86ba-4055-85e0-6053832da1aa" containerName="extract-content" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.776707 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7a76930-86ba-4055-85e0-6053832da1aa" containerName="extract-content" Mar 18 12:17:49 crc kubenswrapper[4975]: E0318 12:17:49.776718 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d989095d-7ce2-4dd7-ac9e-5c747e900a61" containerName="marketplace-operator" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.776727 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="d989095d-7ce2-4dd7-ac9e-5c747e900a61" containerName="marketplace-operator" Mar 18 12:17:49 crc kubenswrapper[4975]: E0318 12:17:49.776737 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b24c4ea-1b55-429c-97f5-376523ea1a52" containerName="extract-utilities" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.776744 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b24c4ea-1b55-429c-97f5-376523ea1a52" containerName="extract-utilities" Mar 18 12:17:49 crc kubenswrapper[4975]: E0318 12:17:49.776757 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d989095d-7ce2-4dd7-ac9e-5c747e900a61" containerName="marketplace-operator" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.776764 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="d989095d-7ce2-4dd7-ac9e-5c747e900a61" 
containerName="marketplace-operator" Mar 18 12:17:49 crc kubenswrapper[4975]: E0318 12:17:49.776773 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b9dc77-7653-4684-ba67-cece256c42e2" containerName="registry-server" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.776781 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b9dc77-7653-4684-ba67-cece256c42e2" containerName="registry-server" Mar 18 12:17:49 crc kubenswrapper[4975]: E0318 12:17:49.776793 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7a76930-86ba-4055-85e0-6053832da1aa" containerName="registry-server" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.776800 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7a76930-86ba-4055-85e0-6053832da1aa" containerName="registry-server" Mar 18 12:17:49 crc kubenswrapper[4975]: E0318 12:17:49.776810 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b9dc77-7653-4684-ba67-cece256c42e2" containerName="extract-content" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.776817 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b9dc77-7653-4684-ba67-cece256c42e2" containerName="extract-content" Mar 18 12:17:49 crc kubenswrapper[4975]: E0318 12:17:49.776828 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="698cd02e-0279-4ae7-be21-bd479b2dfe49" containerName="extract-content" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.776837 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="698cd02e-0279-4ae7-be21-bd479b2dfe49" containerName="extract-content" Mar 18 12:17:49 crc kubenswrapper[4975]: E0318 12:17:49.776849 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b9dc77-7653-4684-ba67-cece256c42e2" containerName="extract-utilities" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.776857 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b9dc77-7653-4684-ba67-cece256c42e2" 
containerName="extract-utilities" Mar 18 12:17:49 crc kubenswrapper[4975]: E0318 12:17:49.776885 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b24c4ea-1b55-429c-97f5-376523ea1a52" containerName="registry-server" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.776895 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b24c4ea-1b55-429c-97f5-376523ea1a52" containerName="registry-server" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.777003 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="d989095d-7ce2-4dd7-ac9e-5c747e900a61" containerName="marketplace-operator" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.777016 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="698cd02e-0279-4ae7-be21-bd479b2dfe49" containerName="registry-server" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.777027 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b24c4ea-1b55-429c-97f5-376523ea1a52" containerName="registry-server" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.777045 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7a76930-86ba-4055-85e0-6053832da1aa" containerName="registry-server" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.777054 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="21b9dc77-7653-4684-ba67-cece256c42e2" containerName="registry-server" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.777222 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="d989095d-7ce2-4dd7-ac9e-5c747e900a61" containerName="marketplace-operator" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.777820 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bcsvd" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.780633 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.791493 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bcsvd"] Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.921087 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs6fg\" (UniqueName: \"kubernetes.io/projected/c58752ca-f22d-49b1-ac3c-f3cafb7c26e0-kube-api-access-qs6fg\") pod \"certified-operators-bcsvd\" (UID: \"c58752ca-f22d-49b1-ac3c-f3cafb7c26e0\") " pod="openshift-marketplace/certified-operators-bcsvd" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.921353 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c58752ca-f22d-49b1-ac3c-f3cafb7c26e0-utilities\") pod \"certified-operators-bcsvd\" (UID: \"c58752ca-f22d-49b1-ac3c-f3cafb7c26e0\") " pod="openshift-marketplace/certified-operators-bcsvd" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.921434 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c58752ca-f22d-49b1-ac3c-f3cafb7c26e0-catalog-content\") pod \"certified-operators-bcsvd\" (UID: \"c58752ca-f22d-49b1-ac3c-f3cafb7c26e0\") " pod="openshift-marketplace/certified-operators-bcsvd" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.973661 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l8xlg"] Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.974766 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8xlg" Mar 18 12:17:49 crc kubenswrapper[4975]: I0318 12:17:49.976723 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 12:17:50 crc kubenswrapper[4975]: I0318 12:17:50.023321 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs6fg\" (UniqueName: \"kubernetes.io/projected/c58752ca-f22d-49b1-ac3c-f3cafb7c26e0-kube-api-access-qs6fg\") pod \"certified-operators-bcsvd\" (UID: \"c58752ca-f22d-49b1-ac3c-f3cafb7c26e0\") " pod="openshift-marketplace/certified-operators-bcsvd" Mar 18 12:17:50 crc kubenswrapper[4975]: I0318 12:17:50.023380 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c58752ca-f22d-49b1-ac3c-f3cafb7c26e0-utilities\") pod \"certified-operators-bcsvd\" (UID: \"c58752ca-f22d-49b1-ac3c-f3cafb7c26e0\") " pod="openshift-marketplace/certified-operators-bcsvd" Mar 18 12:17:50 crc kubenswrapper[4975]: I0318 12:17:50.023408 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c58752ca-f22d-49b1-ac3c-f3cafb7c26e0-catalog-content\") pod \"certified-operators-bcsvd\" (UID: \"c58752ca-f22d-49b1-ac3c-f3cafb7c26e0\") " pod="openshift-marketplace/certified-operators-bcsvd" Mar 18 12:17:50 crc kubenswrapper[4975]: I0318 12:17:50.024003 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c58752ca-f22d-49b1-ac3c-f3cafb7c26e0-catalog-content\") pod \"certified-operators-bcsvd\" (UID: \"c58752ca-f22d-49b1-ac3c-f3cafb7c26e0\") " pod="openshift-marketplace/certified-operators-bcsvd" Mar 18 12:17:50 crc kubenswrapper[4975]: I0318 12:17:50.024729 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/c58752ca-f22d-49b1-ac3c-f3cafb7c26e0-utilities\") pod \"certified-operators-bcsvd\" (UID: \"c58752ca-f22d-49b1-ac3c-f3cafb7c26e0\") " pod="openshift-marketplace/certified-operators-bcsvd" Mar 18 12:17:50 crc kubenswrapper[4975]: I0318 12:17:50.036823 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8xlg"] Mar 18 12:17:50 crc kubenswrapper[4975]: I0318 12:17:50.048804 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs6fg\" (UniqueName: \"kubernetes.io/projected/c58752ca-f22d-49b1-ac3c-f3cafb7c26e0-kube-api-access-qs6fg\") pod \"certified-operators-bcsvd\" (UID: \"c58752ca-f22d-49b1-ac3c-f3cafb7c26e0\") " pod="openshift-marketplace/certified-operators-bcsvd" Mar 18 12:17:50 crc kubenswrapper[4975]: I0318 12:17:50.081901 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-4c2zb" Mar 18 12:17:50 crc kubenswrapper[4975]: I0318 12:17:50.097017 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bcsvd" Mar 18 12:17:50 crc kubenswrapper[4975]: I0318 12:17:50.141748 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e448c251-6293-491b-8d15-cd0ebd53d468-catalog-content\") pod \"redhat-marketplace-l8xlg\" (UID: \"e448c251-6293-491b-8d15-cd0ebd53d468\") " pod="openshift-marketplace/redhat-marketplace-l8xlg" Mar 18 12:17:50 crc kubenswrapper[4975]: I0318 12:17:50.141877 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e448c251-6293-491b-8d15-cd0ebd53d468-utilities\") pod \"redhat-marketplace-l8xlg\" (UID: \"e448c251-6293-491b-8d15-cd0ebd53d468\") " pod="openshift-marketplace/redhat-marketplace-l8xlg" Mar 18 12:17:50 crc kubenswrapper[4975]: I0318 12:17:50.141914 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k2zp\" (UniqueName: \"kubernetes.io/projected/e448c251-6293-491b-8d15-cd0ebd53d468-kube-api-access-5k2zp\") pod \"redhat-marketplace-l8xlg\" (UID: \"e448c251-6293-491b-8d15-cd0ebd53d468\") " pod="openshift-marketplace/redhat-marketplace-l8xlg" Mar 18 12:17:50 crc kubenswrapper[4975]: I0318 12:17:50.146955 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8nsht"] Mar 18 12:17:50 crc kubenswrapper[4975]: I0318 12:17:50.243262 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e448c251-6293-491b-8d15-cd0ebd53d468-utilities\") pod \"redhat-marketplace-l8xlg\" (UID: \"e448c251-6293-491b-8d15-cd0ebd53d468\") " pod="openshift-marketplace/redhat-marketplace-l8xlg" Mar 18 12:17:50 crc kubenswrapper[4975]: I0318 12:17:50.243302 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5k2zp\" (UniqueName: \"kubernetes.io/projected/e448c251-6293-491b-8d15-cd0ebd53d468-kube-api-access-5k2zp\") pod \"redhat-marketplace-l8xlg\" (UID: \"e448c251-6293-491b-8d15-cd0ebd53d468\") " pod="openshift-marketplace/redhat-marketplace-l8xlg" Mar 18 12:17:50 crc kubenswrapper[4975]: I0318 12:17:50.243348 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e448c251-6293-491b-8d15-cd0ebd53d468-catalog-content\") pod \"redhat-marketplace-l8xlg\" (UID: \"e448c251-6293-491b-8d15-cd0ebd53d468\") " pod="openshift-marketplace/redhat-marketplace-l8xlg" Mar 18 12:17:50 crc kubenswrapper[4975]: I0318 12:17:50.245192 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e448c251-6293-491b-8d15-cd0ebd53d468-catalog-content\") pod \"redhat-marketplace-l8xlg\" (UID: \"e448c251-6293-491b-8d15-cd0ebd53d468\") " pod="openshift-marketplace/redhat-marketplace-l8xlg" Mar 18 12:17:50 crc kubenswrapper[4975]: I0318 12:17:50.245202 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e448c251-6293-491b-8d15-cd0ebd53d468-utilities\") pod \"redhat-marketplace-l8xlg\" (UID: \"e448c251-6293-491b-8d15-cd0ebd53d468\") " pod="openshift-marketplace/redhat-marketplace-l8xlg" Mar 18 12:17:50 crc kubenswrapper[4975]: I0318 12:17:50.267955 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k2zp\" (UniqueName: \"kubernetes.io/projected/e448c251-6293-491b-8d15-cd0ebd53d468-kube-api-access-5k2zp\") pod \"redhat-marketplace-l8xlg\" (UID: \"e448c251-6293-491b-8d15-cd0ebd53d468\") " pod="openshift-marketplace/redhat-marketplace-l8xlg" Mar 18 12:17:50 crc kubenswrapper[4975]: I0318 12:17:50.289436 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8xlg" Mar 18 12:17:50 crc kubenswrapper[4975]: I0318 12:17:50.336505 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bcsvd"] Mar 18 12:17:50 crc kubenswrapper[4975]: I0318 12:17:50.456094 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8xlg"] Mar 18 12:17:50 crc kubenswrapper[4975]: W0318 12:17:50.460041 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode448c251_6293_491b_8d15_cd0ebd53d468.slice/crio-395f3992c718bd9c0bfe4afcd3d7d2ec8755f920e970599f8fd60f9aacfc7ffa WatchSource:0}: Error finding container 395f3992c718bd9c0bfe4afcd3d7d2ec8755f920e970599f8fd60f9aacfc7ffa: Status 404 returned error can't find the container with id 395f3992c718bd9c0bfe4afcd3d7d2ec8755f920e970599f8fd60f9aacfc7ffa Mar 18 12:17:51 crc kubenswrapper[4975]: I0318 12:17:51.023754 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21b9dc77-7653-4684-ba67-cece256c42e2" path="/var/lib/kubelet/pods/21b9dc77-7653-4684-ba67-cece256c42e2/volumes" Mar 18 12:17:51 crc kubenswrapper[4975]: I0318 12:17:51.024551 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b24c4ea-1b55-429c-97f5-376523ea1a52" path="/var/lib/kubelet/pods/3b24c4ea-1b55-429c-97f5-376523ea1a52/volumes" Mar 18 12:17:51 crc kubenswrapper[4975]: I0318 12:17:51.025107 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="698cd02e-0279-4ae7-be21-bd479b2dfe49" path="/var/lib/kubelet/pods/698cd02e-0279-4ae7-be21-bd479b2dfe49/volumes" Mar 18 12:17:51 crc kubenswrapper[4975]: I0318 12:17:51.026147 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7a76930-86ba-4055-85e0-6053832da1aa" path="/var/lib/kubelet/pods/a7a76930-86ba-4055-85e0-6053832da1aa/volumes" Mar 18 12:17:51 crc 
kubenswrapper[4975]: I0318 12:17:51.026710 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d989095d-7ce2-4dd7-ac9e-5c747e900a61" path="/var/lib/kubelet/pods/d989095d-7ce2-4dd7-ac9e-5c747e900a61/volumes" Mar 18 12:17:51 crc kubenswrapper[4975]: I0318 12:17:51.195291 4975 generic.go:334] "Generic (PLEG): container finished" podID="c58752ca-f22d-49b1-ac3c-f3cafb7c26e0" containerID="fd3641e06811303cdd0390e6cef3f5b4995dab60f3684711dad9174379f0699b" exitCode=0 Mar 18 12:17:51 crc kubenswrapper[4975]: I0318 12:17:51.195368 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bcsvd" event={"ID":"c58752ca-f22d-49b1-ac3c-f3cafb7c26e0","Type":"ContainerDied","Data":"fd3641e06811303cdd0390e6cef3f5b4995dab60f3684711dad9174379f0699b"} Mar 18 12:17:51 crc kubenswrapper[4975]: I0318 12:17:51.195400 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bcsvd" event={"ID":"c58752ca-f22d-49b1-ac3c-f3cafb7c26e0","Type":"ContainerStarted","Data":"47b46c121f69f40139611429732cc5a9b1722c0f7727e1bbc82a3f1c3eac890b"} Mar 18 12:17:51 crc kubenswrapper[4975]: I0318 12:17:51.196820 4975 generic.go:334] "Generic (PLEG): container finished" podID="e448c251-6293-491b-8d15-cd0ebd53d468" containerID="deef84fb0d98dfaa8e0f96410e398fd60d6f5b49cb5e25f358d5a66c7377cd79" exitCode=0 Mar 18 12:17:51 crc kubenswrapper[4975]: I0318 12:17:51.196954 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8xlg" event={"ID":"e448c251-6293-491b-8d15-cd0ebd53d468","Type":"ContainerDied","Data":"deef84fb0d98dfaa8e0f96410e398fd60d6f5b49cb5e25f358d5a66c7377cd79"} Mar 18 12:17:51 crc kubenswrapper[4975]: I0318 12:17:51.196978 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8xlg" 
event={"ID":"e448c251-6293-491b-8d15-cd0ebd53d468","Type":"ContainerStarted","Data":"395f3992c718bd9c0bfe4afcd3d7d2ec8755f920e970599f8fd60f9aacfc7ffa"} Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.182709 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ttvcz"] Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.185536 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ttvcz" Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.187200 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ttvcz"] Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.188009 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.273986 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65-catalog-content\") pod \"community-operators-ttvcz\" (UID: \"b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65\") " pod="openshift-marketplace/community-operators-ttvcz" Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.274038 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65-utilities\") pod \"community-operators-ttvcz\" (UID: \"b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65\") " pod="openshift-marketplace/community-operators-ttvcz" Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.274183 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr2kh\" (UniqueName: \"kubernetes.io/projected/b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65-kube-api-access-tr2kh\") pod 
\"community-operators-ttvcz\" (UID: \"b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65\") " pod="openshift-marketplace/community-operators-ttvcz" Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.372840 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rnjqq"] Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.374414 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rnjqq" Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.375265 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr2kh\" (UniqueName: \"kubernetes.io/projected/b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65-kube-api-access-tr2kh\") pod \"community-operators-ttvcz\" (UID: \"b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65\") " pod="openshift-marketplace/community-operators-ttvcz" Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.375327 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65-catalog-content\") pod \"community-operators-ttvcz\" (UID: \"b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65\") " pod="openshift-marketplace/community-operators-ttvcz" Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.375347 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65-utilities\") pod \"community-operators-ttvcz\" (UID: \"b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65\") " pod="openshift-marketplace/community-operators-ttvcz" Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.375723 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65-utilities\") pod \"community-operators-ttvcz\" (UID: 
\"b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65\") " pod="openshift-marketplace/community-operators-ttvcz" Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.375795 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65-catalog-content\") pod \"community-operators-ttvcz\" (UID: \"b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65\") " pod="openshift-marketplace/community-operators-ttvcz" Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.377750 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.382114 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rnjqq"] Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.398131 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr2kh\" (UniqueName: \"kubernetes.io/projected/b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65-kube-api-access-tr2kh\") pod \"community-operators-ttvcz\" (UID: \"b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65\") " pod="openshift-marketplace/community-operators-ttvcz" Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.476558 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f56e42a-99c9-4b38-bcad-bd1139570888-catalog-content\") pod \"redhat-operators-rnjqq\" (UID: \"1f56e42a-99c9-4b38-bcad-bd1139570888\") " pod="openshift-marketplace/redhat-operators-rnjqq" Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.476640 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f56e42a-99c9-4b38-bcad-bd1139570888-utilities\") pod \"redhat-operators-rnjqq\" (UID: 
\"1f56e42a-99c9-4b38-bcad-bd1139570888\") " pod="openshift-marketplace/redhat-operators-rnjqq" Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.476681 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh95b\" (UniqueName: \"kubernetes.io/projected/1f56e42a-99c9-4b38-bcad-bd1139570888-kube-api-access-lh95b\") pod \"redhat-operators-rnjqq\" (UID: \"1f56e42a-99c9-4b38-bcad-bd1139570888\") " pod="openshift-marketplace/redhat-operators-rnjqq" Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.503980 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ttvcz" Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.578215 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh95b\" (UniqueName: \"kubernetes.io/projected/1f56e42a-99c9-4b38-bcad-bd1139570888-kube-api-access-lh95b\") pod \"redhat-operators-rnjqq\" (UID: \"1f56e42a-99c9-4b38-bcad-bd1139570888\") " pod="openshift-marketplace/redhat-operators-rnjqq" Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.578497 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f56e42a-99c9-4b38-bcad-bd1139570888-catalog-content\") pod \"redhat-operators-rnjqq\" (UID: \"1f56e42a-99c9-4b38-bcad-bd1139570888\") " pod="openshift-marketplace/redhat-operators-rnjqq" Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.578541 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f56e42a-99c9-4b38-bcad-bd1139570888-utilities\") pod \"redhat-operators-rnjqq\" (UID: \"1f56e42a-99c9-4b38-bcad-bd1139570888\") " pod="openshift-marketplace/redhat-operators-rnjqq" Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.579073 4975 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f56e42a-99c9-4b38-bcad-bd1139570888-utilities\") pod \"redhat-operators-rnjqq\" (UID: \"1f56e42a-99c9-4b38-bcad-bd1139570888\") " pod="openshift-marketplace/redhat-operators-rnjqq" Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.579146 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f56e42a-99c9-4b38-bcad-bd1139570888-catalog-content\") pod \"redhat-operators-rnjqq\" (UID: \"1f56e42a-99c9-4b38-bcad-bd1139570888\") " pod="openshift-marketplace/redhat-operators-rnjqq" Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.600855 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh95b\" (UniqueName: \"kubernetes.io/projected/1f56e42a-99c9-4b38-bcad-bd1139570888-kube-api-access-lh95b\") pod \"redhat-operators-rnjqq\" (UID: \"1f56e42a-99c9-4b38-bcad-bd1139570888\") " pod="openshift-marketplace/redhat-operators-rnjqq" Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.730624 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rnjqq" Mar 18 12:17:52 crc kubenswrapper[4975]: I0318 12:17:52.905219 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ttvcz"] Mar 18 12:17:53 crc kubenswrapper[4975]: I0318 12:17:53.117715 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rnjqq"] Mar 18 12:17:53 crc kubenswrapper[4975]: W0318 12:17:53.123620 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f56e42a_99c9_4b38_bcad_bd1139570888.slice/crio-3015ce1fb0193ab0805ac6776d0ed6a7b7902116a8c206e086627981fc116943 WatchSource:0}: Error finding container 3015ce1fb0193ab0805ac6776d0ed6a7b7902116a8c206e086627981fc116943: Status 404 returned error can't find the container with id 3015ce1fb0193ab0805ac6776d0ed6a7b7902116a8c206e086627981fc116943 Mar 18 12:17:53 crc kubenswrapper[4975]: I0318 12:17:53.215132 4975 generic.go:334] "Generic (PLEG): container finished" podID="b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65" containerID="3a44c2b05e8f2ebb3c757aa333b21ca49dac8b5da42118e5d3b8a376d4c67891" exitCode=0 Mar 18 12:17:53 crc kubenswrapper[4975]: I0318 12:17:53.215246 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ttvcz" event={"ID":"b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65","Type":"ContainerDied","Data":"3a44c2b05e8f2ebb3c757aa333b21ca49dac8b5da42118e5d3b8a376d4c67891"} Mar 18 12:17:53 crc kubenswrapper[4975]: I0318 12:17:53.215296 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ttvcz" event={"ID":"b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65","Type":"ContainerStarted","Data":"d6b5c9b5b18f44187ea9e3787c169923b8c0ad31154cfad5648029a02d37e319"} Mar 18 12:17:53 crc kubenswrapper[4975]: I0318 12:17:53.222470 4975 generic.go:334] "Generic (PLEG): container finished" 
podID="c58752ca-f22d-49b1-ac3c-f3cafb7c26e0" containerID="088c4f7e10aaec7ccb8163fb7493aa3fd83104ff4fff4aa01428e066c7622c3c" exitCode=0 Mar 18 12:17:53 crc kubenswrapper[4975]: I0318 12:17:53.222560 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bcsvd" event={"ID":"c58752ca-f22d-49b1-ac3c-f3cafb7c26e0","Type":"ContainerDied","Data":"088c4f7e10aaec7ccb8163fb7493aa3fd83104ff4fff4aa01428e066c7622c3c"} Mar 18 12:17:53 crc kubenswrapper[4975]: I0318 12:17:53.225811 4975 generic.go:334] "Generic (PLEG): container finished" podID="e448c251-6293-491b-8d15-cd0ebd53d468" containerID="3bb1f623e1118b2a364ba93c9e82762a10aa216df9ae6daf878f6c1320b0d2cb" exitCode=0 Mar 18 12:17:53 crc kubenswrapper[4975]: I0318 12:17:53.225894 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8xlg" event={"ID":"e448c251-6293-491b-8d15-cd0ebd53d468","Type":"ContainerDied","Data":"3bb1f623e1118b2a364ba93c9e82762a10aa216df9ae6daf878f6c1320b0d2cb"} Mar 18 12:17:53 crc kubenswrapper[4975]: I0318 12:17:53.227251 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnjqq" event={"ID":"1f56e42a-99c9-4b38-bcad-bd1139570888","Type":"ContainerStarted","Data":"3015ce1fb0193ab0805ac6776d0ed6a7b7902116a8c206e086627981fc116943"} Mar 18 12:17:54 crc kubenswrapper[4975]: I0318 12:17:54.233751 4975 generic.go:334] "Generic (PLEG): container finished" podID="1f56e42a-99c9-4b38-bcad-bd1139570888" containerID="a8d689ebd7535db10a15d0e47b04b333120982bfc44b03858fc683f4053ade20" exitCode=0 Mar 18 12:17:54 crc kubenswrapper[4975]: I0318 12:17:54.233858 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnjqq" event={"ID":"1f56e42a-99c9-4b38-bcad-bd1139570888","Type":"ContainerDied","Data":"a8d689ebd7535db10a15d0e47b04b333120982bfc44b03858fc683f4053ade20"} Mar 18 12:17:54 crc kubenswrapper[4975]: I0318 12:17:54.236417 4975 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ttvcz" event={"ID":"b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65","Type":"ContainerStarted","Data":"98c6db77ffc59e1a61c85dda71e9dc304ee69cb646f4eaa08b5700f2c8b141a2"} Mar 18 12:17:54 crc kubenswrapper[4975]: I0318 12:17:54.240217 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bcsvd" event={"ID":"c58752ca-f22d-49b1-ac3c-f3cafb7c26e0","Type":"ContainerStarted","Data":"95cbc3b62b79c08d1832b45d252b13917b5978d25d10f5dba6a1837eabad0c69"} Mar 18 12:17:54 crc kubenswrapper[4975]: I0318 12:17:54.242780 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8xlg" event={"ID":"e448c251-6293-491b-8d15-cd0ebd53d468","Type":"ContainerStarted","Data":"1cc1907425ff41427d676e8627c7bd2c67ff402d0f8975d399e4226aa41abba3"} Mar 18 12:17:54 crc kubenswrapper[4975]: I0318 12:17:54.285685 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bcsvd" podStartSLOduration=2.798282753 podStartE2EDuration="5.285662099s" podCreationTimestamp="2026-03-18 12:17:49 +0000 UTC" firstStartedPulling="2026-03-18 12:17:51.196636069 +0000 UTC m=+456.911036648" lastFinishedPulling="2026-03-18 12:17:53.684015415 +0000 UTC m=+459.398415994" observedRunningTime="2026-03-18 12:17:54.281816457 +0000 UTC m=+459.996217036" watchObservedRunningTime="2026-03-18 12:17:54.285662099 +0000 UTC m=+460.000062678" Mar 18 12:17:54 crc kubenswrapper[4975]: I0318 12:17:54.303722 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l8xlg" podStartSLOduration=2.884461198 podStartE2EDuration="5.30370126s" podCreationTimestamp="2026-03-18 12:17:49 +0000 UTC" firstStartedPulling="2026-03-18 12:17:51.198528819 +0000 UTC m=+456.912929388" lastFinishedPulling="2026-03-18 12:17:53.617768871 +0000 UTC m=+459.332169450" 
observedRunningTime="2026-03-18 12:17:54.300629728 +0000 UTC m=+460.015030307" watchObservedRunningTime="2026-03-18 12:17:54.30370126 +0000 UTC m=+460.018101839" Mar 18 12:17:55 crc kubenswrapper[4975]: I0318 12:17:55.248775 4975 generic.go:334] "Generic (PLEG): container finished" podID="b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65" containerID="98c6db77ffc59e1a61c85dda71e9dc304ee69cb646f4eaa08b5700f2c8b141a2" exitCode=0 Mar 18 12:17:55 crc kubenswrapper[4975]: I0318 12:17:55.248829 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ttvcz" event={"ID":"b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65","Type":"ContainerDied","Data":"98c6db77ffc59e1a61c85dda71e9dc304ee69cb646f4eaa08b5700f2c8b141a2"} Mar 18 12:17:55 crc kubenswrapper[4975]: I0318 12:17:55.253143 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnjqq" event={"ID":"1f56e42a-99c9-4b38-bcad-bd1139570888","Type":"ContainerStarted","Data":"dceebb43d8d2442d6c4c495b7fa1aeb552e141f91e5367749af4457c1de5bf0d"} Mar 18 12:17:55 crc kubenswrapper[4975]: I0318 12:17:55.538955 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:17:55 crc kubenswrapper[4975]: I0318 12:17:55.539010 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:17:55 crc kubenswrapper[4975]: I0318 12:17:55.539057 4975 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 12:17:55 crc kubenswrapper[4975]: I0318 12:17:55.539527 4975 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2aa3dffbe2fa58483db177c45dee69be9b0aee41d425f82de7fd39aca38b19a7"} pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:17:55 crc kubenswrapper[4975]: I0318 12:17:55.539588 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" containerID="cri-o://2aa3dffbe2fa58483db177c45dee69be9b0aee41d425f82de7fd39aca38b19a7" gracePeriod=600 Mar 18 12:17:56 crc kubenswrapper[4975]: I0318 12:17:56.260093 4975 generic.go:334] "Generic (PLEG): container finished" podID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerID="2aa3dffbe2fa58483db177c45dee69be9b0aee41d425f82de7fd39aca38b19a7" exitCode=0 Mar 18 12:17:56 crc kubenswrapper[4975]: I0318 12:17:56.260167 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerDied","Data":"2aa3dffbe2fa58483db177c45dee69be9b0aee41d425f82de7fd39aca38b19a7"} Mar 18 12:17:56 crc kubenswrapper[4975]: I0318 12:17:56.260554 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerStarted","Data":"b6859c3ab578901559ad44e17b61bbd7da0957c0379c4edd96b716f1fc0c5f64"} Mar 18 12:17:56 crc kubenswrapper[4975]: I0318 12:17:56.260597 4975 scope.go:117] "RemoveContainer" 
containerID="2ead36c734a58f9c5bf90a72cab92b23a8c2e2d2efe02fe95b50df416422be5b" Mar 18 12:17:56 crc kubenswrapper[4975]: I0318 12:17:56.262733 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ttvcz" event={"ID":"b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65","Type":"ContainerStarted","Data":"94b3ce7738ddd816fd29f07f17b6feb4ac1d689fde8d8ae5a3454f7011ee7a35"} Mar 18 12:17:56 crc kubenswrapper[4975]: I0318 12:17:56.264807 4975 generic.go:334] "Generic (PLEG): container finished" podID="1f56e42a-99c9-4b38-bcad-bd1139570888" containerID="dceebb43d8d2442d6c4c495b7fa1aeb552e141f91e5367749af4457c1de5bf0d" exitCode=0 Mar 18 12:17:56 crc kubenswrapper[4975]: I0318 12:17:56.264843 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnjqq" event={"ID":"1f56e42a-99c9-4b38-bcad-bd1139570888","Type":"ContainerDied","Data":"dceebb43d8d2442d6c4c495b7fa1aeb552e141f91e5367749af4457c1de5bf0d"} Mar 18 12:17:56 crc kubenswrapper[4975]: I0318 12:17:56.303121 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ttvcz" podStartSLOduration=1.862668673 podStartE2EDuration="4.303099269s" podCreationTimestamp="2026-03-18 12:17:52 +0000 UTC" firstStartedPulling="2026-03-18 12:17:53.217637954 +0000 UTC m=+458.932038533" lastFinishedPulling="2026-03-18 12:17:55.65806854 +0000 UTC m=+461.372469129" observedRunningTime="2026-03-18 12:17:56.300400998 +0000 UTC m=+462.014801577" watchObservedRunningTime="2026-03-18 12:17:56.303099269 +0000 UTC m=+462.017499848" Mar 18 12:17:57 crc kubenswrapper[4975]: I0318 12:17:57.275390 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnjqq" event={"ID":"1f56e42a-99c9-4b38-bcad-bd1139570888","Type":"ContainerStarted","Data":"d7d69ffc6b655c1b6a2c166828b6cf45bac136646f2e91054b5a62e8943c48bc"} Mar 18 12:17:57 crc kubenswrapper[4975]: I0318 12:17:57.311636 4975 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rnjqq" podStartSLOduration=2.840079605 podStartE2EDuration="5.31161715s" podCreationTimestamp="2026-03-18 12:17:52 +0000 UTC" firstStartedPulling="2026-03-18 12:17:54.23540115 +0000 UTC m=+459.949801729" lastFinishedPulling="2026-03-18 12:17:56.706938695 +0000 UTC m=+462.421339274" observedRunningTime="2026-03-18 12:17:57.309335569 +0000 UTC m=+463.023736148" watchObservedRunningTime="2026-03-18 12:17:57.31161715 +0000 UTC m=+463.026017729" Mar 18 12:18:00 crc kubenswrapper[4975]: I0318 12:18:00.097709 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bcsvd" Mar 18 12:18:00 crc kubenswrapper[4975]: I0318 12:18:00.098081 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bcsvd" Mar 18 12:18:00 crc kubenswrapper[4975]: I0318 12:18:00.135069 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563938-zxtpf"] Mar 18 12:18:00 crc kubenswrapper[4975]: I0318 12:18:00.135882 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563938-zxtpf" Mar 18 12:18:00 crc kubenswrapper[4975]: I0318 12:18:00.138563 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:18:00 crc kubenswrapper[4975]: I0318 12:18:00.139064 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:18:00 crc kubenswrapper[4975]: I0318 12:18:00.139290 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 12:18:00 crc kubenswrapper[4975]: I0318 12:18:00.144400 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563938-zxtpf"] Mar 18 12:18:00 crc kubenswrapper[4975]: I0318 12:18:00.161446 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bcsvd" Mar 18 12:18:00 crc kubenswrapper[4975]: I0318 12:18:00.290166 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l8xlg" Mar 18 12:18:00 crc kubenswrapper[4975]: I0318 12:18:00.290202 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l8xlg" Mar 18 12:18:00 crc kubenswrapper[4975]: I0318 12:18:00.313511 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwrh8\" (UniqueName: \"kubernetes.io/projected/2d50feae-ea78-410c-8871-b0eab4e0f73a-kube-api-access-rwrh8\") pod \"auto-csr-approver-29563938-zxtpf\" (UID: \"2d50feae-ea78-410c-8871-b0eab4e0f73a\") " pod="openshift-infra/auto-csr-approver-29563938-zxtpf" Mar 18 12:18:00 crc kubenswrapper[4975]: I0318 12:18:00.325490 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l8xlg" Mar 18 12:18:00 crc 
kubenswrapper[4975]: I0318 12:18:00.344457 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bcsvd" Mar 18 12:18:00 crc kubenswrapper[4975]: I0318 12:18:00.414952 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwrh8\" (UniqueName: \"kubernetes.io/projected/2d50feae-ea78-410c-8871-b0eab4e0f73a-kube-api-access-rwrh8\") pod \"auto-csr-approver-29563938-zxtpf\" (UID: \"2d50feae-ea78-410c-8871-b0eab4e0f73a\") " pod="openshift-infra/auto-csr-approver-29563938-zxtpf" Mar 18 12:18:00 crc kubenswrapper[4975]: I0318 12:18:00.434452 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwrh8\" (UniqueName: \"kubernetes.io/projected/2d50feae-ea78-410c-8871-b0eab4e0f73a-kube-api-access-rwrh8\") pod \"auto-csr-approver-29563938-zxtpf\" (UID: \"2d50feae-ea78-410c-8871-b0eab4e0f73a\") " pod="openshift-infra/auto-csr-approver-29563938-zxtpf" Mar 18 12:18:00 crc kubenswrapper[4975]: I0318 12:18:00.458361 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563938-zxtpf" Mar 18 12:18:01 crc kubenswrapper[4975]: I0318 12:18:00.670595 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563938-zxtpf"] Mar 18 12:18:01 crc kubenswrapper[4975]: I0318 12:18:01.298540 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563938-zxtpf" event={"ID":"2d50feae-ea78-410c-8871-b0eab4e0f73a","Type":"ContainerStarted","Data":"71d1258cd768bdf24d96d4dc15422feb9d6be169dc71b88a119133a86cc227e4"} Mar 18 12:18:01 crc kubenswrapper[4975]: I0318 12:18:01.343258 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l8xlg" Mar 18 12:18:02 crc kubenswrapper[4975]: I0318 12:18:02.504182 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ttvcz" Mar 18 12:18:02 crc kubenswrapper[4975]: I0318 12:18:02.504585 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ttvcz" Mar 18 12:18:02 crc kubenswrapper[4975]: I0318 12:18:02.544167 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ttvcz" Mar 18 12:18:02 crc kubenswrapper[4975]: I0318 12:18:02.731817 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rnjqq" Mar 18 12:18:02 crc kubenswrapper[4975]: I0318 12:18:02.731912 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rnjqq" Mar 18 12:18:03 crc kubenswrapper[4975]: I0318 12:18:03.374980 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ttvcz" Mar 18 12:18:03 crc kubenswrapper[4975]: I0318 12:18:03.772289 4975 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rnjqq" podUID="1f56e42a-99c9-4b38-bcad-bd1139570888" containerName="registry-server" probeResult="failure" output=< Mar 18 12:18:03 crc kubenswrapper[4975]: timeout: failed to connect service ":50051" within 1s Mar 18 12:18:03 crc kubenswrapper[4975]: > Mar 18 12:18:05 crc kubenswrapper[4975]: I0318 12:18:05.322858 4975 generic.go:334] "Generic (PLEG): container finished" podID="2d50feae-ea78-410c-8871-b0eab4e0f73a" containerID="10f86091ca1e66196840401f5ec7ca9c0b3ed4cb87b29080000aaa7283ad0894" exitCode=0 Mar 18 12:18:05 crc kubenswrapper[4975]: I0318 12:18:05.322997 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563938-zxtpf" event={"ID":"2d50feae-ea78-410c-8871-b0eab4e0f73a","Type":"ContainerDied","Data":"10f86091ca1e66196840401f5ec7ca9c0b3ed4cb87b29080000aaa7283ad0894"} Mar 18 12:18:06 crc kubenswrapper[4975]: I0318 12:18:06.562367 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563938-zxtpf" Mar 18 12:18:06 crc kubenswrapper[4975]: I0318 12:18:06.722926 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwrh8\" (UniqueName: \"kubernetes.io/projected/2d50feae-ea78-410c-8871-b0eab4e0f73a-kube-api-access-rwrh8\") pod \"2d50feae-ea78-410c-8871-b0eab4e0f73a\" (UID: \"2d50feae-ea78-410c-8871-b0eab4e0f73a\") " Mar 18 12:18:06 crc kubenswrapper[4975]: I0318 12:18:06.729483 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d50feae-ea78-410c-8871-b0eab4e0f73a-kube-api-access-rwrh8" (OuterVolumeSpecName: "kube-api-access-rwrh8") pod "2d50feae-ea78-410c-8871-b0eab4e0f73a" (UID: "2d50feae-ea78-410c-8871-b0eab4e0f73a"). InnerVolumeSpecName "kube-api-access-rwrh8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:18:06 crc kubenswrapper[4975]: I0318 12:18:06.824065 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwrh8\" (UniqueName: \"kubernetes.io/projected/2d50feae-ea78-410c-8871-b0eab4e0f73a-kube-api-access-rwrh8\") on node \"crc\" DevicePath \"\"" Mar 18 12:18:07 crc kubenswrapper[4975]: I0318 12:18:07.335809 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563938-zxtpf" event={"ID":"2d50feae-ea78-410c-8871-b0eab4e0f73a","Type":"ContainerDied","Data":"71d1258cd768bdf24d96d4dc15422feb9d6be169dc71b88a119133a86cc227e4"} Mar 18 12:18:07 crc kubenswrapper[4975]: I0318 12:18:07.335854 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71d1258cd768bdf24d96d4dc15422feb9d6be169dc71b88a119133a86cc227e4" Mar 18 12:18:07 crc kubenswrapper[4975]: I0318 12:18:07.335885 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563938-zxtpf" Mar 18 12:18:07 crc kubenswrapper[4975]: I0318 12:18:07.615834 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563932-9m82r"] Mar 18 12:18:07 crc kubenswrapper[4975]: I0318 12:18:07.618950 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563932-9m82r"] Mar 18 12:18:09 crc kubenswrapper[4975]: I0318 12:18:09.024902 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb5204e4-c110-485b-8627-807fdb7f4c27" path="/var/lib/kubelet/pods/bb5204e4-c110-485b-8627-807fdb7f4c27/volumes" Mar 18 12:18:12 crc kubenswrapper[4975]: I0318 12:18:12.779256 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rnjqq" Mar 18 12:18:12 crc kubenswrapper[4975]: I0318 12:18:12.821133 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-rnjqq" Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.184952 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" podUID="470110ba-97b8-4d8f-a8da-0df16cd7abed" containerName="registry" containerID="cri-o://2d2880fde3dee7514ded4039414b9897c12e6fe58953347d1f740dda0fadde26" gracePeriod=30 Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.381155 4975 generic.go:334] "Generic (PLEG): container finished" podID="470110ba-97b8-4d8f-a8da-0df16cd7abed" containerID="2d2880fde3dee7514ded4039414b9897c12e6fe58953347d1f740dda0fadde26" exitCode=0 Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.381211 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" event={"ID":"470110ba-97b8-4d8f-a8da-0df16cd7abed","Type":"ContainerDied","Data":"2d2880fde3dee7514ded4039414b9897c12e6fe58953347d1f740dda0fadde26"} Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.549518 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.735028 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/470110ba-97b8-4d8f-a8da-0df16cd7abed-trusted-ca\") pod \"470110ba-97b8-4d8f-a8da-0df16cd7abed\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.735279 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"470110ba-97b8-4d8f-a8da-0df16cd7abed\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.735312 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/470110ba-97b8-4d8f-a8da-0df16cd7abed-installation-pull-secrets\") pod \"470110ba-97b8-4d8f-a8da-0df16cd7abed\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.735335 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/470110ba-97b8-4d8f-a8da-0df16cd7abed-registry-tls\") pod \"470110ba-97b8-4d8f-a8da-0df16cd7abed\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.735372 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/470110ba-97b8-4d8f-a8da-0df16cd7abed-ca-trust-extracted\") pod \"470110ba-97b8-4d8f-a8da-0df16cd7abed\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.735393 4975 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/470110ba-97b8-4d8f-a8da-0df16cd7abed-bound-sa-token\") pod \"470110ba-97b8-4d8f-a8da-0df16cd7abed\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.735416 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/470110ba-97b8-4d8f-a8da-0df16cd7abed-registry-certificates\") pod \"470110ba-97b8-4d8f-a8da-0df16cd7abed\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.735441 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5ng\" (UniqueName: \"kubernetes.io/projected/470110ba-97b8-4d8f-a8da-0df16cd7abed-kube-api-access-vt5ng\") pod \"470110ba-97b8-4d8f-a8da-0df16cd7abed\" (UID: \"470110ba-97b8-4d8f-a8da-0df16cd7abed\") " Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.735632 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/470110ba-97b8-4d8f-a8da-0df16cd7abed-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "470110ba-97b8-4d8f-a8da-0df16cd7abed" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.736485 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/470110ba-97b8-4d8f-a8da-0df16cd7abed-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "470110ba-97b8-4d8f-a8da-0df16cd7abed" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.740677 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/470110ba-97b8-4d8f-a8da-0df16cd7abed-kube-api-access-vt5ng" (OuterVolumeSpecName: "kube-api-access-vt5ng") pod "470110ba-97b8-4d8f-a8da-0df16cd7abed" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed"). InnerVolumeSpecName "kube-api-access-vt5ng". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.742385 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/470110ba-97b8-4d8f-a8da-0df16cd7abed-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "470110ba-97b8-4d8f-a8da-0df16cd7abed" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.742714 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/470110ba-97b8-4d8f-a8da-0df16cd7abed-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "470110ba-97b8-4d8f-a8da-0df16cd7abed" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.743612 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/470110ba-97b8-4d8f-a8da-0df16cd7abed-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "470110ba-97b8-4d8f-a8da-0df16cd7abed" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.757301 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/470110ba-97b8-4d8f-a8da-0df16cd7abed-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "470110ba-97b8-4d8f-a8da-0df16cd7abed" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.760491 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "470110ba-97b8-4d8f-a8da-0df16cd7abed" (UID: "470110ba-97b8-4d8f-a8da-0df16cd7abed"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.836297 4975 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/470110ba-97b8-4d8f-a8da-0df16cd7abed-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.836334 4975 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/470110ba-97b8-4d8f-a8da-0df16cd7abed-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.836345 4975 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/470110ba-97b8-4d8f-a8da-0df16cd7abed-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.836354 4975 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/470110ba-97b8-4d8f-a8da-0df16cd7abed-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.836362 4975 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/470110ba-97b8-4d8f-a8da-0df16cd7abed-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.836370 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5ng\" (UniqueName: \"kubernetes.io/projected/470110ba-97b8-4d8f-a8da-0df16cd7abed-kube-api-access-vt5ng\") on node \"crc\" DevicePath \"\"" Mar 18 12:18:15 crc kubenswrapper[4975]: I0318 12:18:15.836378 4975 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/470110ba-97b8-4d8f-a8da-0df16cd7abed-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:18:16 crc kubenswrapper[4975]: I0318 12:18:16.387792 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" event={"ID":"470110ba-97b8-4d8f-a8da-0df16cd7abed","Type":"ContainerDied","Data":"cdc38385728bbce1ca828ff7d51d70b3ec0e966bc0d25a7e54546ebe8fa46a84"} Mar 18 12:18:16 crc kubenswrapper[4975]: I0318 12:18:16.387844 4975 scope.go:117] "RemoveContainer" containerID="2d2880fde3dee7514ded4039414b9897c12e6fe58953347d1f740dda0fadde26" Mar 18 12:18:16 crc kubenswrapper[4975]: I0318 12:18:16.387953 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8nsht" Mar 18 12:18:16 crc kubenswrapper[4975]: I0318 12:18:16.418752 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8nsht"] Mar 18 12:18:16 crc kubenswrapper[4975]: I0318 12:18:16.422706 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8nsht"] Mar 18 12:18:17 crc kubenswrapper[4975]: I0318 12:18:17.025752 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="470110ba-97b8-4d8f-a8da-0df16cd7abed" path="/var/lib/kubelet/pods/470110ba-97b8-4d8f-a8da-0df16cd7abed/volumes" Mar 18 12:19:55 crc kubenswrapper[4975]: I0318 12:19:55.538710 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:19:55 crc kubenswrapper[4975]: I0318 12:19:55.539381 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:20:00 crc kubenswrapper[4975]: I0318 12:20:00.132833 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563940-r8w7g"] Mar 18 12:20:00 crc kubenswrapper[4975]: E0318 12:20:00.133350 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="470110ba-97b8-4d8f-a8da-0df16cd7abed" containerName="registry" Mar 18 12:20:00 crc kubenswrapper[4975]: I0318 12:20:00.133364 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="470110ba-97b8-4d8f-a8da-0df16cd7abed" 
containerName="registry" Mar 18 12:20:00 crc kubenswrapper[4975]: E0318 12:20:00.133380 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d50feae-ea78-410c-8871-b0eab4e0f73a" containerName="oc" Mar 18 12:20:00 crc kubenswrapper[4975]: I0318 12:20:00.133389 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d50feae-ea78-410c-8871-b0eab4e0f73a" containerName="oc" Mar 18 12:20:00 crc kubenswrapper[4975]: I0318 12:20:00.133509 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="470110ba-97b8-4d8f-a8da-0df16cd7abed" containerName="registry" Mar 18 12:20:00 crc kubenswrapper[4975]: I0318 12:20:00.133522 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d50feae-ea78-410c-8871-b0eab4e0f73a" containerName="oc" Mar 18 12:20:00 crc kubenswrapper[4975]: I0318 12:20:00.134008 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563940-r8w7g" Mar 18 12:20:00 crc kubenswrapper[4975]: I0318 12:20:00.137166 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:20:00 crc kubenswrapper[4975]: I0318 12:20:00.140466 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:20:00 crc kubenswrapper[4975]: I0318 12:20:00.140474 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 12:20:00 crc kubenswrapper[4975]: I0318 12:20:00.147314 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563940-r8w7g"] Mar 18 12:20:00 crc kubenswrapper[4975]: I0318 12:20:00.252807 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl2zg\" (UniqueName: \"kubernetes.io/projected/edc54b26-3dff-4ab9-b02b-2468b13d95e2-kube-api-access-zl2zg\") pod 
\"auto-csr-approver-29563940-r8w7g\" (UID: \"edc54b26-3dff-4ab9-b02b-2468b13d95e2\") " pod="openshift-infra/auto-csr-approver-29563940-r8w7g" Mar 18 12:20:00 crc kubenswrapper[4975]: I0318 12:20:00.354057 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl2zg\" (UniqueName: \"kubernetes.io/projected/edc54b26-3dff-4ab9-b02b-2468b13d95e2-kube-api-access-zl2zg\") pod \"auto-csr-approver-29563940-r8w7g\" (UID: \"edc54b26-3dff-4ab9-b02b-2468b13d95e2\") " pod="openshift-infra/auto-csr-approver-29563940-r8w7g" Mar 18 12:20:00 crc kubenswrapper[4975]: I0318 12:20:00.376083 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl2zg\" (UniqueName: \"kubernetes.io/projected/edc54b26-3dff-4ab9-b02b-2468b13d95e2-kube-api-access-zl2zg\") pod \"auto-csr-approver-29563940-r8w7g\" (UID: \"edc54b26-3dff-4ab9-b02b-2468b13d95e2\") " pod="openshift-infra/auto-csr-approver-29563940-r8w7g" Mar 18 12:20:00 crc kubenswrapper[4975]: I0318 12:20:00.458329 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563940-r8w7g" Mar 18 12:20:00 crc kubenswrapper[4975]: I0318 12:20:00.656348 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563940-r8w7g"] Mar 18 12:20:00 crc kubenswrapper[4975]: I0318 12:20:00.667950 4975 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 12:20:00 crc kubenswrapper[4975]: I0318 12:20:00.948560 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563940-r8w7g" event={"ID":"edc54b26-3dff-4ab9-b02b-2468b13d95e2","Type":"ContainerStarted","Data":"dfd7f8f1c6267a3450a68c8c9788247a303882a536e7aedd477c7c8e8481b208"} Mar 18 12:20:03 crc kubenswrapper[4975]: I0318 12:20:03.965994 4975 generic.go:334] "Generic (PLEG): container finished" podID="edc54b26-3dff-4ab9-b02b-2468b13d95e2" containerID="4b60948cdf8aeeaba6d945a9b8b29367eb4aef2312b6fc54bc0322aa6690b760" exitCode=0 Mar 18 12:20:03 crc kubenswrapper[4975]: I0318 12:20:03.966109 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563940-r8w7g" event={"ID":"edc54b26-3dff-4ab9-b02b-2468b13d95e2","Type":"ContainerDied","Data":"4b60948cdf8aeeaba6d945a9b8b29367eb4aef2312b6fc54bc0322aa6690b760"} Mar 18 12:20:05 crc kubenswrapper[4975]: I0318 12:20:05.179505 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563940-r8w7g" Mar 18 12:20:05 crc kubenswrapper[4975]: I0318 12:20:05.316658 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl2zg\" (UniqueName: \"kubernetes.io/projected/edc54b26-3dff-4ab9-b02b-2468b13d95e2-kube-api-access-zl2zg\") pod \"edc54b26-3dff-4ab9-b02b-2468b13d95e2\" (UID: \"edc54b26-3dff-4ab9-b02b-2468b13d95e2\") " Mar 18 12:20:05 crc kubenswrapper[4975]: I0318 12:20:05.322070 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edc54b26-3dff-4ab9-b02b-2468b13d95e2-kube-api-access-zl2zg" (OuterVolumeSpecName: "kube-api-access-zl2zg") pod "edc54b26-3dff-4ab9-b02b-2468b13d95e2" (UID: "edc54b26-3dff-4ab9-b02b-2468b13d95e2"). InnerVolumeSpecName "kube-api-access-zl2zg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:20:05 crc kubenswrapper[4975]: I0318 12:20:05.417570 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl2zg\" (UniqueName: \"kubernetes.io/projected/edc54b26-3dff-4ab9-b02b-2468b13d95e2-kube-api-access-zl2zg\") on node \"crc\" DevicePath \"\"" Mar 18 12:20:05 crc kubenswrapper[4975]: I0318 12:20:05.977594 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563940-r8w7g" event={"ID":"edc54b26-3dff-4ab9-b02b-2468b13d95e2","Type":"ContainerDied","Data":"dfd7f8f1c6267a3450a68c8c9788247a303882a536e7aedd477c7c8e8481b208"} Mar 18 12:20:05 crc kubenswrapper[4975]: I0318 12:20:05.977636 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfd7f8f1c6267a3450a68c8c9788247a303882a536e7aedd477c7c8e8481b208" Mar 18 12:20:05 crc kubenswrapper[4975]: I0318 12:20:05.977859 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563940-r8w7g" Mar 18 12:20:06 crc kubenswrapper[4975]: I0318 12:20:06.238205 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563934-6j5kv"] Mar 18 12:20:06 crc kubenswrapper[4975]: I0318 12:20:06.242173 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563934-6j5kv"] Mar 18 12:20:07 crc kubenswrapper[4975]: I0318 12:20:07.025754 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d72ac7c-4ce8-4d23-a845-d359bca0544a" path="/var/lib/kubelet/pods/9d72ac7c-4ce8-4d23-a845-d359bca0544a/volumes" Mar 18 12:20:25 crc kubenswrapper[4975]: I0318 12:20:25.538629 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:20:25 crc kubenswrapper[4975]: I0318 12:20:25.539182 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:20:55 crc kubenswrapper[4975]: I0318 12:20:55.538802 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:20:55 crc kubenswrapper[4975]: I0318 12:20:55.539511 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" 
podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:20:55 crc kubenswrapper[4975]: I0318 12:20:55.539563 4975 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 12:20:55 crc kubenswrapper[4975]: I0318 12:20:55.540186 4975 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b6859c3ab578901559ad44e17b61bbd7da0957c0379c4edd96b716f1fc0c5f64"} pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:20:55 crc kubenswrapper[4975]: I0318 12:20:55.540245 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" containerID="cri-o://b6859c3ab578901559ad44e17b61bbd7da0957c0379c4edd96b716f1fc0c5f64" gracePeriod=600 Mar 18 12:20:56 crc kubenswrapper[4975]: I0318 12:20:56.267655 4975 generic.go:334] "Generic (PLEG): container finished" podID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerID="b6859c3ab578901559ad44e17b61bbd7da0957c0379c4edd96b716f1fc0c5f64" exitCode=0 Mar 18 12:20:56 crc kubenswrapper[4975]: I0318 12:20:56.267720 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerDied","Data":"b6859c3ab578901559ad44e17b61bbd7da0957c0379c4edd96b716f1fc0c5f64"} Mar 18 12:20:56 crc kubenswrapper[4975]: I0318 12:20:56.267949 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerStarted","Data":"652d3462a6e10a47d996fdbb6d3a4cc821e9e3b750eb8fdb8c2f0cb2935587d4"} Mar 18 12:20:56 crc kubenswrapper[4975]: I0318 12:20:56.267971 4975 scope.go:117] "RemoveContainer" containerID="2aa3dffbe2fa58483db177c45dee69be9b0aee41d425f82de7fd39aca38b19a7" Mar 18 12:21:33 crc kubenswrapper[4975]: I0318 12:21:33.040457 4975 scope.go:117] "RemoveContainer" containerID="995fbfe1ae1d68d3875e30a6a541d70b4b174153aa38e91ecf31cea63b7cffbc" Mar 18 12:21:33 crc kubenswrapper[4975]: I0318 12:21:33.075690 4975 scope.go:117] "RemoveContainer" containerID="0b76564616e00fa41708e5f77a628228d8677811e5c8d0d7dc7db75a2883620f" Mar 18 12:22:00 crc kubenswrapper[4975]: I0318 12:22:00.138771 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563942-nqstt"] Mar 18 12:22:00 crc kubenswrapper[4975]: E0318 12:22:00.139574 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edc54b26-3dff-4ab9-b02b-2468b13d95e2" containerName="oc" Mar 18 12:22:00 crc kubenswrapper[4975]: I0318 12:22:00.139592 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="edc54b26-3dff-4ab9-b02b-2468b13d95e2" containerName="oc" Mar 18 12:22:00 crc kubenswrapper[4975]: I0318 12:22:00.139715 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="edc54b26-3dff-4ab9-b02b-2468b13d95e2" containerName="oc" Mar 18 12:22:00 crc kubenswrapper[4975]: I0318 12:22:00.140148 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563942-nqstt" Mar 18 12:22:00 crc kubenswrapper[4975]: I0318 12:22:00.143468 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:22:00 crc kubenswrapper[4975]: I0318 12:22:00.143607 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 12:22:00 crc kubenswrapper[4975]: I0318 12:22:00.143749 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:22:00 crc kubenswrapper[4975]: I0318 12:22:00.148841 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563942-nqstt"] Mar 18 12:22:00 crc kubenswrapper[4975]: I0318 12:22:00.244341 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgnxk\" (UniqueName: \"kubernetes.io/projected/c77106cf-2987-45dc-ad75-98f5c4ae2fd7-kube-api-access-lgnxk\") pod \"auto-csr-approver-29563942-nqstt\" (UID: \"c77106cf-2987-45dc-ad75-98f5c4ae2fd7\") " pod="openshift-infra/auto-csr-approver-29563942-nqstt" Mar 18 12:22:00 crc kubenswrapper[4975]: I0318 12:22:00.345159 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgnxk\" (UniqueName: \"kubernetes.io/projected/c77106cf-2987-45dc-ad75-98f5c4ae2fd7-kube-api-access-lgnxk\") pod \"auto-csr-approver-29563942-nqstt\" (UID: \"c77106cf-2987-45dc-ad75-98f5c4ae2fd7\") " pod="openshift-infra/auto-csr-approver-29563942-nqstt" Mar 18 12:22:00 crc kubenswrapper[4975]: I0318 12:22:00.363606 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgnxk\" (UniqueName: \"kubernetes.io/projected/c77106cf-2987-45dc-ad75-98f5c4ae2fd7-kube-api-access-lgnxk\") pod \"auto-csr-approver-29563942-nqstt\" (UID: \"c77106cf-2987-45dc-ad75-98f5c4ae2fd7\") " 
pod="openshift-infra/auto-csr-approver-29563942-nqstt" Mar 18 12:22:00 crc kubenswrapper[4975]: I0318 12:22:00.459186 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563942-nqstt" Mar 18 12:22:00 crc kubenswrapper[4975]: I0318 12:22:00.638442 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563942-nqstt"] Mar 18 12:22:01 crc kubenswrapper[4975]: I0318 12:22:01.624794 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563942-nqstt" event={"ID":"c77106cf-2987-45dc-ad75-98f5c4ae2fd7","Type":"ContainerStarted","Data":"d64875bea69d0f6626bfc9ca8dd2f1b46e876dc9236e639ed735f84c513328cf"} Mar 18 12:22:02 crc kubenswrapper[4975]: I0318 12:22:02.631789 4975 generic.go:334] "Generic (PLEG): container finished" podID="c77106cf-2987-45dc-ad75-98f5c4ae2fd7" containerID="a098c1347c1b344b05abafccb65e4ef8704765d31e25534b2257fdd72cd1b93e" exitCode=0 Mar 18 12:22:02 crc kubenswrapper[4975]: I0318 12:22:02.631959 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563942-nqstt" event={"ID":"c77106cf-2987-45dc-ad75-98f5c4ae2fd7","Type":"ContainerDied","Data":"a098c1347c1b344b05abafccb65e4ef8704765d31e25534b2257fdd72cd1b93e"} Mar 18 12:22:03 crc kubenswrapper[4975]: I0318 12:22:03.824661 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563942-nqstt" Mar 18 12:22:03 crc kubenswrapper[4975]: I0318 12:22:03.984581 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgnxk\" (UniqueName: \"kubernetes.io/projected/c77106cf-2987-45dc-ad75-98f5c4ae2fd7-kube-api-access-lgnxk\") pod \"c77106cf-2987-45dc-ad75-98f5c4ae2fd7\" (UID: \"c77106cf-2987-45dc-ad75-98f5c4ae2fd7\") " Mar 18 12:22:03 crc kubenswrapper[4975]: I0318 12:22:03.990356 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c77106cf-2987-45dc-ad75-98f5c4ae2fd7-kube-api-access-lgnxk" (OuterVolumeSpecName: "kube-api-access-lgnxk") pod "c77106cf-2987-45dc-ad75-98f5c4ae2fd7" (UID: "c77106cf-2987-45dc-ad75-98f5c4ae2fd7"). InnerVolumeSpecName "kube-api-access-lgnxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:22:04 crc kubenswrapper[4975]: I0318 12:22:04.085966 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgnxk\" (UniqueName: \"kubernetes.io/projected/c77106cf-2987-45dc-ad75-98f5c4ae2fd7-kube-api-access-lgnxk\") on node \"crc\" DevicePath \"\"" Mar 18 12:22:04 crc kubenswrapper[4975]: I0318 12:22:04.643353 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563942-nqstt" event={"ID":"c77106cf-2987-45dc-ad75-98f5c4ae2fd7","Type":"ContainerDied","Data":"d64875bea69d0f6626bfc9ca8dd2f1b46e876dc9236e639ed735f84c513328cf"} Mar 18 12:22:04 crc kubenswrapper[4975]: I0318 12:22:04.643394 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d64875bea69d0f6626bfc9ca8dd2f1b46e876dc9236e639ed735f84c513328cf" Mar 18 12:22:04 crc kubenswrapper[4975]: I0318 12:22:04.643452 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563942-nqstt" Mar 18 12:22:04 crc kubenswrapper[4975]: I0318 12:22:04.880657 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563936-4kt6x"] Mar 18 12:22:04 crc kubenswrapper[4975]: I0318 12:22:04.883953 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563936-4kt6x"] Mar 18 12:22:05 crc kubenswrapper[4975]: I0318 12:22:05.023222 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="649231fb-eaf6-45ff-9393-3df37c421381" path="/var/lib/kubelet/pods/649231fb-eaf6-45ff-9393-3df37c421381/volumes" Mar 18 12:22:33 crc kubenswrapper[4975]: I0318 12:22:33.135766 4975 scope.go:117] "RemoveContainer" containerID="20a35710d4e8f5e8d3b3a617c84ab866426452116db57c4b93153314d70f1d2a" Mar 18 12:22:55 crc kubenswrapper[4975]: I0318 12:22:55.539136 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:22:55 crc kubenswrapper[4975]: I0318 12:22:55.539774 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:23:25 crc kubenswrapper[4975]: I0318 12:23:25.538373 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:23:25 crc kubenswrapper[4975]: 
I0318 12:23:25.538939 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:23:55 crc kubenswrapper[4975]: I0318 12:23:55.538636 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:23:55 crc kubenswrapper[4975]: I0318 12:23:55.539299 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:23:55 crc kubenswrapper[4975]: I0318 12:23:55.539383 4975 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 12:23:55 crc kubenswrapper[4975]: I0318 12:23:55.541033 4975 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"652d3462a6e10a47d996fdbb6d3a4cc821e9e3b750eb8fdb8c2f0cb2935587d4"} pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:23:55 crc kubenswrapper[4975]: I0318 12:23:55.541119 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" 
containerName="machine-config-daemon" containerID="cri-o://652d3462a6e10a47d996fdbb6d3a4cc821e9e3b750eb8fdb8c2f0cb2935587d4" gracePeriod=600 Mar 18 12:23:56 crc kubenswrapper[4975]: I0318 12:23:56.216365 4975 generic.go:334] "Generic (PLEG): container finished" podID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerID="652d3462a6e10a47d996fdbb6d3a4cc821e9e3b750eb8fdb8c2f0cb2935587d4" exitCode=0 Mar 18 12:23:56 crc kubenswrapper[4975]: I0318 12:23:56.216431 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerDied","Data":"652d3462a6e10a47d996fdbb6d3a4cc821e9e3b750eb8fdb8c2f0cb2935587d4"} Mar 18 12:23:56 crc kubenswrapper[4975]: I0318 12:23:56.216919 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerStarted","Data":"142b541a7de05fd46269c85cd3392d764eb097aeaa954b82530c1118b45a05b8"} Mar 18 12:23:56 crc kubenswrapper[4975]: I0318 12:23:56.216941 4975 scope.go:117] "RemoveContainer" containerID="b6859c3ab578901559ad44e17b61bbd7da0957c0379c4edd96b716f1fc0c5f64" Mar 18 12:24:00 crc kubenswrapper[4975]: I0318 12:24:00.137064 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563944-xm5fl"] Mar 18 12:24:00 crc kubenswrapper[4975]: E0318 12:24:00.137899 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c77106cf-2987-45dc-ad75-98f5c4ae2fd7" containerName="oc" Mar 18 12:24:00 crc kubenswrapper[4975]: I0318 12:24:00.137915 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="c77106cf-2987-45dc-ad75-98f5c4ae2fd7" containerName="oc" Mar 18 12:24:00 crc kubenswrapper[4975]: I0318 12:24:00.138067 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="c77106cf-2987-45dc-ad75-98f5c4ae2fd7" containerName="oc" Mar 18 12:24:00 
crc kubenswrapper[4975]: I0318 12:24:00.138501 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563944-xm5fl" Mar 18 12:24:00 crc kubenswrapper[4975]: I0318 12:24:00.141011 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:24:00 crc kubenswrapper[4975]: I0318 12:24:00.141279 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 12:24:00 crc kubenswrapper[4975]: I0318 12:24:00.141302 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:24:00 crc kubenswrapper[4975]: I0318 12:24:00.153696 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563944-xm5fl"] Mar 18 12:24:00 crc kubenswrapper[4975]: I0318 12:24:00.242947 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwf6l\" (UniqueName: \"kubernetes.io/projected/faf85139-fd24-430e-a781-054357d8c8dc-kube-api-access-jwf6l\") pod \"auto-csr-approver-29563944-xm5fl\" (UID: \"faf85139-fd24-430e-a781-054357d8c8dc\") " pod="openshift-infra/auto-csr-approver-29563944-xm5fl" Mar 18 12:24:00 crc kubenswrapper[4975]: I0318 12:24:00.344118 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwf6l\" (UniqueName: \"kubernetes.io/projected/faf85139-fd24-430e-a781-054357d8c8dc-kube-api-access-jwf6l\") pod \"auto-csr-approver-29563944-xm5fl\" (UID: \"faf85139-fd24-430e-a781-054357d8c8dc\") " pod="openshift-infra/auto-csr-approver-29563944-xm5fl" Mar 18 12:24:00 crc kubenswrapper[4975]: I0318 12:24:00.361629 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwf6l\" (UniqueName: \"kubernetes.io/projected/faf85139-fd24-430e-a781-054357d8c8dc-kube-api-access-jwf6l\") 
pod \"auto-csr-approver-29563944-xm5fl\" (UID: \"faf85139-fd24-430e-a781-054357d8c8dc\") " pod="openshift-infra/auto-csr-approver-29563944-xm5fl" Mar 18 12:24:00 crc kubenswrapper[4975]: I0318 12:24:00.455615 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563944-xm5fl" Mar 18 12:24:00 crc kubenswrapper[4975]: I0318 12:24:00.634902 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563944-xm5fl"] Mar 18 12:24:00 crc kubenswrapper[4975]: W0318 12:24:00.641249 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfaf85139_fd24_430e_a781_054357d8c8dc.slice/crio-4e58d45a8298349b3609047b0495160c9104622927ffc8e2803208d615251bc6 WatchSource:0}: Error finding container 4e58d45a8298349b3609047b0495160c9104622927ffc8e2803208d615251bc6: Status 404 returned error can't find the container with id 4e58d45a8298349b3609047b0495160c9104622927ffc8e2803208d615251bc6 Mar 18 12:24:01 crc kubenswrapper[4975]: I0318 12:24:01.247893 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563944-xm5fl" event={"ID":"faf85139-fd24-430e-a781-054357d8c8dc","Type":"ContainerStarted","Data":"4e58d45a8298349b3609047b0495160c9104622927ffc8e2803208d615251bc6"} Mar 18 12:24:02 crc kubenswrapper[4975]: I0318 12:24:02.255737 4975 generic.go:334] "Generic (PLEG): container finished" podID="faf85139-fd24-430e-a781-054357d8c8dc" containerID="69ed0d22369b45cd29e2c7d7596fb5dd276e530a132ce4e064441a91a32c3fc8" exitCode=0 Mar 18 12:24:02 crc kubenswrapper[4975]: I0318 12:24:02.256164 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563944-xm5fl" event={"ID":"faf85139-fd24-430e-a781-054357d8c8dc","Type":"ContainerDied","Data":"69ed0d22369b45cd29e2c7d7596fb5dd276e530a132ce4e064441a91a32c3fc8"} Mar 18 12:24:03 crc kubenswrapper[4975]: 
I0318 12:24:03.439654 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563944-xm5fl" Mar 18 12:24:03 crc kubenswrapper[4975]: I0318 12:24:03.583740 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwf6l\" (UniqueName: \"kubernetes.io/projected/faf85139-fd24-430e-a781-054357d8c8dc-kube-api-access-jwf6l\") pod \"faf85139-fd24-430e-a781-054357d8c8dc\" (UID: \"faf85139-fd24-430e-a781-054357d8c8dc\") " Mar 18 12:24:03 crc kubenswrapper[4975]: I0318 12:24:03.591092 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faf85139-fd24-430e-a781-054357d8c8dc-kube-api-access-jwf6l" (OuterVolumeSpecName: "kube-api-access-jwf6l") pod "faf85139-fd24-430e-a781-054357d8c8dc" (UID: "faf85139-fd24-430e-a781-054357d8c8dc"). InnerVolumeSpecName "kube-api-access-jwf6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:24:03 crc kubenswrapper[4975]: I0318 12:24:03.685309 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwf6l\" (UniqueName: \"kubernetes.io/projected/faf85139-fd24-430e-a781-054357d8c8dc-kube-api-access-jwf6l\") on node \"crc\" DevicePath \"\"" Mar 18 12:24:04 crc kubenswrapper[4975]: I0318 12:24:04.275376 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563944-xm5fl" event={"ID":"faf85139-fd24-430e-a781-054357d8c8dc","Type":"ContainerDied","Data":"4e58d45a8298349b3609047b0495160c9104622927ffc8e2803208d615251bc6"} Mar 18 12:24:04 crc kubenswrapper[4975]: I0318 12:24:04.275424 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e58d45a8298349b3609047b0495160c9104622927ffc8e2803208d615251bc6" Mar 18 12:24:04 crc kubenswrapper[4975]: I0318 12:24:04.275454 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563944-xm5fl" Mar 18 12:24:04 crc kubenswrapper[4975]: I0318 12:24:04.488510 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563938-zxtpf"] Mar 18 12:24:04 crc kubenswrapper[4975]: I0318 12:24:04.491558 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563938-zxtpf"] Mar 18 12:24:05 crc kubenswrapper[4975]: I0318 12:24:05.023002 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d50feae-ea78-410c-8871-b0eab4e0f73a" path="/var/lib/kubelet/pods/2d50feae-ea78-410c-8871-b0eab4e0f73a/volumes" Mar 18 12:24:06 crc kubenswrapper[4975]: I0318 12:24:06.894908 4975 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 12:24:33 crc kubenswrapper[4975]: I0318 12:24:33.193063 4975 scope.go:117] "RemoveContainer" containerID="10f86091ca1e66196840401f5ec7ca9c0b3ed4cb87b29080000aaa7283ad0894" Mar 18 12:25:55 crc kubenswrapper[4975]: I0318 12:25:55.538283 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:25:55 crc kubenswrapper[4975]: I0318 12:25:55.538792 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:26:00 crc kubenswrapper[4975]: I0318 12:26:00.143557 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563946-p6cjf"] Mar 18 12:26:00 crc 
kubenswrapper[4975]: E0318 12:26:00.144108 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faf85139-fd24-430e-a781-054357d8c8dc" containerName="oc" Mar 18 12:26:00 crc kubenswrapper[4975]: I0318 12:26:00.144125 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="faf85139-fd24-430e-a781-054357d8c8dc" containerName="oc" Mar 18 12:26:00 crc kubenswrapper[4975]: I0318 12:26:00.144252 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="faf85139-fd24-430e-a781-054357d8c8dc" containerName="oc" Mar 18 12:26:00 crc kubenswrapper[4975]: I0318 12:26:00.145418 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563946-p6cjf" Mar 18 12:26:00 crc kubenswrapper[4975]: I0318 12:26:00.149208 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:26:00 crc kubenswrapper[4975]: I0318 12:26:00.150568 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:26:00 crc kubenswrapper[4975]: I0318 12:26:00.150818 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 12:26:00 crc kubenswrapper[4975]: I0318 12:26:00.153590 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563946-p6cjf"] Mar 18 12:26:00 crc kubenswrapper[4975]: I0318 12:26:00.258505 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shfv7\" (UniqueName: \"kubernetes.io/projected/20335645-e96e-4255-b3c9-1f7da0921398-kube-api-access-shfv7\") pod \"auto-csr-approver-29563946-p6cjf\" (UID: \"20335645-e96e-4255-b3c9-1f7da0921398\") " pod="openshift-infra/auto-csr-approver-29563946-p6cjf" Mar 18 12:26:00 crc kubenswrapper[4975]: I0318 12:26:00.360333 4975 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-shfv7\" (UniqueName: \"kubernetes.io/projected/20335645-e96e-4255-b3c9-1f7da0921398-kube-api-access-shfv7\") pod \"auto-csr-approver-29563946-p6cjf\" (UID: \"20335645-e96e-4255-b3c9-1f7da0921398\") " pod="openshift-infra/auto-csr-approver-29563946-p6cjf" Mar 18 12:26:00 crc kubenswrapper[4975]: I0318 12:26:00.377636 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shfv7\" (UniqueName: \"kubernetes.io/projected/20335645-e96e-4255-b3c9-1f7da0921398-kube-api-access-shfv7\") pod \"auto-csr-approver-29563946-p6cjf\" (UID: \"20335645-e96e-4255-b3c9-1f7da0921398\") " pod="openshift-infra/auto-csr-approver-29563946-p6cjf" Mar 18 12:26:00 crc kubenswrapper[4975]: I0318 12:26:00.466033 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563946-p6cjf" Mar 18 12:26:00 crc kubenswrapper[4975]: I0318 12:26:00.652651 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563946-p6cjf"] Mar 18 12:26:00 crc kubenswrapper[4975]: I0318 12:26:00.661109 4975 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 12:26:01 crc kubenswrapper[4975]: I0318 12:26:01.295458 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563946-p6cjf" event={"ID":"20335645-e96e-4255-b3c9-1f7da0921398","Type":"ContainerStarted","Data":"1561c4b6d5db794526e32f3c9c26c606ea958167957cc87e61ca8eee452da619"} Mar 18 12:26:02 crc kubenswrapper[4975]: I0318 12:26:02.301359 4975 generic.go:334] "Generic (PLEG): container finished" podID="20335645-e96e-4255-b3c9-1f7da0921398" containerID="ec430e8f9e4bb7f1a402c32a2ed89e973adb7d2b9982d7b8a9483181a25360ca" exitCode=0 Mar 18 12:26:02 crc kubenswrapper[4975]: I0318 12:26:02.301462 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563946-p6cjf" 
event={"ID":"20335645-e96e-4255-b3c9-1f7da0921398","Type":"ContainerDied","Data":"ec430e8f9e4bb7f1a402c32a2ed89e973adb7d2b9982d7b8a9483181a25360ca"} Mar 18 12:26:03 crc kubenswrapper[4975]: I0318 12:26:03.502198 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563946-p6cjf" Mar 18 12:26:03 crc kubenswrapper[4975]: I0318 12:26:03.696118 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shfv7\" (UniqueName: \"kubernetes.io/projected/20335645-e96e-4255-b3c9-1f7da0921398-kube-api-access-shfv7\") pod \"20335645-e96e-4255-b3c9-1f7da0921398\" (UID: \"20335645-e96e-4255-b3c9-1f7da0921398\") " Mar 18 12:26:03 crc kubenswrapper[4975]: I0318 12:26:03.704467 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20335645-e96e-4255-b3c9-1f7da0921398-kube-api-access-shfv7" (OuterVolumeSpecName: "kube-api-access-shfv7") pod "20335645-e96e-4255-b3c9-1f7da0921398" (UID: "20335645-e96e-4255-b3c9-1f7da0921398"). InnerVolumeSpecName "kube-api-access-shfv7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:26:03 crc kubenswrapper[4975]: I0318 12:26:03.797408 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shfv7\" (UniqueName: \"kubernetes.io/projected/20335645-e96e-4255-b3c9-1f7da0921398-kube-api-access-shfv7\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:04 crc kubenswrapper[4975]: I0318 12:26:04.314583 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563946-p6cjf" event={"ID":"20335645-e96e-4255-b3c9-1f7da0921398","Type":"ContainerDied","Data":"1561c4b6d5db794526e32f3c9c26c606ea958167957cc87e61ca8eee452da619"} Mar 18 12:26:04 crc kubenswrapper[4975]: I0318 12:26:04.314628 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1561c4b6d5db794526e32f3c9c26c606ea958167957cc87e61ca8eee452da619" Mar 18 12:26:04 crc kubenswrapper[4975]: I0318 12:26:04.314658 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563946-p6cjf" Mar 18 12:26:04 crc kubenswrapper[4975]: I0318 12:26:04.570205 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563940-r8w7g"] Mar 18 12:26:04 crc kubenswrapper[4975]: I0318 12:26:04.575007 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563940-r8w7g"] Mar 18 12:26:05 crc kubenswrapper[4975]: I0318 12:26:05.023268 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edc54b26-3dff-4ab9-b02b-2468b13d95e2" path="/var/lib/kubelet/pods/edc54b26-3dff-4ab9-b02b-2468b13d95e2/volumes" Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.361301 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-lcppz"] Mar 18 12:26:22 crc kubenswrapper[4975]: E0318 12:26:22.362258 4975 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="20335645-e96e-4255-b3c9-1f7da0921398" containerName="oc" Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.362275 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="20335645-e96e-4255-b3c9-1f7da0921398" containerName="oc" Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.362416 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="20335645-e96e-4255-b3c9-1f7da0921398" containerName="oc" Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.362881 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-lcppz" Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.365325 4975 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-pdb5c" Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.365541 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.365832 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.372555 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-vc9kt"] Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.375489 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-vc9kt" Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.376281 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-lcppz"] Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.379709 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-vc9kt"] Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.381200 4975 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-57fhv" Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.416400 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-d8f4x"] Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.417082 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-d8f4x" Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.420547 4975 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-l2j2c" Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.432481 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-d8f4x"] Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.444173 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vkvv\" (UniqueName: \"kubernetes.io/projected/2b0c55e5-89aa-40ed-8232-e50bf89e63f5-kube-api-access-6vkvv\") pod \"cert-manager-cainjector-cf98fcc89-vc9kt\" (UID: \"2b0c55e5-89aa-40ed-8232-e50bf89e63f5\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-vc9kt" Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.444231 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcds8\" (UniqueName: 
\"kubernetes.io/projected/3110dc21-85fd-41d1-b8ed-82ae5d760397-kube-api-access-bcds8\") pod \"cert-manager-webhook-687f57d79b-d8f4x\" (UID: \"3110dc21-85fd-41d1-b8ed-82ae5d760397\") " pod="cert-manager/cert-manager-webhook-687f57d79b-d8f4x" Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.444343 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drwvs\" (UniqueName: \"kubernetes.io/projected/6b468fbd-2305-46c9-a021-5255a31d57ee-kube-api-access-drwvs\") pod \"cert-manager-858654f9db-lcppz\" (UID: \"6b468fbd-2305-46c9-a021-5255a31d57ee\") " pod="cert-manager/cert-manager-858654f9db-lcppz" Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.545346 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vkvv\" (UniqueName: \"kubernetes.io/projected/2b0c55e5-89aa-40ed-8232-e50bf89e63f5-kube-api-access-6vkvv\") pod \"cert-manager-cainjector-cf98fcc89-vc9kt\" (UID: \"2b0c55e5-89aa-40ed-8232-e50bf89e63f5\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-vc9kt" Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.545393 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcds8\" (UniqueName: \"kubernetes.io/projected/3110dc21-85fd-41d1-b8ed-82ae5d760397-kube-api-access-bcds8\") pod \"cert-manager-webhook-687f57d79b-d8f4x\" (UID: \"3110dc21-85fd-41d1-b8ed-82ae5d760397\") " pod="cert-manager/cert-manager-webhook-687f57d79b-d8f4x" Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.545449 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drwvs\" (UniqueName: \"kubernetes.io/projected/6b468fbd-2305-46c9-a021-5255a31d57ee-kube-api-access-drwvs\") pod \"cert-manager-858654f9db-lcppz\" (UID: \"6b468fbd-2305-46c9-a021-5255a31d57ee\") " pod="cert-manager/cert-manager-858654f9db-lcppz" Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.564258 4975 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcds8\" (UniqueName: \"kubernetes.io/projected/3110dc21-85fd-41d1-b8ed-82ae5d760397-kube-api-access-bcds8\") pod \"cert-manager-webhook-687f57d79b-d8f4x\" (UID: \"3110dc21-85fd-41d1-b8ed-82ae5d760397\") " pod="cert-manager/cert-manager-webhook-687f57d79b-d8f4x" Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.570331 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drwvs\" (UniqueName: \"kubernetes.io/projected/6b468fbd-2305-46c9-a021-5255a31d57ee-kube-api-access-drwvs\") pod \"cert-manager-858654f9db-lcppz\" (UID: \"6b468fbd-2305-46c9-a021-5255a31d57ee\") " pod="cert-manager/cert-manager-858654f9db-lcppz" Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.580948 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vkvv\" (UniqueName: \"kubernetes.io/projected/2b0c55e5-89aa-40ed-8232-e50bf89e63f5-kube-api-access-6vkvv\") pod \"cert-manager-cainjector-cf98fcc89-vc9kt\" (UID: \"2b0c55e5-89aa-40ed-8232-e50bf89e63f5\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-vc9kt" Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.678944 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-lcppz" Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.696220 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-vc9kt" Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.734150 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-d8f4x" Mar 18 12:26:22 crc kubenswrapper[4975]: I0318 12:26:22.976597 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-d8f4x"] Mar 18 12:26:23 crc kubenswrapper[4975]: I0318 12:26:23.131498 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-vc9kt"] Mar 18 12:26:23 crc kubenswrapper[4975]: W0318 12:26:23.134202 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b468fbd_2305_46c9_a021_5255a31d57ee.slice/crio-8366f077824e6eef5089c873cf41a851a000e6f5fa3d7aadecfb018a18111cb7 WatchSource:0}: Error finding container 8366f077824e6eef5089c873cf41a851a000e6f5fa3d7aadecfb018a18111cb7: Status 404 returned error can't find the container with id 8366f077824e6eef5089c873cf41a851a000e6f5fa3d7aadecfb018a18111cb7 Mar 18 12:26:23 crc kubenswrapper[4975]: I0318 12:26:23.134632 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-lcppz"] Mar 18 12:26:23 crc kubenswrapper[4975]: W0318 12:26:23.137197 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b0c55e5_89aa_40ed_8232_e50bf89e63f5.slice/crio-8db6223d6ee6da1754335910c0bc98dbccaab2d6b18ab3bed92c5308252382c6 WatchSource:0}: Error finding container 8db6223d6ee6da1754335910c0bc98dbccaab2d6b18ab3bed92c5308252382c6: Status 404 returned error can't find the container with id 8db6223d6ee6da1754335910c0bc98dbccaab2d6b18ab3bed92c5308252382c6 Mar 18 12:26:23 crc kubenswrapper[4975]: I0318 12:26:23.422366 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-d8f4x" 
event={"ID":"3110dc21-85fd-41d1-b8ed-82ae5d760397","Type":"ContainerStarted","Data":"e11bada236e13c57c18144cc0b282e92c68d8794d68c2e63de6a5447f9e2fec8"} Mar 18 12:26:23 crc kubenswrapper[4975]: I0318 12:26:23.423695 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-lcppz" event={"ID":"6b468fbd-2305-46c9-a021-5255a31d57ee","Type":"ContainerStarted","Data":"8366f077824e6eef5089c873cf41a851a000e6f5fa3d7aadecfb018a18111cb7"} Mar 18 12:26:23 crc kubenswrapper[4975]: I0318 12:26:23.425686 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-vc9kt" event={"ID":"2b0c55e5-89aa-40ed-8232-e50bf89e63f5","Type":"ContainerStarted","Data":"8db6223d6ee6da1754335910c0bc98dbccaab2d6b18ab3bed92c5308252382c6"} Mar 18 12:26:23 crc kubenswrapper[4975]: I0318 12:26:23.550408 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ntrck"] Mar 18 12:26:23 crc kubenswrapper[4975]: I0318 12:26:23.554031 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ntrck" Mar 18 12:26:23 crc kubenswrapper[4975]: I0318 12:26:23.567797 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ntrck"] Mar 18 12:26:23 crc kubenswrapper[4975]: I0318 12:26:23.659952 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/306f66c7-c11f-455a-a5e3-f89260f4f4a1-catalog-content\") pod \"community-operators-ntrck\" (UID: \"306f66c7-c11f-455a-a5e3-f89260f4f4a1\") " pod="openshift-marketplace/community-operators-ntrck" Mar 18 12:26:23 crc kubenswrapper[4975]: I0318 12:26:23.660342 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69bfv\" (UniqueName: \"kubernetes.io/projected/306f66c7-c11f-455a-a5e3-f89260f4f4a1-kube-api-access-69bfv\") pod \"community-operators-ntrck\" (UID: \"306f66c7-c11f-455a-a5e3-f89260f4f4a1\") " pod="openshift-marketplace/community-operators-ntrck" Mar 18 12:26:23 crc kubenswrapper[4975]: I0318 12:26:23.660369 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/306f66c7-c11f-455a-a5e3-f89260f4f4a1-utilities\") pod \"community-operators-ntrck\" (UID: \"306f66c7-c11f-455a-a5e3-f89260f4f4a1\") " pod="openshift-marketplace/community-operators-ntrck" Mar 18 12:26:23 crc kubenswrapper[4975]: I0318 12:26:23.762945 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/306f66c7-c11f-455a-a5e3-f89260f4f4a1-catalog-content\") pod \"community-operators-ntrck\" (UID: \"306f66c7-c11f-455a-a5e3-f89260f4f4a1\") " pod="openshift-marketplace/community-operators-ntrck" Mar 18 12:26:23 crc kubenswrapper[4975]: I0318 12:26:23.763385 4975 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/306f66c7-c11f-455a-a5e3-f89260f4f4a1-catalog-content\") pod \"community-operators-ntrck\" (UID: \"306f66c7-c11f-455a-a5e3-f89260f4f4a1\") " pod="openshift-marketplace/community-operators-ntrck" Mar 18 12:26:23 crc kubenswrapper[4975]: I0318 12:26:23.764006 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69bfv\" (UniqueName: \"kubernetes.io/projected/306f66c7-c11f-455a-a5e3-f89260f4f4a1-kube-api-access-69bfv\") pod \"community-operators-ntrck\" (UID: \"306f66c7-c11f-455a-a5e3-f89260f4f4a1\") " pod="openshift-marketplace/community-operators-ntrck" Mar 18 12:26:23 crc kubenswrapper[4975]: I0318 12:26:23.764079 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/306f66c7-c11f-455a-a5e3-f89260f4f4a1-utilities\") pod \"community-operators-ntrck\" (UID: \"306f66c7-c11f-455a-a5e3-f89260f4f4a1\") " pod="openshift-marketplace/community-operators-ntrck" Mar 18 12:26:23 crc kubenswrapper[4975]: I0318 12:26:23.765143 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/306f66c7-c11f-455a-a5e3-f89260f4f4a1-utilities\") pod \"community-operators-ntrck\" (UID: \"306f66c7-c11f-455a-a5e3-f89260f4f4a1\") " pod="openshift-marketplace/community-operators-ntrck" Mar 18 12:26:23 crc kubenswrapper[4975]: I0318 12:26:23.784259 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69bfv\" (UniqueName: \"kubernetes.io/projected/306f66c7-c11f-455a-a5e3-f89260f4f4a1-kube-api-access-69bfv\") pod \"community-operators-ntrck\" (UID: \"306f66c7-c11f-455a-a5e3-f89260f4f4a1\") " pod="openshift-marketplace/community-operators-ntrck" Mar 18 12:26:23 crc kubenswrapper[4975]: I0318 12:26:23.894751 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ntrck" Mar 18 12:26:24 crc kubenswrapper[4975]: I0318 12:26:24.247506 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ntrck"] Mar 18 12:26:24 crc kubenswrapper[4975]: I0318 12:26:24.432811 4975 generic.go:334] "Generic (PLEG): container finished" podID="306f66c7-c11f-455a-a5e3-f89260f4f4a1" containerID="434ee406b4ab57949f12b290cf0a28549ce7214b1f0bb0c3fcff832213bf3278" exitCode=0 Mar 18 12:26:24 crc kubenswrapper[4975]: I0318 12:26:24.432907 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntrck" event={"ID":"306f66c7-c11f-455a-a5e3-f89260f4f4a1","Type":"ContainerDied","Data":"434ee406b4ab57949f12b290cf0a28549ce7214b1f0bb0c3fcff832213bf3278"} Mar 18 12:26:24 crc kubenswrapper[4975]: I0318 12:26:24.432955 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntrck" event={"ID":"306f66c7-c11f-455a-a5e3-f89260f4f4a1","Type":"ContainerStarted","Data":"00f49b51d978ee0d54fadbc64e28f1a9c3df24510aed81a3f5028c8aca9e3886"} Mar 18 12:26:25 crc kubenswrapper[4975]: I0318 12:26:25.538435 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:26:25 crc kubenswrapper[4975]: I0318 12:26:25.538922 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:26:26 crc kubenswrapper[4975]: I0318 12:26:26.447712 4975 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-vc9kt" event={"ID":"2b0c55e5-89aa-40ed-8232-e50bf89e63f5","Type":"ContainerStarted","Data":"6f2de8a8c43687d6191325390ac47e64d90e0b8bc54880f6a9f869a8069553e3"} Mar 18 12:26:26 crc kubenswrapper[4975]: I0318 12:26:26.450540 4975 generic.go:334] "Generic (PLEG): container finished" podID="306f66c7-c11f-455a-a5e3-f89260f4f4a1" containerID="a6041ad67704f5189ad52577418ccd6235571488b47188cdbe9e4662a9cc5f73" exitCode=0 Mar 18 12:26:26 crc kubenswrapper[4975]: I0318 12:26:26.450612 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntrck" event={"ID":"306f66c7-c11f-455a-a5e3-f89260f4f4a1","Type":"ContainerDied","Data":"a6041ad67704f5189ad52577418ccd6235571488b47188cdbe9e4662a9cc5f73"} Mar 18 12:26:26 crc kubenswrapper[4975]: I0318 12:26:26.463643 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-vc9kt" podStartSLOduration=2.043815903 podStartE2EDuration="4.46362687s" podCreationTimestamp="2026-03-18 12:26:22 +0000 UTC" firstStartedPulling="2026-03-18 12:26:23.141094117 +0000 UTC m=+968.855494696" lastFinishedPulling="2026-03-18 12:26:25.560905084 +0000 UTC m=+971.275305663" observedRunningTime="2026-03-18 12:26:26.462050237 +0000 UTC m=+972.176450816" watchObservedRunningTime="2026-03-18 12:26:26.46362687 +0000 UTC m=+972.178027449" Mar 18 12:26:27 crc kubenswrapper[4975]: I0318 12:26:27.457849 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-lcppz" event={"ID":"6b468fbd-2305-46c9-a021-5255a31d57ee","Type":"ContainerStarted","Data":"28245e386cc6cd7b9a81acc5ae7b318d32f3915ea675d0dbb2bc837235e1144f"} Mar 18 12:26:27 crc kubenswrapper[4975]: I0318 12:26:27.462184 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntrck" 
event={"ID":"306f66c7-c11f-455a-a5e3-f89260f4f4a1","Type":"ContainerStarted","Data":"15ab24ffb72649ba14457b9cd2fbc12f27b52db73d1596d15b412632bd79166c"} Mar 18 12:26:27 crc kubenswrapper[4975]: I0318 12:26:27.499836 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ntrck" podStartSLOduration=2.118828828 podStartE2EDuration="4.499813765s" podCreationTimestamp="2026-03-18 12:26:23 +0000 UTC" firstStartedPulling="2026-03-18 12:26:24.659363054 +0000 UTC m=+970.373763633" lastFinishedPulling="2026-03-18 12:26:27.040347971 +0000 UTC m=+972.754748570" observedRunningTime="2026-03-18 12:26:27.498745956 +0000 UTC m=+973.213146565" watchObservedRunningTime="2026-03-18 12:26:27.499813765 +0000 UTC m=+973.214214344" Mar 18 12:26:27 crc kubenswrapper[4975]: I0318 12:26:27.500960 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-lcppz" podStartSLOduration=1.969111881 podStartE2EDuration="5.500950077s" podCreationTimestamp="2026-03-18 12:26:22 +0000 UTC" firstStartedPulling="2026-03-18 12:26:23.136902403 +0000 UTC m=+968.851302982" lastFinishedPulling="2026-03-18 12:26:26.668740599 +0000 UTC m=+972.383141178" observedRunningTime="2026-03-18 12:26:27.475993744 +0000 UTC m=+973.190394323" watchObservedRunningTime="2026-03-18 12:26:27.500950077 +0000 UTC m=+973.215350656" Mar 18 12:26:29 crc kubenswrapper[4975]: I0318 12:26:29.479661 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-d8f4x" event={"ID":"3110dc21-85fd-41d1-b8ed-82ae5d760397","Type":"ContainerStarted","Data":"a340ff54ec7a6db34b7f45754d8b1db8f84d342c836b240da3c8384cd67d1592"} Mar 18 12:26:29 crc kubenswrapper[4975]: I0318 12:26:29.479970 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-d8f4x" Mar 18 12:26:29 crc kubenswrapper[4975]: I0318 12:26:29.499157 4975 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-d8f4x" podStartSLOduration=1.208600779 podStartE2EDuration="7.499137641s" podCreationTimestamp="2026-03-18 12:26:22 +0000 UTC" firstStartedPulling="2026-03-18 12:26:22.988397117 +0000 UTC m=+968.702797696" lastFinishedPulling="2026-03-18 12:26:29.278933979 +0000 UTC m=+974.993334558" observedRunningTime="2026-03-18 12:26:29.497950549 +0000 UTC m=+975.212351138" watchObservedRunningTime="2026-03-18 12:26:29.499137641 +0000 UTC m=+975.213538220" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.426310 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-k8v6h"] Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.426736 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="ovn-controller" containerID="cri-o://de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c" gracePeriod=30 Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.426861 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="northd" containerID="cri-o://14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e" gracePeriod=30 Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.426930 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d" gracePeriod=30 Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.426970 4975 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="kube-rbac-proxy-node" containerID="cri-o://a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337" gracePeriod=30 Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.427004 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="ovn-acl-logging" containerID="cri-o://9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731" gracePeriod=30 Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.427039 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="sbdb" containerID="cri-o://d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9" gracePeriod=30 Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.426749 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="nbdb" containerID="cri-o://3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2" gracePeriod=30 Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.468695 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="ovnkube-controller" containerID="cri-o://0a61a1711ca92e59c183fdce16eac480765167b6069a15103cdb61de7f8812cd" gracePeriod=30 Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.779255 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8v6h_b0d0be67-e739-4dd7-abe4-3986a330a037/ovnkube-controller/3.log" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.781626 4975 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8v6h_b0d0be67-e739-4dd7-abe4-3986a330a037/ovn-acl-logging/0.log" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.782141 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k8v6h_b0d0be67-e739-4dd7-abe4-3986a330a037/ovn-controller/0.log" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.782548 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.842543 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-c2ns7"] Mar 18 12:26:32 crc kubenswrapper[4975]: E0318 12:26:32.842751 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="ovnkube-controller" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.842762 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="ovnkube-controller" Mar 18 12:26:32 crc kubenswrapper[4975]: E0318 12:26:32.842770 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="kubecfg-setup" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.842776 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="kubecfg-setup" Mar 18 12:26:32 crc kubenswrapper[4975]: E0318 12:26:32.842783 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="ovn-acl-logging" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.842789 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="ovn-acl-logging" Mar 18 12:26:32 crc kubenswrapper[4975]: E0318 
12:26:32.842800 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="ovn-controller" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.842805 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="ovn-controller" Mar 18 12:26:32 crc kubenswrapper[4975]: E0318 12:26:32.842817 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="kube-rbac-proxy-node" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.842823 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="kube-rbac-proxy-node" Mar 18 12:26:32 crc kubenswrapper[4975]: E0318 12:26:32.842832 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="northd" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.842838 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="northd" Mar 18 12:26:32 crc kubenswrapper[4975]: E0318 12:26:32.842845 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="ovnkube-controller" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.842851 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="ovnkube-controller" Mar 18 12:26:32 crc kubenswrapper[4975]: E0318 12:26:32.842877 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="nbdb" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.842883 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="nbdb" Mar 18 12:26:32 crc kubenswrapper[4975]: E0318 12:26:32.842890 4975 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="sbdb" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.842896 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="sbdb" Mar 18 12:26:32 crc kubenswrapper[4975]: E0318 12:26:32.842904 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="ovnkube-controller" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.842909 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="ovnkube-controller" Mar 18 12:26:32 crc kubenswrapper[4975]: E0318 12:26:32.842917 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.842922 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 12:26:32 crc kubenswrapper[4975]: E0318 12:26:32.842933 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="ovnkube-controller" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.842938 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="ovnkube-controller" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.843018 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="ovnkube-controller" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.843027 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.843036 
4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="ovn-controller" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.843043 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="sbdb" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.843049 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="ovnkube-controller" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.843057 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="kube-rbac-proxy-node" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.843063 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="ovnkube-controller" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.843073 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="ovn-acl-logging" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.843082 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="northd" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.843090 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="nbdb" Mar 18 12:26:32 crc kubenswrapper[4975]: E0318 12:26:32.843181 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="ovnkube-controller" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.843187 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="ovnkube-controller" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 
12:26:32.843276 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="ovnkube-controller" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.843293 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" containerName="ovnkube-controller" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.844730 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.897966 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b0d0be67-e739-4dd7-abe4-3986a330a037-ovnkube-script-lib\") pod \"b0d0be67-e739-4dd7-abe4-3986a330a037\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898021 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-766zq\" (UniqueName: \"kubernetes.io/projected/b0d0be67-e739-4dd7-abe4-3986a330a037-kube-api-access-766zq\") pod \"b0d0be67-e739-4dd7-abe4-3986a330a037\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898051 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-kubelet\") pod \"b0d0be67-e739-4dd7-abe4-3986a330a037\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898070 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-slash\") pod \"b0d0be67-e739-4dd7-abe4-3986a330a037\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " Mar 18 12:26:32 crc 
kubenswrapper[4975]: I0318 12:26:32.898086 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-cni-bin\") pod \"b0d0be67-e739-4dd7-abe4-3986a330a037\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898104 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-run-systemd\") pod \"b0d0be67-e739-4dd7-abe4-3986a330a037\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898167 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "b0d0be67-e739-4dd7-abe4-3986a330a037" (UID: "b0d0be67-e739-4dd7-abe4-3986a330a037"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898204 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "b0d0be67-e739-4dd7-abe4-3986a330a037" (UID: "b0d0be67-e739-4dd7-abe4-3986a330a037"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898167 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-slash" (OuterVolumeSpecName: "host-slash") pod "b0d0be67-e739-4dd7-abe4-3986a330a037" (UID: "b0d0be67-e739-4dd7-abe4-3986a330a037"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898519 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0d0be67-e739-4dd7-abe4-3986a330a037-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "b0d0be67-e739-4dd7-abe4-3986a330a037" (UID: "b0d0be67-e739-4dd7-abe4-3986a330a037"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898562 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-var-lib-cni-networks-ovn-kubernetes\") pod \"b0d0be67-e739-4dd7-abe4-3986a330a037\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898597 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-run-ovn\") pod \"b0d0be67-e739-4dd7-abe4-3986a330a037\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898623 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-node-log\") pod \"b0d0be67-e739-4dd7-abe4-3986a330a037\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898646 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-run-netns\") pod \"b0d0be67-e739-4dd7-abe4-3986a330a037\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 
12:26:32.898674 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b0d0be67-e739-4dd7-abe4-3986a330a037-env-overrides\") pod \"b0d0be67-e739-4dd7-abe4-3986a330a037\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898695 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-etc-openvswitch\") pod \"b0d0be67-e739-4dd7-abe4-3986a330a037\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898717 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-run-openvswitch\") pod \"b0d0be67-e739-4dd7-abe4-3986a330a037\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898738 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b0d0be67-e739-4dd7-abe4-3986a330a037-ovn-node-metrics-cert\") pod \"b0d0be67-e739-4dd7-abe4-3986a330a037\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898674 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "b0d0be67-e739-4dd7-abe4-3986a330a037" (UID: "b0d0be67-e739-4dd7-abe4-3986a330a037"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898695 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "b0d0be67-e739-4dd7-abe4-3986a330a037" (UID: "b0d0be67-e739-4dd7-abe4-3986a330a037"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898721 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "b0d0be67-e739-4dd7-abe4-3986a330a037" (UID: "b0d0be67-e739-4dd7-abe4-3986a330a037"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898767 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-log-socket\") pod \"b0d0be67-e739-4dd7-abe4-3986a330a037\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898792 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-cni-netd\") pod \"b0d0be67-e739-4dd7-abe4-3986a330a037\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898816 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b0d0be67-e739-4dd7-abe4-3986a330a037-ovnkube-config\") pod 
\"b0d0be67-e739-4dd7-abe4-3986a330a037\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898839 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-var-lib-openvswitch\") pod \"b0d0be67-e739-4dd7-abe4-3986a330a037\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898859 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-systemd-units\") pod \"b0d0be67-e739-4dd7-abe4-3986a330a037\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898910 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-run-ovn-kubernetes\") pod \"b0d0be67-e739-4dd7-abe4-3986a330a037\" (UID: \"b0d0be67-e739-4dd7-abe4-3986a330a037\") " Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899004 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-env-overrides\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899033 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-systemd-units\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" Mar 18 12:26:32 
crc kubenswrapper[4975]: I0318 12:26:32.899055 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-host-cni-netd\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899090 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-var-lib-openvswitch\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899115 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-node-log\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899141 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-run-ovn\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899158 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-host-run-ovn-kubernetes\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" 
Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899177 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899204 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-ovnkube-script-lib\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899230 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-ovn-node-metrics-cert\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898739 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-node-log" (OuterVolumeSpecName: "node-log") pod "b0d0be67-e739-4dd7-abe4-3986a330a037" (UID: "b0d0be67-e739-4dd7-abe4-3986a330a037"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899251 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-log-socket\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898788 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "b0d0be67-e739-4dd7-abe4-3986a330a037" (UID: "b0d0be67-e739-4dd7-abe4-3986a330a037"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898806 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "b0d0be67-e739-4dd7-abe4-3986a330a037" (UID: "b0d0be67-e739-4dd7-abe4-3986a330a037"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.898832 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-log-socket" (OuterVolumeSpecName: "log-socket") pod "b0d0be67-e739-4dd7-abe4-3986a330a037" (UID: "b0d0be67-e739-4dd7-abe4-3986a330a037"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899011 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0d0be67-e739-4dd7-abe4-3986a330a037-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "b0d0be67-e739-4dd7-abe4-3986a330a037" (UID: "b0d0be67-e739-4dd7-abe4-3986a330a037"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899245 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0d0be67-e739-4dd7-abe4-3986a330a037-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "b0d0be67-e739-4dd7-abe4-3986a330a037" (UID: "b0d0be67-e739-4dd7-abe4-3986a330a037"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899297 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "b0d0be67-e739-4dd7-abe4-3986a330a037" (UID: "b0d0be67-e739-4dd7-abe4-3986a330a037"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899306 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "b0d0be67-e739-4dd7-abe4-3986a330a037" (UID: "b0d0be67-e739-4dd7-abe4-3986a330a037"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899334 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "b0d0be67-e739-4dd7-abe4-3986a330a037" (UID: "b0d0be67-e739-4dd7-abe4-3986a330a037"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899351 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "b0d0be67-e739-4dd7-abe4-3986a330a037" (UID: "b0d0be67-e739-4dd7-abe4-3986a330a037"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899384 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-run-openvswitch\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899443 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-host-cni-bin\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899518 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-ovnkube-config\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899542 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-host-run-netns\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899556 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hxj2\" (UniqueName: \"kubernetes.io/projected/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-kube-api-access-2hxj2\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899571 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-host-slash\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899584 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-etc-openvswitch\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899598 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-host-kubelet\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899610 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-run-systemd\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899692 4975 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-kubelet\") on node \"crc\" DevicePath \"\""
Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899702 4975 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-slash\") on node \"crc\" DevicePath \"\""
Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899710 4975 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-cni-bin\") on node \"crc\" DevicePath \"\""
Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899718 4975 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899729 4975 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899738 4975 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-node-log\") on node \"crc\" DevicePath \"\""
Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899748 4975 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-run-netns\") on node \"crc\" DevicePath \"\""
Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899759 4975 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b0d0be67-e739-4dd7-abe4-3986a330a037-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899770 4975 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899780 4975 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-run-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899791 4975 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-log-socket\") on node \"crc\" DevicePath \"\""
Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899801 4975 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-cni-netd\") on node \"crc\" DevicePath \"\""
Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899809 4975 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b0d0be67-e739-4dd7-abe4-3986a330a037-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899816 4975 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899824 4975 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-systemd-units\") on node \"crc\" DevicePath \"\""
Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899833 4975 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.899840 4975 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b0d0be67-e739-4dd7-abe4-3986a330a037-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.941465 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d0be67-e739-4dd7-abe4-3986a330a037-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "b0d0be67-e739-4dd7-abe4-3986a330a037" (UID: "b0d0be67-e739-4dd7-abe4-3986a330a037"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.942138 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0d0be67-e739-4dd7-abe4-3986a330a037-kube-api-access-766zq" (OuterVolumeSpecName: "kube-api-access-766zq") pod "b0d0be67-e739-4dd7-abe4-3986a330a037" (UID: "b0d0be67-e739-4dd7-abe4-3986a330a037"). InnerVolumeSpecName "kube-api-access-766zq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:26:32 crc kubenswrapper[4975]: I0318 12:26:32.942638 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "b0d0be67-e739-4dd7-abe4-3986a330a037" (UID: "b0d0be67-e739-4dd7-abe4-3986a330a037"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.000506 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-var-lib-openvswitch\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.000582 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-node-log\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.000614 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-run-ovn\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.000630 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-host-run-ovn-kubernetes\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.000628 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-var-lib-openvswitch\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.000687 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.000648 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.000735 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-node-log\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.000748 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-ovnkube-script-lib\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.000752 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-host-run-ovn-kubernetes\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.000941 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-ovn-node-metrics-cert\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.000972 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-log-socket\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001025 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-run-openvswitch\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.000762 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-run-ovn\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001059 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-host-cni-bin\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001088 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-ovnkube-config\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001127 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-host-run-netns\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001132 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-host-cni-bin\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001136 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-run-openvswitch\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001150 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hxj2\" (UniqueName: \"kubernetes.io/projected/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-kube-api-access-2hxj2\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001090 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-log-socket\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001270 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-host-slash\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001295 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-etc-openvswitch\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001293 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-host-run-netns\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001350 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-host-slash\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001334 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-host-kubelet\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001315 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-host-kubelet\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001376 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-etc-openvswitch\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001407 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-run-systemd\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001462 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-env-overrides\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001494 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-systemd-units\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001523 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-host-cni-netd\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001537 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-run-systemd\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001575 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-systemd-units\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001608 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-ovnkube-script-lib\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001635 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-ovnkube-config\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001666 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-host-cni-netd\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001690 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-766zq\" (UniqueName: \"kubernetes.io/projected/b0d0be67-e739-4dd7-abe4-3986a330a037-kube-api-access-766zq\") on node \"crc\" DevicePath \"\""
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001713 4975 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b0d0be67-e739-4dd7-abe4-3986a330a037-run-systemd\") on node \"crc\" DevicePath \"\""
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.001729 4975 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b0d0be67-e739-4dd7-abe4-3986a330a037-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.002016 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-env-overrides\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.005626 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-ovn-node-metrics-cert\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.018167 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hxj2\" (UniqueName: \"kubernetes.io/projected/851626a9-c0b6-49cb-bdde-b4c9bf9fd549-kube-api-access-2hxj2\") pod \"ovnkube-node-c2ns7\" (UID: \"851626a9-c0b6-49cb-bdde-b4c9bf9fd549\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.159312 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7"
Mar 18 12:26:33 crc kubenswrapper[4975]: W0318 12:26:33.180061 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod851626a9_c0b6_49cb_bdde_b4c9bf9fd549.slice/crio-aac8c51f2ef18a379ef0caa4efb9cabb72f7ec24f39d590d9af94698b27f386e WatchSource:0}: Error finding container aac8c51f2ef18a379ef0caa4efb9cabb72f7ec24f39d590d9af94698b27f386e: Status 404 returned error can't find the container with id aac8c51f2ef18a379ef0caa4efb9cabb72f7ec24f39d590d9af94698b27f386e
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.256067 4975 scope.go:117] "RemoveContainer" containerID="d7d45fb4a14c0d0d6fa25693a5c2b13c06796bd1711cb01eb9e5617914f2f6a1"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.271315 4975 scope.go:117] "RemoveContainer" containerID="0a61a1711ca92e59c183fdce16eac480765167b6069a15103cdb61de7f8812cd"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.286411 4975 scope.go:117] "RemoveContainer" containerID="de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.299992 4975 scope.go:117] "RemoveContainer" containerID="a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.316290 4975 scope.go:117] "RemoveContainer" containerID="14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.329115 4975 scope.go:117] "RemoveContainer" containerID="11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.395061 4975 scope.go:117] "RemoveContainer" containerID="64bfccf1601bad083f03d9479444fc0849400928f6731cd8fe2da8a6e1921cd2"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.407216 4975 scope.go:117] "RemoveContainer" containerID="9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.431753 4975 scope.go:117] "RemoveContainer" containerID="4b60948cdf8aeeaba6d945a9b8b29367eb4aef2312b6fc54bc0322aa6690b760"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.456079 4975 scope.go:117] "RemoveContainer" containerID="d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.467102 4975 scope.go:117] "RemoveContainer" containerID="3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.478949 4975 scope.go:117] "RemoveContainer" containerID="d611e36e7ad2ddcf4a3f09210f1890afd7f8254f6b0f716ab6236e7c786d485d"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.503160 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerDied","Data":"0a61a1711ca92e59c183fdce16eac480765167b6069a15103cdb61de7f8812cd"}
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.503203 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerDied","Data":"d08d9043071e58e42c7cf84cd1fc2e2e4564f57d2917ead3b3666e6d1f5c0eb9"}
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.503215 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerDied","Data":"3ddc1d99818d5aff3f5b5380df5525dcacb29c9ef2f8ae05aca904beea8cb3a2"}
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.503226 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerDied","Data":"14981498d976da66ec7761c6a17fcae86beb89a58241cae6168c4f13b7e89d0e"}
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.503235 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerDied","Data":"11b2c1aea0a8d5f393503f5a8b9af44cd4e332eb479d40a8ef510efe268cee2d"}
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.503244 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerDied","Data":"a02263769d6798f7a9c7d87549a0befa3411a16639b2c46bf1d1056762488337"}
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.503255 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerDied","Data":"9a39eb6a8833f65d3b0d00acb894a9377d10863f860f7678c27d3c758cd2b731"}
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.503264 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerDied","Data":"de01e01ec6d32cf50684c57a091f5123228b8aa4df22a2a17b63824da197f41c"}
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.503274 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h" event={"ID":"b0d0be67-e739-4dd7-abe4-3986a330a037","Type":"ContainerDied","Data":"cfd29c8d9ccd4721705fd54efca524b74b6bd018512e056aa0085552d847e87b"}
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.503241 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k8v6h"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.504258 4975 generic.go:334] "Generic (PLEG): container finished" podID="851626a9-c0b6-49cb-bdde-b4c9bf9fd549" containerID="a4dc89b0e9ecd5a91280b0d0dc1e8d011e16b1060aa0f65c748effbce2677cd0" exitCode=0
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.504318 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" event={"ID":"851626a9-c0b6-49cb-bdde-b4c9bf9fd549","Type":"ContainerDied","Data":"a4dc89b0e9ecd5a91280b0d0dc1e8d011e16b1060aa0f65c748effbce2677cd0"}
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.504338 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" event={"ID":"851626a9-c0b6-49cb-bdde-b4c9bf9fd549","Type":"ContainerStarted","Data":"aac8c51f2ef18a379ef0caa4efb9cabb72f7ec24f39d590d9af94698b27f386e"}
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.507644 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n9j7f_add6c8de-77cd-42e7-bf06-d2333b9392ea/kube-multus/2.log"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.508277 4975 generic.go:334] "Generic (PLEG): container finished" podID="add6c8de-77cd-42e7-bf06-d2333b9392ea" containerID="eec94e160170d4702c17967a65a0f9bb6acd952d34ba3dcb551e0afebc06d098" exitCode=2
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.508303 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n9j7f" event={"ID":"add6c8de-77cd-42e7-bf06-d2333b9392ea","Type":"ContainerDied","Data":"eec94e160170d4702c17967a65a0f9bb6acd952d34ba3dcb551e0afebc06d098"}
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.508339 4975 scope.go:117] "RemoveContainer" containerID="d611e36e7ad2ddcf4a3f09210f1890afd7f8254f6b0f716ab6236e7c786d485d"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.508774 4975 scope.go:117] "RemoveContainer" containerID="eec94e160170d4702c17967a65a0f9bb6acd952d34ba3dcb551e0afebc06d098"
Mar 18 12:26:33 crc kubenswrapper[4975]: E0318 12:26:33.519514 4975 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_kube-multus_multus-n9j7f_openshift-multus_add6c8de-77cd-42e7-bf06-d2333b9392ea_1 in pod sandbox 6d2740b15dd1c34cc07cf6229c51aee82307e2c86c1d4a10282802758a012f31 from index: no such id: 'd611e36e7ad2ddcf4a3f09210f1890afd7f8254f6b0f716ab6236e7c786d485d'" containerID="d611e36e7ad2ddcf4a3f09210f1890afd7f8254f6b0f716ab6236e7c786d485d"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.519561 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d611e36e7ad2ddcf4a3f09210f1890afd7f8254f6b0f716ab6236e7c786d485d"} err="rpc error: code = Unknown desc = failed to delete container k8s_kube-multus_multus-n9j7f_openshift-multus_add6c8de-77cd-42e7-bf06-d2333b9392ea_1 in pod sandbox 6d2740b15dd1c34cc07cf6229c51aee82307e2c86c1d4a10282802758a012f31 from index: no such id: 'd611e36e7ad2ddcf4a3f09210f1890afd7f8254f6b0f716ab6236e7c786d485d'"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.583924 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-k8v6h"]
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.590712 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-k8v6h"]
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.895449 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ntrck"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.895644 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ntrck"
Mar 18 12:26:33 crc kubenswrapper[4975]: I0318 12:26:33.953472 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ntrck"
Mar 18 12:26:34 crc kubenswrapper[4975]: I0318 12:26:34.519592 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" event={"ID":"851626a9-c0b6-49cb-bdde-b4c9bf9fd549","Type":"ContainerStarted","Data":"7cb1bf1dc1687c288a4c57456b4687f6211108e21baae0fb6b920469871cdc16"}
Mar 18 12:26:34 crc kubenswrapper[4975]: I0318 12:26:34.519888 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" event={"ID":"851626a9-c0b6-49cb-bdde-b4c9bf9fd549","Type":"ContainerStarted","Data":"c854564656ecf0f6445d3b0d71f194e3a918cb887198a58ccde567cddca405f5"}
Mar 18 12:26:34 crc kubenswrapper[4975]: I0318 12:26:34.519899 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" event={"ID":"851626a9-c0b6-49cb-bdde-b4c9bf9fd549","Type":"ContainerStarted","Data":"a40482dbd0c65c4fa4863e27a12f10ce0906894c9ea997a30cd3401cfde1c1ff"}
Mar 18 12:26:34 crc kubenswrapper[4975]: I0318 12:26:34.519911 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" event={"ID":"851626a9-c0b6-49cb-bdde-b4c9bf9fd549","Type":"ContainerStarted","Data":"9932f092a18f9025d8d0aa612ff3c092abb48bc1a336ae0bedab56a86c997853"}
Mar 18 12:26:34 crc kubenswrapper[4975]: I0318 12:26:34.519922 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" event={"ID":"851626a9-c0b6-49cb-bdde-b4c9bf9fd549","Type":"ContainerStarted","Data":"d1e6b066898f77952adadfdefc4277ed79e800a10fb2a0012c139a8bf8424491"}
Mar 18 12:26:34 crc kubenswrapper[4975]: I0318 12:26:34.519932 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" event={"ID":"851626a9-c0b6-49cb-bdde-b4c9bf9fd549","Type":"ContainerStarted","Data":"1fdf2e4da613211cdc90d77324d15d44efe4e27081de30c2a14c3ba02c4e56e2"}
Mar 18 12:26:34 crc kubenswrapper[4975]: I0318 12:26:34.521012 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n9j7f_add6c8de-77cd-42e7-bf06-d2333b9392ea/kube-multus/2.log"
Mar 18 12:26:34 crc kubenswrapper[4975]: I0318 12:26:34.521115 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n9j7f" event={"ID":"add6c8de-77cd-42e7-bf06-d2333b9392ea","Type":"ContainerStarted","Data":"f7f11aec2c29f9365c2ed881e4db8d30ae7dbdfafb6b6342dcd81e123e198b7b"}
Mar 18 12:26:34 crc kubenswrapper[4975]: I0318 12:26:34.562051 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ntrck"
Mar 18 12:26:34 crc kubenswrapper[4975]: I0318 12:26:34.610129 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ntrck"]
Mar 18 12:26:35 crc kubenswrapper[4975]: I0318 12:26:35.024157 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0d0be67-e739-4dd7-abe4-3986a330a037" path="/var/lib/kubelet/pods/b0d0be67-e739-4dd7-abe4-3986a330a037/volumes"
Mar 18 12:26:36 crc kubenswrapper[4975]: I0318 12:26:36.534882 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" event={"ID":"851626a9-c0b6-49cb-bdde-b4c9bf9fd549","Type":"ContainerStarted","Data":"76121072af8a877f0d90fe8d4ada1bc68934174e94de972e61da3f28f3de5945"}
Mar 18 12:26:36 crc kubenswrapper[4975]: I0318 12:26:36.535022 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ntrck" podUID="306f66c7-c11f-455a-a5e3-f89260f4f4a1" containerName="registry-server" containerID="cri-o://15ab24ffb72649ba14457b9cd2fbc12f27b52db73d1596d15b412632bd79166c" gracePeriod=2
Mar 18 12:26:37 crc kubenswrapper[4975]: I0318 12:26:37.231789 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ntrck"
Mar 18 12:26:37 crc kubenswrapper[4975]: I0318 12:26:37.355564 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/306f66c7-c11f-455a-a5e3-f89260f4f4a1-catalog-content\") pod \"306f66c7-c11f-455a-a5e3-f89260f4f4a1\" (UID: \"306f66c7-c11f-455a-a5e3-f89260f4f4a1\") "
Mar 18 12:26:37 crc kubenswrapper[4975]: I0318 12:26:37.355617 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69bfv\" (UniqueName: \"kubernetes.io/projected/306f66c7-c11f-455a-a5e3-f89260f4f4a1-kube-api-access-69bfv\") pod \"306f66c7-c11f-455a-a5e3-f89260f4f4a1\" (UID: \"306f66c7-c11f-455a-a5e3-f89260f4f4a1\") "
Mar 18 12:26:37 crc kubenswrapper[4975]: I0318 12:26:37.355654 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/306f66c7-c11f-455a-a5e3-f89260f4f4a1-utilities\") pod \"306f66c7-c11f-455a-a5e3-f89260f4f4a1\" (UID: \"306f66c7-c11f-455a-a5e3-f89260f4f4a1\") "
Mar 18 12:26:37 crc kubenswrapper[4975]: I0318 12:26:37.356742 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/306f66c7-c11f-455a-a5e3-f89260f4f4a1-utilities" (OuterVolumeSpecName: "utilities") pod "306f66c7-c11f-455a-a5e3-f89260f4f4a1" (UID: "306f66c7-c11f-455a-a5e3-f89260f4f4a1"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:26:37 crc kubenswrapper[4975]: I0318 12:26:37.360824 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/306f66c7-c11f-455a-a5e3-f89260f4f4a1-kube-api-access-69bfv" (OuterVolumeSpecName: "kube-api-access-69bfv") pod "306f66c7-c11f-455a-a5e3-f89260f4f4a1" (UID: "306f66c7-c11f-455a-a5e3-f89260f4f4a1"). InnerVolumeSpecName "kube-api-access-69bfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:26:37 crc kubenswrapper[4975]: I0318 12:26:37.411444 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/306f66c7-c11f-455a-a5e3-f89260f4f4a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "306f66c7-c11f-455a-a5e3-f89260f4f4a1" (UID: "306f66c7-c11f-455a-a5e3-f89260f4f4a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:26:37 crc kubenswrapper[4975]: I0318 12:26:37.457418 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/306f66c7-c11f-455a-a5e3-f89260f4f4a1-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:37 crc kubenswrapper[4975]: I0318 12:26:37.457827 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/306f66c7-c11f-455a-a5e3-f89260f4f4a1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:37 crc kubenswrapper[4975]: I0318 12:26:37.457949 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69bfv\" (UniqueName: \"kubernetes.io/projected/306f66c7-c11f-455a-a5e3-f89260f4f4a1-kube-api-access-69bfv\") on node \"crc\" DevicePath \"\"" Mar 18 12:26:37 crc kubenswrapper[4975]: I0318 12:26:37.542141 4975 generic.go:334] "Generic (PLEG): container finished" podID="306f66c7-c11f-455a-a5e3-f89260f4f4a1" 
containerID="15ab24ffb72649ba14457b9cd2fbc12f27b52db73d1596d15b412632bd79166c" exitCode=0 Mar 18 12:26:37 crc kubenswrapper[4975]: I0318 12:26:37.542202 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntrck" event={"ID":"306f66c7-c11f-455a-a5e3-f89260f4f4a1","Type":"ContainerDied","Data":"15ab24ffb72649ba14457b9cd2fbc12f27b52db73d1596d15b412632bd79166c"} Mar 18 12:26:37 crc kubenswrapper[4975]: I0318 12:26:37.542966 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntrck" event={"ID":"306f66c7-c11f-455a-a5e3-f89260f4f4a1","Type":"ContainerDied","Data":"00f49b51d978ee0d54fadbc64e28f1a9c3df24510aed81a3f5028c8aca9e3886"} Mar 18 12:26:37 crc kubenswrapper[4975]: I0318 12:26:37.542990 4975 scope.go:117] "RemoveContainer" containerID="15ab24ffb72649ba14457b9cd2fbc12f27b52db73d1596d15b412632bd79166c" Mar 18 12:26:37 crc kubenswrapper[4975]: I0318 12:26:37.542264 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ntrck" Mar 18 12:26:37 crc kubenswrapper[4975]: I0318 12:26:37.563999 4975 scope.go:117] "RemoveContainer" containerID="a6041ad67704f5189ad52577418ccd6235571488b47188cdbe9e4662a9cc5f73" Mar 18 12:26:37 crc kubenswrapper[4975]: I0318 12:26:37.575642 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ntrck"] Mar 18 12:26:37 crc kubenswrapper[4975]: I0318 12:26:37.586557 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ntrck"] Mar 18 12:26:37 crc kubenswrapper[4975]: I0318 12:26:37.597298 4975 scope.go:117] "RemoveContainer" containerID="434ee406b4ab57949f12b290cf0a28549ce7214b1f0bb0c3fcff832213bf3278" Mar 18 12:26:37 crc kubenswrapper[4975]: I0318 12:26:37.615074 4975 scope.go:117] "RemoveContainer" containerID="15ab24ffb72649ba14457b9cd2fbc12f27b52db73d1596d15b412632bd79166c" Mar 18 12:26:37 crc kubenswrapper[4975]: E0318 12:26:37.615644 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15ab24ffb72649ba14457b9cd2fbc12f27b52db73d1596d15b412632bd79166c\": container with ID starting with 15ab24ffb72649ba14457b9cd2fbc12f27b52db73d1596d15b412632bd79166c not found: ID does not exist" containerID="15ab24ffb72649ba14457b9cd2fbc12f27b52db73d1596d15b412632bd79166c" Mar 18 12:26:37 crc kubenswrapper[4975]: I0318 12:26:37.615676 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15ab24ffb72649ba14457b9cd2fbc12f27b52db73d1596d15b412632bd79166c"} err="failed to get container status \"15ab24ffb72649ba14457b9cd2fbc12f27b52db73d1596d15b412632bd79166c\": rpc error: code = NotFound desc = could not find container \"15ab24ffb72649ba14457b9cd2fbc12f27b52db73d1596d15b412632bd79166c\": container with ID starting with 15ab24ffb72649ba14457b9cd2fbc12f27b52db73d1596d15b412632bd79166c not 
found: ID does not exist" Mar 18 12:26:37 crc kubenswrapper[4975]: I0318 12:26:37.615700 4975 scope.go:117] "RemoveContainer" containerID="a6041ad67704f5189ad52577418ccd6235571488b47188cdbe9e4662a9cc5f73" Mar 18 12:26:37 crc kubenswrapper[4975]: E0318 12:26:37.616098 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6041ad67704f5189ad52577418ccd6235571488b47188cdbe9e4662a9cc5f73\": container with ID starting with a6041ad67704f5189ad52577418ccd6235571488b47188cdbe9e4662a9cc5f73 not found: ID does not exist" containerID="a6041ad67704f5189ad52577418ccd6235571488b47188cdbe9e4662a9cc5f73" Mar 18 12:26:37 crc kubenswrapper[4975]: I0318 12:26:37.616268 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6041ad67704f5189ad52577418ccd6235571488b47188cdbe9e4662a9cc5f73"} err="failed to get container status \"a6041ad67704f5189ad52577418ccd6235571488b47188cdbe9e4662a9cc5f73\": rpc error: code = NotFound desc = could not find container \"a6041ad67704f5189ad52577418ccd6235571488b47188cdbe9e4662a9cc5f73\": container with ID starting with a6041ad67704f5189ad52577418ccd6235571488b47188cdbe9e4662a9cc5f73 not found: ID does not exist" Mar 18 12:26:37 crc kubenswrapper[4975]: I0318 12:26:37.616361 4975 scope.go:117] "RemoveContainer" containerID="434ee406b4ab57949f12b290cf0a28549ce7214b1f0bb0c3fcff832213bf3278" Mar 18 12:26:37 crc kubenswrapper[4975]: E0318 12:26:37.616664 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"434ee406b4ab57949f12b290cf0a28549ce7214b1f0bb0c3fcff832213bf3278\": container with ID starting with 434ee406b4ab57949f12b290cf0a28549ce7214b1f0bb0c3fcff832213bf3278 not found: ID does not exist" containerID="434ee406b4ab57949f12b290cf0a28549ce7214b1f0bb0c3fcff832213bf3278" Mar 18 12:26:37 crc kubenswrapper[4975]: I0318 12:26:37.616685 4975 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"434ee406b4ab57949f12b290cf0a28549ce7214b1f0bb0c3fcff832213bf3278"} err="failed to get container status \"434ee406b4ab57949f12b290cf0a28549ce7214b1f0bb0c3fcff832213bf3278\": rpc error: code = NotFound desc = could not find container \"434ee406b4ab57949f12b290cf0a28549ce7214b1f0bb0c3fcff832213bf3278\": container with ID starting with 434ee406b4ab57949f12b290cf0a28549ce7214b1f0bb0c3fcff832213bf3278 not found: ID does not exist" Mar 18 12:26:37 crc kubenswrapper[4975]: I0318 12:26:37.738047 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-d8f4x" Mar 18 12:26:39 crc kubenswrapper[4975]: I0318 12:26:39.025679 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="306f66c7-c11f-455a-a5e3-f89260f4f4a1" path="/var/lib/kubelet/pods/306f66c7-c11f-455a-a5e3-f89260f4f4a1/volumes" Mar 18 12:26:39 crc kubenswrapper[4975]: I0318 12:26:39.557823 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" event={"ID":"851626a9-c0b6-49cb-bdde-b4c9bf9fd549","Type":"ContainerStarted","Data":"d71745ff999af67069f67734bbf149c3f8e6744797a517deac907164def58a9e"} Mar 18 12:26:39 crc kubenswrapper[4975]: I0318 12:26:39.558162 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" Mar 18 12:26:39 crc kubenswrapper[4975]: I0318 12:26:39.558224 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" Mar 18 12:26:39 crc kubenswrapper[4975]: I0318 12:26:39.558234 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" Mar 18 12:26:39 crc kubenswrapper[4975]: I0318 12:26:39.583566 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" Mar 18 12:26:39 crc kubenswrapper[4975]: I0318 12:26:39.584160 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" podStartSLOduration=7.584142072 podStartE2EDuration="7.584142072s" podCreationTimestamp="2026-03-18 12:26:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:26:39.583624768 +0000 UTC m=+985.298025347" watchObservedRunningTime="2026-03-18 12:26:39.584142072 +0000 UTC m=+985.298542651" Mar 18 12:26:39 crc kubenswrapper[4975]: I0318 12:26:39.584345 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" Mar 18 12:26:55 crc kubenswrapper[4975]: I0318 12:26:55.539080 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:26:55 crc kubenswrapper[4975]: I0318 12:26:55.539739 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:26:55 crc kubenswrapper[4975]: I0318 12:26:55.539795 4975 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 12:26:55 crc kubenswrapper[4975]: I0318 12:26:55.651441 4975 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"142b541a7de05fd46269c85cd3392d764eb097aeaa954b82530c1118b45a05b8"} pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:26:55 crc kubenswrapper[4975]: I0318 12:26:55.651539 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" containerID="cri-o://142b541a7de05fd46269c85cd3392d764eb097aeaa954b82530c1118b45a05b8" gracePeriod=600 Mar 18 12:26:55 crc kubenswrapper[4975]: I0318 12:26:55.834277 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5q8dg"] Mar 18 12:26:55 crc kubenswrapper[4975]: E0318 12:26:55.834696 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306f66c7-c11f-455a-a5e3-f89260f4f4a1" containerName="extract-utilities" Mar 18 12:26:55 crc kubenswrapper[4975]: I0318 12:26:55.834799 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="306f66c7-c11f-455a-a5e3-f89260f4f4a1" containerName="extract-utilities" Mar 18 12:26:55 crc kubenswrapper[4975]: E0318 12:26:55.834901 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306f66c7-c11f-455a-a5e3-f89260f4f4a1" containerName="registry-server" Mar 18 12:26:55 crc kubenswrapper[4975]: I0318 12:26:55.834967 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="306f66c7-c11f-455a-a5e3-f89260f4f4a1" containerName="registry-server" Mar 18 12:26:55 crc kubenswrapper[4975]: E0318 12:26:55.835028 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306f66c7-c11f-455a-a5e3-f89260f4f4a1" containerName="extract-content" Mar 18 12:26:55 crc kubenswrapper[4975]: I0318 12:26:55.835076 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="306f66c7-c11f-455a-a5e3-f89260f4f4a1" 
containerName="extract-content" Mar 18 12:26:55 crc kubenswrapper[4975]: I0318 12:26:55.835225 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="306f66c7-c11f-455a-a5e3-f89260f4f4a1" containerName="registry-server" Mar 18 12:26:55 crc kubenswrapper[4975]: I0318 12:26:55.836012 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5q8dg" Mar 18 12:26:55 crc kubenswrapper[4975]: I0318 12:26:55.840710 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5q8dg"] Mar 18 12:26:55 crc kubenswrapper[4975]: I0318 12:26:55.983918 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90b4ad88-9275-466c-b56e-ddc8e192fb67-utilities\") pod \"redhat-operators-5q8dg\" (UID: \"90b4ad88-9275-466c-b56e-ddc8e192fb67\") " pod="openshift-marketplace/redhat-operators-5q8dg" Mar 18 12:26:55 crc kubenswrapper[4975]: I0318 12:26:55.984335 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrxbh\" (UniqueName: \"kubernetes.io/projected/90b4ad88-9275-466c-b56e-ddc8e192fb67-kube-api-access-vrxbh\") pod \"redhat-operators-5q8dg\" (UID: \"90b4ad88-9275-466c-b56e-ddc8e192fb67\") " pod="openshift-marketplace/redhat-operators-5q8dg" Mar 18 12:26:55 crc kubenswrapper[4975]: I0318 12:26:55.984395 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90b4ad88-9275-466c-b56e-ddc8e192fb67-catalog-content\") pod \"redhat-operators-5q8dg\" (UID: \"90b4ad88-9275-466c-b56e-ddc8e192fb67\") " pod="openshift-marketplace/redhat-operators-5q8dg" Mar 18 12:26:56 crc kubenswrapper[4975]: I0318 12:26:56.085241 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/90b4ad88-9275-466c-b56e-ddc8e192fb67-utilities\") pod \"redhat-operators-5q8dg\" (UID: \"90b4ad88-9275-466c-b56e-ddc8e192fb67\") " pod="openshift-marketplace/redhat-operators-5q8dg" Mar 18 12:26:56 crc kubenswrapper[4975]: I0318 12:26:56.085329 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrxbh\" (UniqueName: \"kubernetes.io/projected/90b4ad88-9275-466c-b56e-ddc8e192fb67-kube-api-access-vrxbh\") pod \"redhat-operators-5q8dg\" (UID: \"90b4ad88-9275-466c-b56e-ddc8e192fb67\") " pod="openshift-marketplace/redhat-operators-5q8dg" Mar 18 12:26:56 crc kubenswrapper[4975]: I0318 12:26:56.085378 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90b4ad88-9275-466c-b56e-ddc8e192fb67-catalog-content\") pod \"redhat-operators-5q8dg\" (UID: \"90b4ad88-9275-466c-b56e-ddc8e192fb67\") " pod="openshift-marketplace/redhat-operators-5q8dg" Mar 18 12:26:56 crc kubenswrapper[4975]: I0318 12:26:56.085897 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90b4ad88-9275-466c-b56e-ddc8e192fb67-utilities\") pod \"redhat-operators-5q8dg\" (UID: \"90b4ad88-9275-466c-b56e-ddc8e192fb67\") " pod="openshift-marketplace/redhat-operators-5q8dg" Mar 18 12:26:56 crc kubenswrapper[4975]: I0318 12:26:56.085909 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90b4ad88-9275-466c-b56e-ddc8e192fb67-catalog-content\") pod \"redhat-operators-5q8dg\" (UID: \"90b4ad88-9275-466c-b56e-ddc8e192fb67\") " pod="openshift-marketplace/redhat-operators-5q8dg" Mar 18 12:26:56 crc kubenswrapper[4975]: I0318 12:26:56.121260 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrxbh\" (UniqueName: 
\"kubernetes.io/projected/90b4ad88-9275-466c-b56e-ddc8e192fb67-kube-api-access-vrxbh\") pod \"redhat-operators-5q8dg\" (UID: \"90b4ad88-9275-466c-b56e-ddc8e192fb67\") " pod="openshift-marketplace/redhat-operators-5q8dg" Mar 18 12:26:56 crc kubenswrapper[4975]: I0318 12:26:56.159810 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5q8dg" Mar 18 12:26:56 crc kubenswrapper[4975]: I0318 12:26:56.571784 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5q8dg"] Mar 18 12:26:56 crc kubenswrapper[4975]: I0318 12:26:56.656345 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5q8dg" event={"ID":"90b4ad88-9275-466c-b56e-ddc8e192fb67","Type":"ContainerStarted","Data":"f994c40c3de5b7693446cdc5ac07dbf198afdceeaee59f43814b735beea2ed2e"} Mar 18 12:26:56 crc kubenswrapper[4975]: I0318 12:26:56.662166 4975 generic.go:334] "Generic (PLEG): container finished" podID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerID="142b541a7de05fd46269c85cd3392d764eb097aeaa954b82530c1118b45a05b8" exitCode=0 Mar 18 12:26:56 crc kubenswrapper[4975]: I0318 12:26:56.662213 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerDied","Data":"142b541a7de05fd46269c85cd3392d764eb097aeaa954b82530c1118b45a05b8"} Mar 18 12:26:56 crc kubenswrapper[4975]: I0318 12:26:56.662240 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerStarted","Data":"b3ad49f3300a39909733b143700abc28ad83ea2ad2f5fc6a9b69e95819adb98f"} Mar 18 12:26:56 crc kubenswrapper[4975]: I0318 12:26:56.662261 4975 scope.go:117] "RemoveContainer" 
containerID="652d3462a6e10a47d996fdbb6d3a4cc821e9e3b750eb8fdb8c2f0cb2935587d4" Mar 18 12:26:57 crc kubenswrapper[4975]: I0318 12:26:57.669617 4975 generic.go:334] "Generic (PLEG): container finished" podID="90b4ad88-9275-466c-b56e-ddc8e192fb67" containerID="f1202af3daf611227f493b038ecefb5cae109e67e386a5a59b6eb992e4d614ce" exitCode=0 Mar 18 12:26:57 crc kubenswrapper[4975]: I0318 12:26:57.669724 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5q8dg" event={"ID":"90b4ad88-9275-466c-b56e-ddc8e192fb67","Type":"ContainerDied","Data":"f1202af3daf611227f493b038ecefb5cae109e67e386a5a59b6eb992e4d614ce"} Mar 18 12:26:58 crc kubenswrapper[4975]: I0318 12:26:58.681836 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5q8dg" event={"ID":"90b4ad88-9275-466c-b56e-ddc8e192fb67","Type":"ContainerStarted","Data":"5cbf3aebda0dc46e61f76fa6d253b43f05499579c9ca8e5be40b691524be391c"} Mar 18 12:27:00 crc kubenswrapper[4975]: I0318 12:27:00.692677 4975 generic.go:334] "Generic (PLEG): container finished" podID="90b4ad88-9275-466c-b56e-ddc8e192fb67" containerID="5cbf3aebda0dc46e61f76fa6d253b43f05499579c9ca8e5be40b691524be391c" exitCode=0 Mar 18 12:27:00 crc kubenswrapper[4975]: I0318 12:27:00.692759 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5q8dg" event={"ID":"90b4ad88-9275-466c-b56e-ddc8e192fb67","Type":"ContainerDied","Data":"5cbf3aebda0dc46e61f76fa6d253b43f05499579c9ca8e5be40b691524be391c"} Mar 18 12:27:01 crc kubenswrapper[4975]: I0318 12:27:01.013470 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p9b9f"] Mar 18 12:27:01 crc kubenswrapper[4975]: I0318 12:27:01.019634 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p9b9f" Mar 18 12:27:01 crc kubenswrapper[4975]: I0318 12:27:01.038322 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p9b9f"] Mar 18 12:27:01 crc kubenswrapper[4975]: I0318 12:27:01.142436 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec77de6e-b5ed-410a-ae5d-a09c1305779a-utilities\") pod \"certified-operators-p9b9f\" (UID: \"ec77de6e-b5ed-410a-ae5d-a09c1305779a\") " pod="openshift-marketplace/certified-operators-p9b9f" Mar 18 12:27:01 crc kubenswrapper[4975]: I0318 12:27:01.142480 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6qrq\" (UniqueName: \"kubernetes.io/projected/ec77de6e-b5ed-410a-ae5d-a09c1305779a-kube-api-access-m6qrq\") pod \"certified-operators-p9b9f\" (UID: \"ec77de6e-b5ed-410a-ae5d-a09c1305779a\") " pod="openshift-marketplace/certified-operators-p9b9f" Mar 18 12:27:01 crc kubenswrapper[4975]: I0318 12:27:01.142568 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec77de6e-b5ed-410a-ae5d-a09c1305779a-catalog-content\") pod \"certified-operators-p9b9f\" (UID: \"ec77de6e-b5ed-410a-ae5d-a09c1305779a\") " pod="openshift-marketplace/certified-operators-p9b9f" Mar 18 12:27:01 crc kubenswrapper[4975]: I0318 12:27:01.244231 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec77de6e-b5ed-410a-ae5d-a09c1305779a-utilities\") pod \"certified-operators-p9b9f\" (UID: \"ec77de6e-b5ed-410a-ae5d-a09c1305779a\") " pod="openshift-marketplace/certified-operators-p9b9f" Mar 18 12:27:01 crc kubenswrapper[4975]: I0318 12:27:01.244276 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m6qrq\" (UniqueName: \"kubernetes.io/projected/ec77de6e-b5ed-410a-ae5d-a09c1305779a-kube-api-access-m6qrq\") pod \"certified-operators-p9b9f\" (UID: \"ec77de6e-b5ed-410a-ae5d-a09c1305779a\") " pod="openshift-marketplace/certified-operators-p9b9f" Mar 18 12:27:01 crc kubenswrapper[4975]: I0318 12:27:01.244321 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec77de6e-b5ed-410a-ae5d-a09c1305779a-catalog-content\") pod \"certified-operators-p9b9f\" (UID: \"ec77de6e-b5ed-410a-ae5d-a09c1305779a\") " pod="openshift-marketplace/certified-operators-p9b9f" Mar 18 12:27:01 crc kubenswrapper[4975]: I0318 12:27:01.245009 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec77de6e-b5ed-410a-ae5d-a09c1305779a-utilities\") pod \"certified-operators-p9b9f\" (UID: \"ec77de6e-b5ed-410a-ae5d-a09c1305779a\") " pod="openshift-marketplace/certified-operators-p9b9f" Mar 18 12:27:01 crc kubenswrapper[4975]: I0318 12:27:01.245036 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec77de6e-b5ed-410a-ae5d-a09c1305779a-catalog-content\") pod \"certified-operators-p9b9f\" (UID: \"ec77de6e-b5ed-410a-ae5d-a09c1305779a\") " pod="openshift-marketplace/certified-operators-p9b9f" Mar 18 12:27:01 crc kubenswrapper[4975]: I0318 12:27:01.263892 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6qrq\" (UniqueName: \"kubernetes.io/projected/ec77de6e-b5ed-410a-ae5d-a09c1305779a-kube-api-access-m6qrq\") pod \"certified-operators-p9b9f\" (UID: \"ec77de6e-b5ed-410a-ae5d-a09c1305779a\") " pod="openshift-marketplace/certified-operators-p9b9f" Mar 18 12:27:01 crc kubenswrapper[4975]: I0318 12:27:01.348420 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p9b9f" Mar 18 12:27:01 crc kubenswrapper[4975]: I0318 12:27:01.704204 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5q8dg" event={"ID":"90b4ad88-9275-466c-b56e-ddc8e192fb67","Type":"ContainerStarted","Data":"09d98a4a362d7d12d8e7b69d4866fd4debe7681eb75cd5b1d78a8630c950864f"} Mar 18 12:27:01 crc kubenswrapper[4975]: I0318 12:27:01.734213 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5q8dg" podStartSLOduration=3.215042596 podStartE2EDuration="6.734193334s" podCreationTimestamp="2026-03-18 12:26:55 +0000 UTC" firstStartedPulling="2026-03-18 12:26:57.672819968 +0000 UTC m=+1003.387220547" lastFinishedPulling="2026-03-18 12:27:01.191970716 +0000 UTC m=+1006.906371285" observedRunningTime="2026-03-18 12:27:01.730150173 +0000 UTC m=+1007.444550772" watchObservedRunningTime="2026-03-18 12:27:01.734193334 +0000 UTC m=+1007.448593923" Mar 18 12:27:01 crc kubenswrapper[4975]: I0318 12:27:01.857162 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p9b9f"] Mar 18 12:27:02 crc kubenswrapper[4975]: I0318 12:27:02.709810 4975 generic.go:334] "Generic (PLEG): container finished" podID="ec77de6e-b5ed-410a-ae5d-a09c1305779a" containerID="5fe71a568ee463784bf03d807011618ce4d257c2e070a038be79fbf297c82794" exitCode=0 Mar 18 12:27:02 crc kubenswrapper[4975]: I0318 12:27:02.709888 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p9b9f" event={"ID":"ec77de6e-b5ed-410a-ae5d-a09c1305779a","Type":"ContainerDied","Data":"5fe71a568ee463784bf03d807011618ce4d257c2e070a038be79fbf297c82794"} Mar 18 12:27:02 crc kubenswrapper[4975]: I0318 12:27:02.709921 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p9b9f" 
event={"ID":"ec77de6e-b5ed-410a-ae5d-a09c1305779a","Type":"ContainerStarted","Data":"99cbdc66bac767e38275df8591947cde57a2a6af405b6bce67ce3554f9b8f251"} Mar 18 12:27:03 crc kubenswrapper[4975]: I0318 12:27:03.184437 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c2ns7" Mar 18 12:27:03 crc kubenswrapper[4975]: I0318 12:27:03.411375 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rk9h9"] Mar 18 12:27:03 crc kubenswrapper[4975]: I0318 12:27:03.412433 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rk9h9" Mar 18 12:27:03 crc kubenswrapper[4975]: I0318 12:27:03.423672 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rk9h9"] Mar 18 12:27:03 crc kubenswrapper[4975]: I0318 12:27:03.570090 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cjt2\" (UniqueName: \"kubernetes.io/projected/dc38035e-6f91-459b-81ce-f93977bc5b02-kube-api-access-7cjt2\") pod \"redhat-marketplace-rk9h9\" (UID: \"dc38035e-6f91-459b-81ce-f93977bc5b02\") " pod="openshift-marketplace/redhat-marketplace-rk9h9" Mar 18 12:27:03 crc kubenswrapper[4975]: I0318 12:27:03.570138 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc38035e-6f91-459b-81ce-f93977bc5b02-utilities\") pod \"redhat-marketplace-rk9h9\" (UID: \"dc38035e-6f91-459b-81ce-f93977bc5b02\") " pod="openshift-marketplace/redhat-marketplace-rk9h9" Mar 18 12:27:03 crc kubenswrapper[4975]: I0318 12:27:03.570168 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc38035e-6f91-459b-81ce-f93977bc5b02-catalog-content\") pod 
\"redhat-marketplace-rk9h9\" (UID: \"dc38035e-6f91-459b-81ce-f93977bc5b02\") " pod="openshift-marketplace/redhat-marketplace-rk9h9" Mar 18 12:27:03 crc kubenswrapper[4975]: I0318 12:27:03.671677 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cjt2\" (UniqueName: \"kubernetes.io/projected/dc38035e-6f91-459b-81ce-f93977bc5b02-kube-api-access-7cjt2\") pod \"redhat-marketplace-rk9h9\" (UID: \"dc38035e-6f91-459b-81ce-f93977bc5b02\") " pod="openshift-marketplace/redhat-marketplace-rk9h9" Mar 18 12:27:03 crc kubenswrapper[4975]: I0318 12:27:03.671744 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc38035e-6f91-459b-81ce-f93977bc5b02-utilities\") pod \"redhat-marketplace-rk9h9\" (UID: \"dc38035e-6f91-459b-81ce-f93977bc5b02\") " pod="openshift-marketplace/redhat-marketplace-rk9h9" Mar 18 12:27:03 crc kubenswrapper[4975]: I0318 12:27:03.671767 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc38035e-6f91-459b-81ce-f93977bc5b02-catalog-content\") pod \"redhat-marketplace-rk9h9\" (UID: \"dc38035e-6f91-459b-81ce-f93977bc5b02\") " pod="openshift-marketplace/redhat-marketplace-rk9h9" Mar 18 12:27:03 crc kubenswrapper[4975]: I0318 12:27:03.672584 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc38035e-6f91-459b-81ce-f93977bc5b02-catalog-content\") pod \"redhat-marketplace-rk9h9\" (UID: \"dc38035e-6f91-459b-81ce-f93977bc5b02\") " pod="openshift-marketplace/redhat-marketplace-rk9h9" Mar 18 12:27:03 crc kubenswrapper[4975]: I0318 12:27:03.672750 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc38035e-6f91-459b-81ce-f93977bc5b02-utilities\") pod \"redhat-marketplace-rk9h9\" (UID: 
\"dc38035e-6f91-459b-81ce-f93977bc5b02\") " pod="openshift-marketplace/redhat-marketplace-rk9h9" Mar 18 12:27:03 crc kubenswrapper[4975]: I0318 12:27:03.691897 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cjt2\" (UniqueName: \"kubernetes.io/projected/dc38035e-6f91-459b-81ce-f93977bc5b02-kube-api-access-7cjt2\") pod \"redhat-marketplace-rk9h9\" (UID: \"dc38035e-6f91-459b-81ce-f93977bc5b02\") " pod="openshift-marketplace/redhat-marketplace-rk9h9" Mar 18 12:27:03 crc kubenswrapper[4975]: I0318 12:27:03.741324 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rk9h9" Mar 18 12:27:03 crc kubenswrapper[4975]: I0318 12:27:03.750665 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p9b9f" event={"ID":"ec77de6e-b5ed-410a-ae5d-a09c1305779a","Type":"ContainerStarted","Data":"7ba1f7730383b2b2809d77b6961b2eac80cb5080dc3ce105c57f8f79484a2a5b"} Mar 18 12:27:04 crc kubenswrapper[4975]: I0318 12:27:04.529846 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rk9h9"] Mar 18 12:27:04 crc kubenswrapper[4975]: W0318 12:27:04.537945 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc38035e_6f91_459b_81ce_f93977bc5b02.slice/crio-1df1ee42eea18760dd5b76c29517f125c1c8b24e823e7fa465ba3c93c22a4ee5 WatchSource:0}: Error finding container 1df1ee42eea18760dd5b76c29517f125c1c8b24e823e7fa465ba3c93c22a4ee5: Status 404 returned error can't find the container with id 1df1ee42eea18760dd5b76c29517f125c1c8b24e823e7fa465ba3c93c22a4ee5 Mar 18 12:27:04 crc kubenswrapper[4975]: I0318 12:27:04.756841 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rk9h9" 
event={"ID":"dc38035e-6f91-459b-81ce-f93977bc5b02","Type":"ContainerStarted","Data":"1df1ee42eea18760dd5b76c29517f125c1c8b24e823e7fa465ba3c93c22a4ee5"} Mar 18 12:27:05 crc kubenswrapper[4975]: I0318 12:27:05.767924 4975 generic.go:334] "Generic (PLEG): container finished" podID="dc38035e-6f91-459b-81ce-f93977bc5b02" containerID="8dadc6cff5f530e06a86862dc7ca63d2975ce0fc0a0a28407d2c379daedde87f" exitCode=0 Mar 18 12:27:05 crc kubenswrapper[4975]: I0318 12:27:05.767995 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rk9h9" event={"ID":"dc38035e-6f91-459b-81ce-f93977bc5b02","Type":"ContainerDied","Data":"8dadc6cff5f530e06a86862dc7ca63d2975ce0fc0a0a28407d2c379daedde87f"} Mar 18 12:27:05 crc kubenswrapper[4975]: I0318 12:27:05.770393 4975 generic.go:334] "Generic (PLEG): container finished" podID="ec77de6e-b5ed-410a-ae5d-a09c1305779a" containerID="7ba1f7730383b2b2809d77b6961b2eac80cb5080dc3ce105c57f8f79484a2a5b" exitCode=0 Mar 18 12:27:05 crc kubenswrapper[4975]: I0318 12:27:05.770454 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p9b9f" event={"ID":"ec77de6e-b5ed-410a-ae5d-a09c1305779a","Type":"ContainerDied","Data":"7ba1f7730383b2b2809d77b6961b2eac80cb5080dc3ce105c57f8f79484a2a5b"} Mar 18 12:27:06 crc kubenswrapper[4975]: I0318 12:27:06.160690 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5q8dg" Mar 18 12:27:06 crc kubenswrapper[4975]: I0318 12:27:06.160730 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5q8dg" Mar 18 12:27:06 crc kubenswrapper[4975]: I0318 12:27:06.776198 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rk9h9" 
event={"ID":"dc38035e-6f91-459b-81ce-f93977bc5b02","Type":"ContainerStarted","Data":"85e7de6da02aefd07fd12b8d0e0bce47a2a84c8653524db75d96d66d076dd087"} Mar 18 12:27:06 crc kubenswrapper[4975]: I0318 12:27:06.780388 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p9b9f" event={"ID":"ec77de6e-b5ed-410a-ae5d-a09c1305779a","Type":"ContainerStarted","Data":"1cf28aeb8bbf8bf555ec5f634d29b81f7206c9ad09948482473aecc2908e8185"} Mar 18 12:27:06 crc kubenswrapper[4975]: I0318 12:27:06.818461 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p9b9f" podStartSLOduration=3.346569896 podStartE2EDuration="6.8184452s" podCreationTimestamp="2026-03-18 12:27:00 +0000 UTC" firstStartedPulling="2026-03-18 12:27:02.711465468 +0000 UTC m=+1008.425866047" lastFinishedPulling="2026-03-18 12:27:06.183340772 +0000 UTC m=+1011.897741351" observedRunningTime="2026-03-18 12:27:06.816427595 +0000 UTC m=+1012.530828184" watchObservedRunningTime="2026-03-18 12:27:06.8184452 +0000 UTC m=+1012.532845789" Mar 18 12:27:07 crc kubenswrapper[4975]: I0318 12:27:07.221156 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5q8dg" podUID="90b4ad88-9275-466c-b56e-ddc8e192fb67" containerName="registry-server" probeResult="failure" output=< Mar 18 12:27:07 crc kubenswrapper[4975]: timeout: failed to connect service ":50051" within 1s Mar 18 12:27:07 crc kubenswrapper[4975]: > Mar 18 12:27:07 crc kubenswrapper[4975]: I0318 12:27:07.799981 4975 generic.go:334] "Generic (PLEG): container finished" podID="dc38035e-6f91-459b-81ce-f93977bc5b02" containerID="85e7de6da02aefd07fd12b8d0e0bce47a2a84c8653524db75d96d66d076dd087" exitCode=0 Mar 18 12:27:07 crc kubenswrapper[4975]: I0318 12:27:07.800042 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rk9h9" 
event={"ID":"dc38035e-6f91-459b-81ce-f93977bc5b02","Type":"ContainerDied","Data":"85e7de6da02aefd07fd12b8d0e0bce47a2a84c8653524db75d96d66d076dd087"} Mar 18 12:27:08 crc kubenswrapper[4975]: I0318 12:27:08.807212 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rk9h9" event={"ID":"dc38035e-6f91-459b-81ce-f93977bc5b02","Type":"ContainerStarted","Data":"83710952b835ce7aee9754770f1dc33fe811e2ad3b12b6608f5e2afadb5daac2"} Mar 18 12:27:08 crc kubenswrapper[4975]: I0318 12:27:08.829375 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rk9h9" podStartSLOduration=3.209762606 podStartE2EDuration="5.829355013s" podCreationTimestamp="2026-03-18 12:27:03 +0000 UTC" firstStartedPulling="2026-03-18 12:27:05.77074026 +0000 UTC m=+1011.485140839" lastFinishedPulling="2026-03-18 12:27:08.390332667 +0000 UTC m=+1014.104733246" observedRunningTime="2026-03-18 12:27:08.823357209 +0000 UTC m=+1014.537757808" watchObservedRunningTime="2026-03-18 12:27:08.829355013 +0000 UTC m=+1014.543755602" Mar 18 12:27:11 crc kubenswrapper[4975]: I0318 12:27:11.348922 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p9b9f" Mar 18 12:27:11 crc kubenswrapper[4975]: I0318 12:27:11.349629 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p9b9f" Mar 18 12:27:11 crc kubenswrapper[4975]: I0318 12:27:11.394293 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p9b9f" Mar 18 12:27:11 crc kubenswrapper[4975]: I0318 12:27:11.861073 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p9b9f" Mar 18 12:27:12 crc kubenswrapper[4975]: I0318 12:27:12.605844 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-p9b9f"] Mar 18 12:27:13 crc kubenswrapper[4975]: I0318 12:27:13.742338 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rk9h9" Mar 18 12:27:13 crc kubenswrapper[4975]: I0318 12:27:13.742625 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rk9h9" Mar 18 12:27:13 crc kubenswrapper[4975]: I0318 12:27:13.784548 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rk9h9" Mar 18 12:27:13 crc kubenswrapper[4975]: I0318 12:27:13.833171 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p9b9f" podUID="ec77de6e-b5ed-410a-ae5d-a09c1305779a" containerName="registry-server" containerID="cri-o://1cf28aeb8bbf8bf555ec5f634d29b81f7206c9ad09948482473aecc2908e8185" gracePeriod=2 Mar 18 12:27:13 crc kubenswrapper[4975]: I0318 12:27:13.885238 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rk9h9" Mar 18 12:27:14 crc kubenswrapper[4975]: I0318 12:27:14.674879 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p9b9f" Mar 18 12:27:14 crc kubenswrapper[4975]: I0318 12:27:14.717665 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6qrq\" (UniqueName: \"kubernetes.io/projected/ec77de6e-b5ed-410a-ae5d-a09c1305779a-kube-api-access-m6qrq\") pod \"ec77de6e-b5ed-410a-ae5d-a09c1305779a\" (UID: \"ec77de6e-b5ed-410a-ae5d-a09c1305779a\") " Mar 18 12:27:14 crc kubenswrapper[4975]: I0318 12:27:14.717777 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec77de6e-b5ed-410a-ae5d-a09c1305779a-utilities\") pod \"ec77de6e-b5ed-410a-ae5d-a09c1305779a\" (UID: \"ec77de6e-b5ed-410a-ae5d-a09c1305779a\") " Mar 18 12:27:14 crc kubenswrapper[4975]: I0318 12:27:14.717812 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec77de6e-b5ed-410a-ae5d-a09c1305779a-catalog-content\") pod \"ec77de6e-b5ed-410a-ae5d-a09c1305779a\" (UID: \"ec77de6e-b5ed-410a-ae5d-a09c1305779a\") " Mar 18 12:27:14 crc kubenswrapper[4975]: I0318 12:27:14.718896 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec77de6e-b5ed-410a-ae5d-a09c1305779a-utilities" (OuterVolumeSpecName: "utilities") pod "ec77de6e-b5ed-410a-ae5d-a09c1305779a" (UID: "ec77de6e-b5ed-410a-ae5d-a09c1305779a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:27:14 crc kubenswrapper[4975]: I0318 12:27:14.723572 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec77de6e-b5ed-410a-ae5d-a09c1305779a-kube-api-access-m6qrq" (OuterVolumeSpecName: "kube-api-access-m6qrq") pod "ec77de6e-b5ed-410a-ae5d-a09c1305779a" (UID: "ec77de6e-b5ed-410a-ae5d-a09c1305779a"). InnerVolumeSpecName "kube-api-access-m6qrq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:27:14 crc kubenswrapper[4975]: I0318 12:27:14.773407 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec77de6e-b5ed-410a-ae5d-a09c1305779a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec77de6e-b5ed-410a-ae5d-a09c1305779a" (UID: "ec77de6e-b5ed-410a-ae5d-a09c1305779a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:27:14 crc kubenswrapper[4975]: I0318 12:27:14.819081 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6qrq\" (UniqueName: \"kubernetes.io/projected/ec77de6e-b5ed-410a-ae5d-a09c1305779a-kube-api-access-m6qrq\") on node \"crc\" DevicePath \"\"" Mar 18 12:27:14 crc kubenswrapper[4975]: I0318 12:27:14.819110 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec77de6e-b5ed-410a-ae5d-a09c1305779a-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:27:14 crc kubenswrapper[4975]: I0318 12:27:14.819119 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec77de6e-b5ed-410a-ae5d-a09c1305779a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:27:14 crc kubenswrapper[4975]: I0318 12:27:14.840498 4975 generic.go:334] "Generic (PLEG): container finished" podID="ec77de6e-b5ed-410a-ae5d-a09c1305779a" containerID="1cf28aeb8bbf8bf555ec5f634d29b81f7206c9ad09948482473aecc2908e8185" exitCode=0 Mar 18 12:27:14 crc kubenswrapper[4975]: I0318 12:27:14.840582 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p9b9f" Mar 18 12:27:14 crc kubenswrapper[4975]: I0318 12:27:14.840609 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p9b9f" event={"ID":"ec77de6e-b5ed-410a-ae5d-a09c1305779a","Type":"ContainerDied","Data":"1cf28aeb8bbf8bf555ec5f634d29b81f7206c9ad09948482473aecc2908e8185"} Mar 18 12:27:14 crc kubenswrapper[4975]: I0318 12:27:14.840659 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p9b9f" event={"ID":"ec77de6e-b5ed-410a-ae5d-a09c1305779a","Type":"ContainerDied","Data":"99cbdc66bac767e38275df8591947cde57a2a6af405b6bce67ce3554f9b8f251"} Mar 18 12:27:14 crc kubenswrapper[4975]: I0318 12:27:14.840676 4975 scope.go:117] "RemoveContainer" containerID="1cf28aeb8bbf8bf555ec5f634d29b81f7206c9ad09948482473aecc2908e8185" Mar 18 12:27:14 crc kubenswrapper[4975]: I0318 12:27:14.856542 4975 scope.go:117] "RemoveContainer" containerID="7ba1f7730383b2b2809d77b6961b2eac80cb5080dc3ce105c57f8f79484a2a5b" Mar 18 12:27:14 crc kubenswrapper[4975]: I0318 12:27:14.874199 4975 scope.go:117] "RemoveContainer" containerID="5fe71a568ee463784bf03d807011618ce4d257c2e070a038be79fbf297c82794" Mar 18 12:27:14 crc kubenswrapper[4975]: I0318 12:27:14.902378 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p9b9f"] Mar 18 12:27:14 crc kubenswrapper[4975]: I0318 12:27:14.908268 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p9b9f"] Mar 18 12:27:14 crc kubenswrapper[4975]: I0318 12:27:14.911891 4975 scope.go:117] "RemoveContainer" containerID="1cf28aeb8bbf8bf555ec5f634d29b81f7206c9ad09948482473aecc2908e8185" Mar 18 12:27:14 crc kubenswrapper[4975]: E0318 12:27:14.912323 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1cf28aeb8bbf8bf555ec5f634d29b81f7206c9ad09948482473aecc2908e8185\": container with ID starting with 1cf28aeb8bbf8bf555ec5f634d29b81f7206c9ad09948482473aecc2908e8185 not found: ID does not exist" containerID="1cf28aeb8bbf8bf555ec5f634d29b81f7206c9ad09948482473aecc2908e8185" Mar 18 12:27:14 crc kubenswrapper[4975]: I0318 12:27:14.912364 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cf28aeb8bbf8bf555ec5f634d29b81f7206c9ad09948482473aecc2908e8185"} err="failed to get container status \"1cf28aeb8bbf8bf555ec5f634d29b81f7206c9ad09948482473aecc2908e8185\": rpc error: code = NotFound desc = could not find container \"1cf28aeb8bbf8bf555ec5f634d29b81f7206c9ad09948482473aecc2908e8185\": container with ID starting with 1cf28aeb8bbf8bf555ec5f634d29b81f7206c9ad09948482473aecc2908e8185 not found: ID does not exist" Mar 18 12:27:14 crc kubenswrapper[4975]: I0318 12:27:14.912391 4975 scope.go:117] "RemoveContainer" containerID="7ba1f7730383b2b2809d77b6961b2eac80cb5080dc3ce105c57f8f79484a2a5b" Mar 18 12:27:14 crc kubenswrapper[4975]: E0318 12:27:14.912893 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ba1f7730383b2b2809d77b6961b2eac80cb5080dc3ce105c57f8f79484a2a5b\": container with ID starting with 7ba1f7730383b2b2809d77b6961b2eac80cb5080dc3ce105c57f8f79484a2a5b not found: ID does not exist" containerID="7ba1f7730383b2b2809d77b6961b2eac80cb5080dc3ce105c57f8f79484a2a5b" Mar 18 12:27:14 crc kubenswrapper[4975]: I0318 12:27:14.912936 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ba1f7730383b2b2809d77b6961b2eac80cb5080dc3ce105c57f8f79484a2a5b"} err="failed to get container status \"7ba1f7730383b2b2809d77b6961b2eac80cb5080dc3ce105c57f8f79484a2a5b\": rpc error: code = NotFound desc = could not find container \"7ba1f7730383b2b2809d77b6961b2eac80cb5080dc3ce105c57f8f79484a2a5b\": container with ID 
starting with 7ba1f7730383b2b2809d77b6961b2eac80cb5080dc3ce105c57f8f79484a2a5b not found: ID does not exist" Mar 18 12:27:14 crc kubenswrapper[4975]: I0318 12:27:14.912964 4975 scope.go:117] "RemoveContainer" containerID="5fe71a568ee463784bf03d807011618ce4d257c2e070a038be79fbf297c82794" Mar 18 12:27:14 crc kubenswrapper[4975]: E0318 12:27:14.913290 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fe71a568ee463784bf03d807011618ce4d257c2e070a038be79fbf297c82794\": container with ID starting with 5fe71a568ee463784bf03d807011618ce4d257c2e070a038be79fbf297c82794 not found: ID does not exist" containerID="5fe71a568ee463784bf03d807011618ce4d257c2e070a038be79fbf297c82794" Mar 18 12:27:14 crc kubenswrapper[4975]: I0318 12:27:14.913313 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fe71a568ee463784bf03d807011618ce4d257c2e070a038be79fbf297c82794"} err="failed to get container status \"5fe71a568ee463784bf03d807011618ce4d257c2e070a038be79fbf297c82794\": rpc error: code = NotFound desc = could not find container \"5fe71a568ee463784bf03d807011618ce4d257c2e070a038be79fbf297c82794\": container with ID starting with 5fe71a568ee463784bf03d807011618ce4d257c2e070a038be79fbf297c82794 not found: ID does not exist" Mar 18 12:27:15 crc kubenswrapper[4975]: I0318 12:27:15.025783 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec77de6e-b5ed-410a-ae5d-a09c1305779a" path="/var/lib/kubelet/pods/ec77de6e-b5ed-410a-ae5d-a09c1305779a/volumes" Mar 18 12:27:15 crc kubenswrapper[4975]: I0318 12:27:15.209446 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rk9h9"] Mar 18 12:27:15 crc kubenswrapper[4975]: I0318 12:27:15.848694 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rk9h9" 
podUID="dc38035e-6f91-459b-81ce-f93977bc5b02" containerName="registry-server" containerID="cri-o://83710952b835ce7aee9754770f1dc33fe811e2ad3b12b6608f5e2afadb5daac2" gracePeriod=2 Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.219634 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5q8dg" Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.259181 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rk9h9" Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.276453 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5q8dg" Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.337425 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cjt2\" (UniqueName: \"kubernetes.io/projected/dc38035e-6f91-459b-81ce-f93977bc5b02-kube-api-access-7cjt2\") pod \"dc38035e-6f91-459b-81ce-f93977bc5b02\" (UID: \"dc38035e-6f91-459b-81ce-f93977bc5b02\") " Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.337509 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc38035e-6f91-459b-81ce-f93977bc5b02-catalog-content\") pod \"dc38035e-6f91-459b-81ce-f93977bc5b02\" (UID: \"dc38035e-6f91-459b-81ce-f93977bc5b02\") " Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.337536 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc38035e-6f91-459b-81ce-f93977bc5b02-utilities\") pod \"dc38035e-6f91-459b-81ce-f93977bc5b02\" (UID: \"dc38035e-6f91-459b-81ce-f93977bc5b02\") " Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.338558 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/dc38035e-6f91-459b-81ce-f93977bc5b02-utilities" (OuterVolumeSpecName: "utilities") pod "dc38035e-6f91-459b-81ce-f93977bc5b02" (UID: "dc38035e-6f91-459b-81ce-f93977bc5b02"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.343988 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc38035e-6f91-459b-81ce-f93977bc5b02-kube-api-access-7cjt2" (OuterVolumeSpecName: "kube-api-access-7cjt2") pod "dc38035e-6f91-459b-81ce-f93977bc5b02" (UID: "dc38035e-6f91-459b-81ce-f93977bc5b02"). InnerVolumeSpecName "kube-api-access-7cjt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.363496 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc38035e-6f91-459b-81ce-f93977bc5b02-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dc38035e-6f91-459b-81ce-f93977bc5b02" (UID: "dc38035e-6f91-459b-81ce-f93977bc5b02"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.439637 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cjt2\" (UniqueName: \"kubernetes.io/projected/dc38035e-6f91-459b-81ce-f93977bc5b02-kube-api-access-7cjt2\") on node \"crc\" DevicePath \"\"" Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.439680 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc38035e-6f91-459b-81ce-f93977bc5b02-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.439688 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc38035e-6f91-459b-81ce-f93977bc5b02-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.864889 4975 generic.go:334] "Generic (PLEG): container finished" podID="dc38035e-6f91-459b-81ce-f93977bc5b02" containerID="83710952b835ce7aee9754770f1dc33fe811e2ad3b12b6608f5e2afadb5daac2" exitCode=0 Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.864961 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rk9h9" Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.864963 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rk9h9" event={"ID":"dc38035e-6f91-459b-81ce-f93977bc5b02","Type":"ContainerDied","Data":"83710952b835ce7aee9754770f1dc33fe811e2ad3b12b6608f5e2afadb5daac2"} Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.865009 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rk9h9" event={"ID":"dc38035e-6f91-459b-81ce-f93977bc5b02","Type":"ContainerDied","Data":"1df1ee42eea18760dd5b76c29517f125c1c8b24e823e7fa465ba3c93c22a4ee5"} Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.865029 4975 scope.go:117] "RemoveContainer" containerID="83710952b835ce7aee9754770f1dc33fe811e2ad3b12b6608f5e2afadb5daac2" Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.887204 4975 scope.go:117] "RemoveContainer" containerID="85e7de6da02aefd07fd12b8d0e0bce47a2a84c8653524db75d96d66d076dd087" Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.892737 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rk9h9"] Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.896948 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rk9h9"] Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.920701 4975 scope.go:117] "RemoveContainer" containerID="8dadc6cff5f530e06a86862dc7ca63d2975ce0fc0a0a28407d2c379daedde87f" Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.933290 4975 scope.go:117] "RemoveContainer" containerID="83710952b835ce7aee9754770f1dc33fe811e2ad3b12b6608f5e2afadb5daac2" Mar 18 12:27:16 crc kubenswrapper[4975]: E0318 12:27:16.933673 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"83710952b835ce7aee9754770f1dc33fe811e2ad3b12b6608f5e2afadb5daac2\": container with ID starting with 83710952b835ce7aee9754770f1dc33fe811e2ad3b12b6608f5e2afadb5daac2 not found: ID does not exist" containerID="83710952b835ce7aee9754770f1dc33fe811e2ad3b12b6608f5e2afadb5daac2" Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.933716 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83710952b835ce7aee9754770f1dc33fe811e2ad3b12b6608f5e2afadb5daac2"} err="failed to get container status \"83710952b835ce7aee9754770f1dc33fe811e2ad3b12b6608f5e2afadb5daac2\": rpc error: code = NotFound desc = could not find container \"83710952b835ce7aee9754770f1dc33fe811e2ad3b12b6608f5e2afadb5daac2\": container with ID starting with 83710952b835ce7aee9754770f1dc33fe811e2ad3b12b6608f5e2afadb5daac2 not found: ID does not exist" Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.933779 4975 scope.go:117] "RemoveContainer" containerID="85e7de6da02aefd07fd12b8d0e0bce47a2a84c8653524db75d96d66d076dd087" Mar 18 12:27:16 crc kubenswrapper[4975]: E0318 12:27:16.934112 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85e7de6da02aefd07fd12b8d0e0bce47a2a84c8653524db75d96d66d076dd087\": container with ID starting with 85e7de6da02aefd07fd12b8d0e0bce47a2a84c8653524db75d96d66d076dd087 not found: ID does not exist" containerID="85e7de6da02aefd07fd12b8d0e0bce47a2a84c8653524db75d96d66d076dd087" Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.934132 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85e7de6da02aefd07fd12b8d0e0bce47a2a84c8653524db75d96d66d076dd087"} err="failed to get container status \"85e7de6da02aefd07fd12b8d0e0bce47a2a84c8653524db75d96d66d076dd087\": rpc error: code = NotFound desc = could not find container \"85e7de6da02aefd07fd12b8d0e0bce47a2a84c8653524db75d96d66d076dd087\": container with ID 
starting with 85e7de6da02aefd07fd12b8d0e0bce47a2a84c8653524db75d96d66d076dd087 not found: ID does not exist"
Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.934152 4975 scope.go:117] "RemoveContainer" containerID="8dadc6cff5f530e06a86862dc7ca63d2975ce0fc0a0a28407d2c379daedde87f"
Mar 18 12:27:16 crc kubenswrapper[4975]: E0318 12:27:16.934386 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dadc6cff5f530e06a86862dc7ca63d2975ce0fc0a0a28407d2c379daedde87f\": container with ID starting with 8dadc6cff5f530e06a86862dc7ca63d2975ce0fc0a0a28407d2c379daedde87f not found: ID does not exist" containerID="8dadc6cff5f530e06a86862dc7ca63d2975ce0fc0a0a28407d2c379daedde87f"
Mar 18 12:27:16 crc kubenswrapper[4975]: I0318 12:27:16.934440 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dadc6cff5f530e06a86862dc7ca63d2975ce0fc0a0a28407d2c379daedde87f"} err="failed to get container status \"8dadc6cff5f530e06a86862dc7ca63d2975ce0fc0a0a28407d2c379daedde87f\": rpc error: code = NotFound desc = could not find container \"8dadc6cff5f530e06a86862dc7ca63d2975ce0fc0a0a28407d2c379daedde87f\": container with ID starting with 8dadc6cff5f530e06a86862dc7ca63d2975ce0fc0a0a28407d2c379daedde87f not found: ID does not exist"
Mar 18 12:27:17 crc kubenswrapper[4975]: I0318 12:27:17.024549 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc38035e-6f91-459b-81ce-f93977bc5b02" path="/var/lib/kubelet/pods/dc38035e-6f91-459b-81ce-f93977bc5b02/volumes"
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.007276 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5q8dg"]
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.007521 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5q8dg" podUID="90b4ad88-9275-466c-b56e-ddc8e192fb67" containerName="registry-server" containerID="cri-o://09d98a4a362d7d12d8e7b69d4866fd4debe7681eb75cd5b1d78a8630c950864f" gracePeriod=2
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.342602 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5q8dg"
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.376456 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90b4ad88-9275-466c-b56e-ddc8e192fb67-utilities\") pod \"90b4ad88-9275-466c-b56e-ddc8e192fb67\" (UID: \"90b4ad88-9275-466c-b56e-ddc8e192fb67\") "
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.376497 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90b4ad88-9275-466c-b56e-ddc8e192fb67-catalog-content\") pod \"90b4ad88-9275-466c-b56e-ddc8e192fb67\" (UID: \"90b4ad88-9275-466c-b56e-ddc8e192fb67\") "
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.376528 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrxbh\" (UniqueName: \"kubernetes.io/projected/90b4ad88-9275-466c-b56e-ddc8e192fb67-kube-api-access-vrxbh\") pod \"90b4ad88-9275-466c-b56e-ddc8e192fb67\" (UID: \"90b4ad88-9275-466c-b56e-ddc8e192fb67\") "
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.377724 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90b4ad88-9275-466c-b56e-ddc8e192fb67-utilities" (OuterVolumeSpecName: "utilities") pod "90b4ad88-9275-466c-b56e-ddc8e192fb67" (UID: "90b4ad88-9275-466c-b56e-ddc8e192fb67"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.382153 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90b4ad88-9275-466c-b56e-ddc8e192fb67-kube-api-access-vrxbh" (OuterVolumeSpecName: "kube-api-access-vrxbh") pod "90b4ad88-9275-466c-b56e-ddc8e192fb67" (UID: "90b4ad88-9275-466c-b56e-ddc8e192fb67"). InnerVolumeSpecName "kube-api-access-vrxbh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.478175 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90b4ad88-9275-466c-b56e-ddc8e192fb67-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.478219 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrxbh\" (UniqueName: \"kubernetes.io/projected/90b4ad88-9275-466c-b56e-ddc8e192fb67-kube-api-access-vrxbh\") on node \"crc\" DevicePath \"\""
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.502938 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90b4ad88-9275-466c-b56e-ddc8e192fb67-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "90b4ad88-9275-466c-b56e-ddc8e192fb67" (UID: "90b4ad88-9275-466c-b56e-ddc8e192fb67"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.579964 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90b4ad88-9275-466c-b56e-ddc8e192fb67-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.890071 4975 generic.go:334] "Generic (PLEG): container finished" podID="90b4ad88-9275-466c-b56e-ddc8e192fb67" containerID="09d98a4a362d7d12d8e7b69d4866fd4debe7681eb75cd5b1d78a8630c950864f" exitCode=0
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.890176 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5q8dg" event={"ID":"90b4ad88-9275-466c-b56e-ddc8e192fb67","Type":"ContainerDied","Data":"09d98a4a362d7d12d8e7b69d4866fd4debe7681eb75cd5b1d78a8630c950864f"}
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.890154 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5q8dg"
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.890233 4975 scope.go:117] "RemoveContainer" containerID="09d98a4a362d7d12d8e7b69d4866fd4debe7681eb75cd5b1d78a8630c950864f"
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.890220 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5q8dg" event={"ID":"90b4ad88-9275-466c-b56e-ddc8e192fb67","Type":"ContainerDied","Data":"f994c40c3de5b7693446cdc5ac07dbf198afdceeaee59f43814b735beea2ed2e"}
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.910140 4975 scope.go:117] "RemoveContainer" containerID="5cbf3aebda0dc46e61f76fa6d253b43f05499579c9ca8e5be40b691524be391c"
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.925460 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5q8dg"]
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.929267 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5q8dg"]
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.931884 4975 scope.go:117] "RemoveContainer" containerID="f1202af3daf611227f493b038ecefb5cae109e67e386a5a59b6eb992e4d614ce"
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.958056 4975 scope.go:117] "RemoveContainer" containerID="09d98a4a362d7d12d8e7b69d4866fd4debe7681eb75cd5b1d78a8630c950864f"
Mar 18 12:27:19 crc kubenswrapper[4975]: E0318 12:27:19.958478 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09d98a4a362d7d12d8e7b69d4866fd4debe7681eb75cd5b1d78a8630c950864f\": container with ID starting with 09d98a4a362d7d12d8e7b69d4866fd4debe7681eb75cd5b1d78a8630c950864f not found: ID does not exist" containerID="09d98a4a362d7d12d8e7b69d4866fd4debe7681eb75cd5b1d78a8630c950864f"
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.958520 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09d98a4a362d7d12d8e7b69d4866fd4debe7681eb75cd5b1d78a8630c950864f"} err="failed to get container status \"09d98a4a362d7d12d8e7b69d4866fd4debe7681eb75cd5b1d78a8630c950864f\": rpc error: code = NotFound desc = could not find container \"09d98a4a362d7d12d8e7b69d4866fd4debe7681eb75cd5b1d78a8630c950864f\": container with ID starting with 09d98a4a362d7d12d8e7b69d4866fd4debe7681eb75cd5b1d78a8630c950864f not found: ID does not exist"
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.958549 4975 scope.go:117] "RemoveContainer" containerID="5cbf3aebda0dc46e61f76fa6d253b43f05499579c9ca8e5be40b691524be391c"
Mar 18 12:27:19 crc kubenswrapper[4975]: E0318 12:27:19.958922 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cbf3aebda0dc46e61f76fa6d253b43f05499579c9ca8e5be40b691524be391c\": container with ID starting with 5cbf3aebda0dc46e61f76fa6d253b43f05499579c9ca8e5be40b691524be391c not found: ID does not exist" containerID="5cbf3aebda0dc46e61f76fa6d253b43f05499579c9ca8e5be40b691524be391c"
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.958945 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cbf3aebda0dc46e61f76fa6d253b43f05499579c9ca8e5be40b691524be391c"} err="failed to get container status \"5cbf3aebda0dc46e61f76fa6d253b43f05499579c9ca8e5be40b691524be391c\": rpc error: code = NotFound desc = could not find container \"5cbf3aebda0dc46e61f76fa6d253b43f05499579c9ca8e5be40b691524be391c\": container with ID starting with 5cbf3aebda0dc46e61f76fa6d253b43f05499579c9ca8e5be40b691524be391c not found: ID does not exist"
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.958982 4975 scope.go:117] "RemoveContainer" containerID="f1202af3daf611227f493b038ecefb5cae109e67e386a5a59b6eb992e4d614ce"
Mar 18 12:27:19 crc kubenswrapper[4975]: E0318 12:27:19.959293 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1202af3daf611227f493b038ecefb5cae109e67e386a5a59b6eb992e4d614ce\": container with ID starting with f1202af3daf611227f493b038ecefb5cae109e67e386a5a59b6eb992e4d614ce not found: ID does not exist" containerID="f1202af3daf611227f493b038ecefb5cae109e67e386a5a59b6eb992e4d614ce"
Mar 18 12:27:19 crc kubenswrapper[4975]: I0318 12:27:19.959335 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1202af3daf611227f493b038ecefb5cae109e67e386a5a59b6eb992e4d614ce"} err="failed to get container status \"f1202af3daf611227f493b038ecefb5cae109e67e386a5a59b6eb992e4d614ce\": rpc error: code = NotFound desc = could not find container \"f1202af3daf611227f493b038ecefb5cae109e67e386a5a59b6eb992e4d614ce\": container with ID starting with f1202af3daf611227f493b038ecefb5cae109e67e386a5a59b6eb992e4d614ce not found: ID does not exist"
Mar 18 12:27:21 crc kubenswrapper[4975]: I0318 12:27:21.023117 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90b4ad88-9275-466c-b56e-ddc8e192fb67" path="/var/lib/kubelet/pods/90b4ad88-9275-466c-b56e-ddc8e192fb67/volumes"
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.055815 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2"]
Mar 18 12:27:23 crc kubenswrapper[4975]: E0318 12:27:23.056073 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90b4ad88-9275-466c-b56e-ddc8e192fb67" containerName="registry-server"
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.056090 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b4ad88-9275-466c-b56e-ddc8e192fb67" containerName="registry-server"
Mar 18 12:27:23 crc kubenswrapper[4975]: E0318 12:27:23.056104 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc38035e-6f91-459b-81ce-f93977bc5b02" containerName="registry-server"
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.056112 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc38035e-6f91-459b-81ce-f93977bc5b02" containerName="registry-server"
Mar 18 12:27:23 crc kubenswrapper[4975]: E0318 12:27:23.056126 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc38035e-6f91-459b-81ce-f93977bc5b02" containerName="extract-utilities"
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.056134 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc38035e-6f91-459b-81ce-f93977bc5b02" containerName="extract-utilities"
Mar 18 12:27:23 crc kubenswrapper[4975]: E0318 12:27:23.056149 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec77de6e-b5ed-410a-ae5d-a09c1305779a" containerName="registry-server"
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.056155 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec77de6e-b5ed-410a-ae5d-a09c1305779a" containerName="registry-server"
Mar 18 12:27:23 crc kubenswrapper[4975]: E0318 12:27:23.056168 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc38035e-6f91-459b-81ce-f93977bc5b02" containerName="extract-content"
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.056175 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc38035e-6f91-459b-81ce-f93977bc5b02" containerName="extract-content"
Mar 18 12:27:23 crc kubenswrapper[4975]: E0318 12:27:23.056188 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec77de6e-b5ed-410a-ae5d-a09c1305779a" containerName="extract-content"
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.056197 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec77de6e-b5ed-410a-ae5d-a09c1305779a" containerName="extract-content"
Mar 18 12:27:23 crc kubenswrapper[4975]: E0318 12:27:23.056208 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec77de6e-b5ed-410a-ae5d-a09c1305779a" containerName="extract-utilities"
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.056216 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec77de6e-b5ed-410a-ae5d-a09c1305779a" containerName="extract-utilities"
Mar 18 12:27:23 crc kubenswrapper[4975]: E0318 12:27:23.056228 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90b4ad88-9275-466c-b56e-ddc8e192fb67" containerName="extract-utilities"
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.056234 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b4ad88-9275-466c-b56e-ddc8e192fb67" containerName="extract-utilities"
Mar 18 12:27:23 crc kubenswrapper[4975]: E0318 12:27:23.056244 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90b4ad88-9275-466c-b56e-ddc8e192fb67" containerName="extract-content"
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.056249 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b4ad88-9275-466c-b56e-ddc8e192fb67" containerName="extract-content"
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.056347 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec77de6e-b5ed-410a-ae5d-a09c1305779a" containerName="registry-server"
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.056366 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="90b4ad88-9275-466c-b56e-ddc8e192fb67" containerName="registry-server"
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.056382 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc38035e-6f91-459b-81ce-f93977bc5b02" containerName="registry-server"
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.057231 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2"
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.059548 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.082400 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2"]
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.121454 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/128d4df9-9451-466a-a545-b916760c3c45-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2\" (UID: \"128d4df9-9451-466a-a545-b916760c3c45\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2"
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.121711 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/128d4df9-9451-466a-a545-b916760c3c45-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2\" (UID: \"128d4df9-9451-466a-a545-b916760c3c45\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2"
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.121780 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7x2n\" (UniqueName: \"kubernetes.io/projected/128d4df9-9451-466a-a545-b916760c3c45-kube-api-access-r7x2n\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2\" (UID: \"128d4df9-9451-466a-a545-b916760c3c45\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2"
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.223602 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7x2n\" (UniqueName: \"kubernetes.io/projected/128d4df9-9451-466a-a545-b916760c3c45-kube-api-access-r7x2n\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2\" (UID: \"128d4df9-9451-466a-a545-b916760c3c45\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2"
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.223748 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/128d4df9-9451-466a-a545-b916760c3c45-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2\" (UID: \"128d4df9-9451-466a-a545-b916760c3c45\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2"
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.223814 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/128d4df9-9451-466a-a545-b916760c3c45-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2\" (UID: \"128d4df9-9451-466a-a545-b916760c3c45\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2"
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.224404 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/128d4df9-9451-466a-a545-b916760c3c45-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2\" (UID: \"128d4df9-9451-466a-a545-b916760c3c45\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2"
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.224511 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/128d4df9-9451-466a-a545-b916760c3c45-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2\" (UID: \"128d4df9-9451-466a-a545-b916760c3c45\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2"
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.247551 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7x2n\" (UniqueName: \"kubernetes.io/projected/128d4df9-9451-466a-a545-b916760c3c45-kube-api-access-r7x2n\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2\" (UID: \"128d4df9-9451-466a-a545-b916760c3c45\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2"
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.382417 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2"
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.559194 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2"]
Mar 18 12:27:23 crc kubenswrapper[4975]: W0318 12:27:23.564598 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod128d4df9_9451_466a_a545_b916760c3c45.slice/crio-8c7aebbdcbc1af02c323a8576db94a9099412d3203d77bbfdb75f595cd984223 WatchSource:0}: Error finding container 8c7aebbdcbc1af02c323a8576db94a9099412d3203d77bbfdb75f595cd984223: Status 404 returned error can't find the container with id 8c7aebbdcbc1af02c323a8576db94a9099412d3203d77bbfdb75f595cd984223
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.916590 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2" event={"ID":"128d4df9-9451-466a-a545-b916760c3c45","Type":"ContainerStarted","Data":"07bc6d063455944ea0a6b1ef2b0bb562dac9b2dd90246c27fc0ab250ce439817"}
Mar 18 12:27:23 crc kubenswrapper[4975]: I0318 12:27:23.916667 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2" event={"ID":"128d4df9-9451-466a-a545-b916760c3c45","Type":"ContainerStarted","Data":"8c7aebbdcbc1af02c323a8576db94a9099412d3203d77bbfdb75f595cd984223"}
Mar 18 12:27:24 crc kubenswrapper[4975]: I0318 12:27:24.923048 4975 generic.go:334] "Generic (PLEG): container finished" podID="128d4df9-9451-466a-a545-b916760c3c45" containerID="07bc6d063455944ea0a6b1ef2b0bb562dac9b2dd90246c27fc0ab250ce439817" exitCode=0
Mar 18 12:27:24 crc kubenswrapper[4975]: I0318 12:27:24.923127 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2" event={"ID":"128d4df9-9451-466a-a545-b916760c3c45","Type":"ContainerDied","Data":"07bc6d063455944ea0a6b1ef2b0bb562dac9b2dd90246c27fc0ab250ce439817"}
Mar 18 12:27:26 crc kubenswrapper[4975]: I0318 12:27:26.934700 4975 generic.go:334] "Generic (PLEG): container finished" podID="128d4df9-9451-466a-a545-b916760c3c45" containerID="81199d177bf34cbf27798b070987fc374b9e4a85e5dd80f4ecb0828d09ac4644" exitCode=0
Mar 18 12:27:26 crc kubenswrapper[4975]: I0318 12:27:26.934804 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2" event={"ID":"128d4df9-9451-466a-a545-b916760c3c45","Type":"ContainerDied","Data":"81199d177bf34cbf27798b070987fc374b9e4a85e5dd80f4ecb0828d09ac4644"}
Mar 18 12:27:27 crc kubenswrapper[4975]: I0318 12:27:27.941937 4975 generic.go:334] "Generic (PLEG): container finished" podID="128d4df9-9451-466a-a545-b916760c3c45" containerID="a941e37e624b5cdc1d9b8f271983c026878b6948983449ef71b54a1cc9c9b0f2" exitCode=0
Mar 18 12:27:27 crc kubenswrapper[4975]: I0318 12:27:27.941993 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2" event={"ID":"128d4df9-9451-466a-a545-b916760c3c45","Type":"ContainerDied","Data":"a941e37e624b5cdc1d9b8f271983c026878b6948983449ef71b54a1cc9c9b0f2"}
Mar 18 12:27:29 crc kubenswrapper[4975]: I0318 12:27:29.219916 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2"
Mar 18 12:27:29 crc kubenswrapper[4975]: I0318 12:27:29.296362 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/128d4df9-9451-466a-a545-b916760c3c45-util\") pod \"128d4df9-9451-466a-a545-b916760c3c45\" (UID: \"128d4df9-9451-466a-a545-b916760c3c45\") "
Mar 18 12:27:29 crc kubenswrapper[4975]: I0318 12:27:29.296468 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7x2n\" (UniqueName: \"kubernetes.io/projected/128d4df9-9451-466a-a545-b916760c3c45-kube-api-access-r7x2n\") pod \"128d4df9-9451-466a-a545-b916760c3c45\" (UID: \"128d4df9-9451-466a-a545-b916760c3c45\") "
Mar 18 12:27:29 crc kubenswrapper[4975]: I0318 12:27:29.296508 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/128d4df9-9451-466a-a545-b916760c3c45-bundle\") pod \"128d4df9-9451-466a-a545-b916760c3c45\" (UID: \"128d4df9-9451-466a-a545-b916760c3c45\") "
Mar 18 12:27:29 crc kubenswrapper[4975]: I0318 12:27:29.297460 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128d4df9-9451-466a-a545-b916760c3c45-bundle" (OuterVolumeSpecName: "bundle") pod "128d4df9-9451-466a-a545-b916760c3c45" (UID: "128d4df9-9451-466a-a545-b916760c3c45"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:27:29 crc kubenswrapper[4975]: I0318 12:27:29.307071 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/128d4df9-9451-466a-a545-b916760c3c45-kube-api-access-r7x2n" (OuterVolumeSpecName: "kube-api-access-r7x2n") pod "128d4df9-9451-466a-a545-b916760c3c45" (UID: "128d4df9-9451-466a-a545-b916760c3c45"). InnerVolumeSpecName "kube-api-access-r7x2n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:27:29 crc kubenswrapper[4975]: I0318 12:27:29.358288 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128d4df9-9451-466a-a545-b916760c3c45-util" (OuterVolumeSpecName: "util") pod "128d4df9-9451-466a-a545-b916760c3c45" (UID: "128d4df9-9451-466a-a545-b916760c3c45"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 12:27:29 crc kubenswrapper[4975]: I0318 12:27:29.398128 4975 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/128d4df9-9451-466a-a545-b916760c3c45-util\") on node \"crc\" DevicePath \"\""
Mar 18 12:27:29 crc kubenswrapper[4975]: I0318 12:27:29.398161 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7x2n\" (UniqueName: \"kubernetes.io/projected/128d4df9-9451-466a-a545-b916760c3c45-kube-api-access-r7x2n\") on node \"crc\" DevicePath \"\""
Mar 18 12:27:29 crc kubenswrapper[4975]: I0318 12:27:29.398175 4975 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/128d4df9-9451-466a-a545-b916760c3c45-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:27:29 crc kubenswrapper[4975]: I0318 12:27:29.952771 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2" event={"ID":"128d4df9-9451-466a-a545-b916760c3c45","Type":"ContainerDied","Data":"8c7aebbdcbc1af02c323a8576db94a9099412d3203d77bbfdb75f595cd984223"}
Mar 18 12:27:29 crc kubenswrapper[4975]: I0318 12:27:29.952821 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c7aebbdcbc1af02c323a8576db94a9099412d3203d77bbfdb75f595cd984223"
Mar 18 12:27:29 crc kubenswrapper[4975]: I0318 12:27:29.952837 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2"
Mar 18 12:27:31 crc kubenswrapper[4975]: I0318 12:27:31.197465 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-cd2hp"]
Mar 18 12:27:31 crc kubenswrapper[4975]: E0318 12:27:31.197934 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128d4df9-9451-466a-a545-b916760c3c45" containerName="util"
Mar 18 12:27:31 crc kubenswrapper[4975]: I0318 12:27:31.197945 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="128d4df9-9451-466a-a545-b916760c3c45" containerName="util"
Mar 18 12:27:31 crc kubenswrapper[4975]: E0318 12:27:31.197955 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128d4df9-9451-466a-a545-b916760c3c45" containerName="pull"
Mar 18 12:27:31 crc kubenswrapper[4975]: I0318 12:27:31.197960 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="128d4df9-9451-466a-a545-b916760c3c45" containerName="pull"
Mar 18 12:27:31 crc kubenswrapper[4975]: E0318 12:27:31.197977 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128d4df9-9451-466a-a545-b916760c3c45" containerName="extract"
Mar 18 12:27:31 crc kubenswrapper[4975]: I0318 12:27:31.197983 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="128d4df9-9451-466a-a545-b916760c3c45" containerName="extract"
Mar 18 12:27:31 crc kubenswrapper[4975]: I0318 12:27:31.198098 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="128d4df9-9451-466a-a545-b916760c3c45" containerName="extract"
Mar 18 12:27:31 crc kubenswrapper[4975]: I0318 12:27:31.198510 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-cd2hp"
Mar 18 12:27:31 crc kubenswrapper[4975]: I0318 12:27:31.200449 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Mar 18 12:27:31 crc kubenswrapper[4975]: I0318 12:27:31.201420 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-9fkb2"
Mar 18 12:27:31 crc kubenswrapper[4975]: I0318 12:27:31.201552 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Mar 18 12:27:31 crc kubenswrapper[4975]: I0318 12:27:31.206385 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-cd2hp"]
Mar 18 12:27:31 crc kubenswrapper[4975]: I0318 12:27:31.220951 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh5zk\" (UniqueName: \"kubernetes.io/projected/c4901424-0b59-4410-8896-4868b5d83f75-kube-api-access-fh5zk\") pod \"nmstate-operator-796d4cfff4-cd2hp\" (UID: \"c4901424-0b59-4410-8896-4868b5d83f75\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-cd2hp"
Mar 18 12:27:31 crc kubenswrapper[4975]: I0318 12:27:31.321903 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh5zk\" (UniqueName: \"kubernetes.io/projected/c4901424-0b59-4410-8896-4868b5d83f75-kube-api-access-fh5zk\") pod \"nmstate-operator-796d4cfff4-cd2hp\" (UID: \"c4901424-0b59-4410-8896-4868b5d83f75\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-cd2hp"
Mar 18 12:27:31 crc kubenswrapper[4975]: I0318 12:27:31.339041 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh5zk\" (UniqueName: \"kubernetes.io/projected/c4901424-0b59-4410-8896-4868b5d83f75-kube-api-access-fh5zk\") pod \"nmstate-operator-796d4cfff4-cd2hp\" (UID: \"c4901424-0b59-4410-8896-4868b5d83f75\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-cd2hp"
Mar 18 12:27:31 crc kubenswrapper[4975]: I0318 12:27:31.512247 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-cd2hp"
Mar 18 12:27:31 crc kubenswrapper[4975]: I0318 12:27:31.683578 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-cd2hp"]
Mar 18 12:27:31 crc kubenswrapper[4975]: I0318 12:27:31.963587 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-cd2hp" event={"ID":"c4901424-0b59-4410-8896-4868b5d83f75","Type":"ContainerStarted","Data":"7358a0e00d6c87bd10a423b9b8f277ea0b44db43f5aa7944e22d2208caa28e9c"}
Mar 18 12:27:33 crc kubenswrapper[4975]: I0318 12:27:33.977814 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-cd2hp" event={"ID":"c4901424-0b59-4410-8896-4868b5d83f75","Type":"ContainerStarted","Data":"aabc8709d0a14d203587c333d44eaeeeba8c621a6e1a507387d715ec4cf22d35"}
Mar 18 12:27:34 crc kubenswrapper[4975]: I0318 12:27:33.999291 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-cd2hp" podStartSLOduration=0.865494353 podStartE2EDuration="2.999273356s" podCreationTimestamp="2026-03-18 12:27:31 +0000 UTC" firstStartedPulling="2026-03-18 12:27:31.690371464 +0000 UTC m=+1037.404772043" lastFinishedPulling="2026-03-18 12:27:33.824150467 +0000 UTC m=+1039.538551046" observedRunningTime="2026-03-18 12:27:33.996029267 +0000 UTC m=+1039.710429866" watchObservedRunningTime="2026-03-18 12:27:33.999273356 +0000 UTC m=+1039.713673935"
Mar 18 12:27:34 crc kubenswrapper[4975]: I0318 12:27:34.975441 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-k77j9"]
Mar 18 12:27:34 crc kubenswrapper[4975]: I0318 12:27:34.976467 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-k77j9"
Mar 18 12:27:34 crc kubenswrapper[4975]: I0318 12:27:34.980453 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-79ptf"
Mar 18 12:27:34 crc kubenswrapper[4975]: I0318 12:27:34.994439 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-k77j9"]
Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.003118 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-qv8h5"]
Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.004182 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qv8h5"
Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.012230 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.068855 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6g99\" (UniqueName: \"kubernetes.io/projected/c7cf9899-3af1-426c-9f6d-8a157c4cdd02-kube-api-access-f6g99\") pod \"nmstate-metrics-9b8c8685d-k77j9\" (UID: \"c7cf9899-3af1-426c-9f6d-8a157c4cdd02\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-k77j9"
Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.068957 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwksk\" (UniqueName: \"kubernetes.io/projected/8f3d4654-fe47-469f-bda9-d8c111d8f22d-kube-api-access-fwksk\") pod \"nmstate-webhook-5f558f5558-qv8h5\" (UID: \"8f3d4654-fe47-469f-bda9-d8c111d8f22d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-qv8h5"
Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.069033 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8f3d4654-fe47-469f-bda9-d8c111d8f22d-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-qv8h5\" (UID: \"8f3d4654-fe47-469f-bda9-d8c111d8f22d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-qv8h5"
Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.076458 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-qv8h5"]
Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.099923 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-gv79c"]
Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.100809 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-gv79c"
Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.170140 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/eb225299-e366-4be1-8d6f-7419220bc147-nmstate-lock\") pod \"nmstate-handler-gv79c\" (UID: \"eb225299-e366-4be1-8d6f-7419220bc147\") " pod="openshift-nmstate/nmstate-handler-gv79c"
Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.170273 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8f3d4654-fe47-469f-bda9-d8c111d8f22d-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-qv8h5\" (UID: \"8f3d4654-fe47-469f-bda9-d8c111d8f22d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-qv8h5"
Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.170313 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/eb225299-e366-4be1-8d6f-7419220bc147-dbus-socket\") pod \"nmstate-handler-gv79c\" (UID: \"eb225299-e366-4be1-8d6f-7419220bc147\") " pod="openshift-nmstate/nmstate-handler-gv79c"
Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.170433 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/eb225299-e366-4be1-8d6f-7419220bc147-ovs-socket\") pod \"nmstate-handler-gv79c\" (UID: \"eb225299-e366-4be1-8d6f-7419220bc147\") " pod="openshift-nmstate/nmstate-handler-gv79c"
Mar 18 12:27:35 crc kubenswrapper[4975]: E0318 12:27:35.170458 4975 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found
Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.170502 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-r57jd\" (UniqueName: \"kubernetes.io/projected/eb225299-e366-4be1-8d6f-7419220bc147-kube-api-access-r57jd\") pod \"nmstate-handler-gv79c\" (UID: \"eb225299-e366-4be1-8d6f-7419220bc147\") " pod="openshift-nmstate/nmstate-handler-gv79c" Mar 18 12:27:35 crc kubenswrapper[4975]: E0318 12:27:35.170547 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f3d4654-fe47-469f-bda9-d8c111d8f22d-tls-key-pair podName:8f3d4654-fe47-469f-bda9-d8c111d8f22d nodeName:}" failed. No retries permitted until 2026-03-18 12:27:35.670509505 +0000 UTC m=+1041.384910074 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/8f3d4654-fe47-469f-bda9-d8c111d8f22d-tls-key-pair") pod "nmstate-webhook-5f558f5558-qv8h5" (UID: "8f3d4654-fe47-469f-bda9-d8c111d8f22d") : secret "openshift-nmstate-webhook" not found Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.170567 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6g99\" (UniqueName: \"kubernetes.io/projected/c7cf9899-3af1-426c-9f6d-8a157c4cdd02-kube-api-access-f6g99\") pod \"nmstate-metrics-9b8c8685d-k77j9\" (UID: \"c7cf9899-3af1-426c-9f6d-8a157c4cdd02\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-k77j9" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.170968 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwksk\" (UniqueName: \"kubernetes.io/projected/8f3d4654-fe47-469f-bda9-d8c111d8f22d-kube-api-access-fwksk\") pod \"nmstate-webhook-5f558f5558-qv8h5\" (UID: \"8f3d4654-fe47-469f-bda9-d8c111d8f22d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-qv8h5" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.191826 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6g99\" (UniqueName: 
\"kubernetes.io/projected/c7cf9899-3af1-426c-9f6d-8a157c4cdd02-kube-api-access-f6g99\") pod \"nmstate-metrics-9b8c8685d-k77j9\" (UID: \"c7cf9899-3af1-426c-9f6d-8a157c4cdd02\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-k77j9" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.203271 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwksk\" (UniqueName: \"kubernetes.io/projected/8f3d4654-fe47-469f-bda9-d8c111d8f22d-kube-api-access-fwksk\") pod \"nmstate-webhook-5f558f5558-qv8h5\" (UID: \"8f3d4654-fe47-469f-bda9-d8c111d8f22d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-qv8h5" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.216074 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-nxq2q"] Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.216681 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nxq2q" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.224512 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.224545 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-5l6zm" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.228553 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.249723 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-nxq2q"] Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.276028 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r57jd\" (UniqueName: \"kubernetes.io/projected/eb225299-e366-4be1-8d6f-7419220bc147-kube-api-access-r57jd\") pod 
\"nmstate-handler-gv79c\" (UID: \"eb225299-e366-4be1-8d6f-7419220bc147\") " pod="openshift-nmstate/nmstate-handler-gv79c" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.276086 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/eb225299-e366-4be1-8d6f-7419220bc147-nmstate-lock\") pod \"nmstate-handler-gv79c\" (UID: \"eb225299-e366-4be1-8d6f-7419220bc147\") " pod="openshift-nmstate/nmstate-handler-gv79c" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.276124 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/eb225299-e366-4be1-8d6f-7419220bc147-dbus-socket\") pod \"nmstate-handler-gv79c\" (UID: \"eb225299-e366-4be1-8d6f-7419220bc147\") " pod="openshift-nmstate/nmstate-handler-gv79c" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.276162 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/eb225299-e366-4be1-8d6f-7419220bc147-ovs-socket\") pod \"nmstate-handler-gv79c\" (UID: \"eb225299-e366-4be1-8d6f-7419220bc147\") " pod="openshift-nmstate/nmstate-handler-gv79c" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.276780 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/eb225299-e366-4be1-8d6f-7419220bc147-nmstate-lock\") pod \"nmstate-handler-gv79c\" (UID: \"eb225299-e366-4be1-8d6f-7419220bc147\") " pod="openshift-nmstate/nmstate-handler-gv79c" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.277118 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/eb225299-e366-4be1-8d6f-7419220bc147-dbus-socket\") pod \"nmstate-handler-gv79c\" (UID: \"eb225299-e366-4be1-8d6f-7419220bc147\") " pod="openshift-nmstate/nmstate-handler-gv79c" Mar 18 
12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.277148 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/eb225299-e366-4be1-8d6f-7419220bc147-ovs-socket\") pod \"nmstate-handler-gv79c\" (UID: \"eb225299-e366-4be1-8d6f-7419220bc147\") " pod="openshift-nmstate/nmstate-handler-gv79c" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.292618 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-k77j9" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.305678 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r57jd\" (UniqueName: \"kubernetes.io/projected/eb225299-e366-4be1-8d6f-7419220bc147-kube-api-access-r57jd\") pod \"nmstate-handler-gv79c\" (UID: \"eb225299-e366-4be1-8d6f-7419220bc147\") " pod="openshift-nmstate/nmstate-handler-gv79c" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.377364 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsrlm\" (UniqueName: \"kubernetes.io/projected/7e7595f3-c2ff-4b57-90ad-3310743b2291-kube-api-access-qsrlm\") pod \"nmstate-console-plugin-86f58fcf4-nxq2q\" (UID: \"7e7595f3-c2ff-4b57-90ad-3310743b2291\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nxq2q" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.377598 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7e7595f3-c2ff-4b57-90ad-3310743b2291-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-nxq2q\" (UID: \"7e7595f3-c2ff-4b57-90ad-3310743b2291\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nxq2q" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.377657 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7e7595f3-c2ff-4b57-90ad-3310743b2291-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-nxq2q\" (UID: \"7e7595f3-c2ff-4b57-90ad-3310743b2291\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nxq2q" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.417132 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-gv79c" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.425846 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-57f86f9565-2kscf"] Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.426746 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:35 crc kubenswrapper[4975]: W0318 12:27:35.440330 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb225299_e366_4be1_8d6f_7419220bc147.slice/crio-8a0f49d311131007f317a4055fa08bd949bab1b94b43d1a72ed24aa00a7e4743 WatchSource:0}: Error finding container 8a0f49d311131007f317a4055fa08bd949bab1b94b43d1a72ed24aa00a7e4743: Status 404 returned error can't find the container with id 8a0f49d311131007f317a4055fa08bd949bab1b94b43d1a72ed24aa00a7e4743 Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.451746 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57f86f9565-2kscf"] Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.490168 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7e7595f3-c2ff-4b57-90ad-3310743b2291-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-nxq2q\" (UID: \"7e7595f3-c2ff-4b57-90ad-3310743b2291\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nxq2q" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 
12:27:35.490292 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsrlm\" (UniqueName: \"kubernetes.io/projected/7e7595f3-c2ff-4b57-90ad-3310743b2291-kube-api-access-qsrlm\") pod \"nmstate-console-plugin-86f58fcf4-nxq2q\" (UID: \"7e7595f3-c2ff-4b57-90ad-3310743b2291\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nxq2q" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.490322 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7e7595f3-c2ff-4b57-90ad-3310743b2291-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-nxq2q\" (UID: \"7e7595f3-c2ff-4b57-90ad-3310743b2291\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nxq2q" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.491304 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7e7595f3-c2ff-4b57-90ad-3310743b2291-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-nxq2q\" (UID: \"7e7595f3-c2ff-4b57-90ad-3310743b2291\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nxq2q" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.494842 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7e7595f3-c2ff-4b57-90ad-3310743b2291-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-nxq2q\" (UID: \"7e7595f3-c2ff-4b57-90ad-3310743b2291\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nxq2q" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.519556 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsrlm\" (UniqueName: \"kubernetes.io/projected/7e7595f3-c2ff-4b57-90ad-3310743b2291-kube-api-access-qsrlm\") pod \"nmstate-console-plugin-86f58fcf4-nxq2q\" (UID: \"7e7595f3-c2ff-4b57-90ad-3310743b2291\") " 
pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nxq2q" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.547207 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nxq2q" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.591027 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc858b0e-d796-451c-9a65-89e02640b901-console-oauth-config\") pod \"console-57f86f9565-2kscf\" (UID: \"bc858b0e-d796-451c-9a65-89e02640b901\") " pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.591071 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc858b0e-d796-451c-9a65-89e02640b901-service-ca\") pod \"console-57f86f9565-2kscf\" (UID: \"bc858b0e-d796-451c-9a65-89e02640b901\") " pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.591133 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc858b0e-d796-451c-9a65-89e02640b901-oauth-serving-cert\") pod \"console-57f86f9565-2kscf\" (UID: \"bc858b0e-d796-451c-9a65-89e02640b901\") " pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.591171 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc858b0e-d796-451c-9a65-89e02640b901-trusted-ca-bundle\") pod \"console-57f86f9565-2kscf\" (UID: \"bc858b0e-d796-451c-9a65-89e02640b901\") " pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.591232 4975 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc858b0e-d796-451c-9a65-89e02640b901-console-config\") pod \"console-57f86f9565-2kscf\" (UID: \"bc858b0e-d796-451c-9a65-89e02640b901\") " pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.591257 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc858b0e-d796-451c-9a65-89e02640b901-console-serving-cert\") pod \"console-57f86f9565-2kscf\" (UID: \"bc858b0e-d796-451c-9a65-89e02640b901\") " pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.591283 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sgtd\" (UniqueName: \"kubernetes.io/projected/bc858b0e-d796-451c-9a65-89e02640b901-kube-api-access-9sgtd\") pod \"console-57f86f9565-2kscf\" (UID: \"bc858b0e-d796-451c-9a65-89e02640b901\") " pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.692463 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc858b0e-d796-451c-9a65-89e02640b901-trusted-ca-bundle\") pod \"console-57f86f9565-2kscf\" (UID: \"bc858b0e-d796-451c-9a65-89e02640b901\") " pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.692527 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc858b0e-d796-451c-9a65-89e02640b901-console-config\") pod \"console-57f86f9565-2kscf\" (UID: \"bc858b0e-d796-451c-9a65-89e02640b901\") " pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:35 crc 
kubenswrapper[4975]: I0318 12:27:35.692545 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc858b0e-d796-451c-9a65-89e02640b901-console-serving-cert\") pod \"console-57f86f9565-2kscf\" (UID: \"bc858b0e-d796-451c-9a65-89e02640b901\") " pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.692564 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sgtd\" (UniqueName: \"kubernetes.io/projected/bc858b0e-d796-451c-9a65-89e02640b901-kube-api-access-9sgtd\") pod \"console-57f86f9565-2kscf\" (UID: \"bc858b0e-d796-451c-9a65-89e02640b901\") " pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.692588 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8f3d4654-fe47-469f-bda9-d8c111d8f22d-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-qv8h5\" (UID: \"8f3d4654-fe47-469f-bda9-d8c111d8f22d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-qv8h5" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.692610 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc858b0e-d796-451c-9a65-89e02640b901-console-oauth-config\") pod \"console-57f86f9565-2kscf\" (UID: \"bc858b0e-d796-451c-9a65-89e02640b901\") " pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.692634 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc858b0e-d796-451c-9a65-89e02640b901-service-ca\") pod \"console-57f86f9565-2kscf\" (UID: \"bc858b0e-d796-451c-9a65-89e02640b901\") " pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:35 crc kubenswrapper[4975]: 
I0318 12:27:35.692679 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc858b0e-d796-451c-9a65-89e02640b901-oauth-serving-cert\") pod \"console-57f86f9565-2kscf\" (UID: \"bc858b0e-d796-451c-9a65-89e02640b901\") " pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.693629 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc858b0e-d796-451c-9a65-89e02640b901-trusted-ca-bundle\") pod \"console-57f86f9565-2kscf\" (UID: \"bc858b0e-d796-451c-9a65-89e02640b901\") " pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.693668 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc858b0e-d796-451c-9a65-89e02640b901-oauth-serving-cert\") pod \"console-57f86f9565-2kscf\" (UID: \"bc858b0e-d796-451c-9a65-89e02640b901\") " pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.693633 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc858b0e-d796-451c-9a65-89e02640b901-console-config\") pod \"console-57f86f9565-2kscf\" (UID: \"bc858b0e-d796-451c-9a65-89e02640b901\") " pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.693829 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc858b0e-d796-451c-9a65-89e02640b901-service-ca\") pod \"console-57f86f9565-2kscf\" (UID: \"bc858b0e-d796-451c-9a65-89e02640b901\") " pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.699558 4975 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc858b0e-d796-451c-9a65-89e02640b901-console-oauth-config\") pod \"console-57f86f9565-2kscf\" (UID: \"bc858b0e-d796-451c-9a65-89e02640b901\") " pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.699630 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8f3d4654-fe47-469f-bda9-d8c111d8f22d-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-qv8h5\" (UID: \"8f3d4654-fe47-469f-bda9-d8c111d8f22d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-qv8h5" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.702413 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc858b0e-d796-451c-9a65-89e02640b901-console-serving-cert\") pod \"console-57f86f9565-2kscf\" (UID: \"bc858b0e-d796-451c-9a65-89e02640b901\") " pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.711181 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sgtd\" (UniqueName: \"kubernetes.io/projected/bc858b0e-d796-451c-9a65-89e02640b901-kube-api-access-9sgtd\") pod \"console-57f86f9565-2kscf\" (UID: \"bc858b0e-d796-451c-9a65-89e02640b901\") " pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.768489 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-nxq2q"] Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.819503 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.857779 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-k77j9"] Mar 18 12:27:35 crc kubenswrapper[4975]: W0318 12:27:35.861304 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7cf9899_3af1_426c_9f6d_8a157c4cdd02.slice/crio-3e86893ddc629c30e25333c3d5d1283ca394fc92047f0c1c8e832c62bc79d8ea WatchSource:0}: Error finding container 3e86893ddc629c30e25333c3d5d1283ca394fc92047f0c1c8e832c62bc79d8ea: Status 404 returned error can't find the container with id 3e86893ddc629c30e25333c3d5d1283ca394fc92047f0c1c8e832c62bc79d8ea Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.926343 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qv8h5" Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.995734 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nxq2q" event={"ID":"7e7595f3-c2ff-4b57-90ad-3310743b2291","Type":"ContainerStarted","Data":"bfedda675a8e80aec5a2506f9dd214676238fafda8a99e33a334354c71b7e505"} Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.996927 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-gv79c" event={"ID":"eb225299-e366-4be1-8d6f-7419220bc147","Type":"ContainerStarted","Data":"8a0f49d311131007f317a4055fa08bd949bab1b94b43d1a72ed24aa00a7e4743"} Mar 18 12:27:35 crc kubenswrapper[4975]: I0318 12:27:35.997133 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57f86f9565-2kscf"] Mar 18 12:27:36 crc kubenswrapper[4975]: I0318 12:27:36.000103 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-k77j9" 
event={"ID":"c7cf9899-3af1-426c-9f6d-8a157c4cdd02","Type":"ContainerStarted","Data":"3e86893ddc629c30e25333c3d5d1283ca394fc92047f0c1c8e832c62bc79d8ea"} Mar 18 12:27:36 crc kubenswrapper[4975]: I0318 12:27:36.334472 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-qv8h5"] Mar 18 12:27:36 crc kubenswrapper[4975]: W0318 12:27:36.342691 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f3d4654_fe47_469f_bda9_d8c111d8f22d.slice/crio-f59179cccc9df04948cba0d426aab6360245319e81b44b43199affe792f3e0bf WatchSource:0}: Error finding container f59179cccc9df04948cba0d426aab6360245319e81b44b43199affe792f3e0bf: Status 404 returned error can't find the container with id f59179cccc9df04948cba0d426aab6360245319e81b44b43199affe792f3e0bf Mar 18 12:27:37 crc kubenswrapper[4975]: I0318 12:27:37.008605 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57f86f9565-2kscf" event={"ID":"bc858b0e-d796-451c-9a65-89e02640b901","Type":"ContainerStarted","Data":"9c3f1fc40117692ad7a47588ed1b30b5804b22cd28c720a3251aa81942d07d6c"} Mar 18 12:27:37 crc kubenswrapper[4975]: I0318 12:27:37.009018 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57f86f9565-2kscf" event={"ID":"bc858b0e-d796-451c-9a65-89e02640b901","Type":"ContainerStarted","Data":"5f01c37b724855878808dcca75d843761c835abbe696145792ff09b7d51ca29f"} Mar 18 12:27:37 crc kubenswrapper[4975]: I0318 12:27:37.010246 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qv8h5" event={"ID":"8f3d4654-fe47-469f-bda9-d8c111d8f22d","Type":"ContainerStarted","Data":"f59179cccc9df04948cba0d426aab6360245319e81b44b43199affe792f3e0bf"} Mar 18 12:27:39 crc kubenswrapper[4975]: I0318 12:27:39.026689 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-webhook-5f558f5558-qv8h5" event={"ID":"8f3d4654-fe47-469f-bda9-d8c111d8f22d","Type":"ContainerStarted","Data":"8215b7765d5cfd6ed4f76ba991ba819f956e180846fcc6a4b3454c1d950d3eed"} Mar 18 12:27:39 crc kubenswrapper[4975]: I0318 12:27:39.028277 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qv8h5" Mar 18 12:27:39 crc kubenswrapper[4975]: I0318 12:27:39.029613 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nxq2q" event={"ID":"7e7595f3-c2ff-4b57-90ad-3310743b2291","Type":"ContainerStarted","Data":"7b6bdd91693bbdd799d243594f57ff9a14ac9f3ce05332e589e5f6771e767742"} Mar 18 12:27:39 crc kubenswrapper[4975]: I0318 12:27:39.032475 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-gv79c" event={"ID":"eb225299-e366-4be1-8d6f-7419220bc147","Type":"ContainerStarted","Data":"61428effe5289cd8f63a0ba0fce691438f1cbfd8f58f76330699d2c87499c4ac"} Mar 18 12:27:39 crc kubenswrapper[4975]: I0318 12:27:39.032650 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-gv79c" Mar 18 12:27:39 crc kubenswrapper[4975]: I0318 12:27:39.033712 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-k77j9" event={"ID":"c7cf9899-3af1-426c-9f6d-8a157c4cdd02","Type":"ContainerStarted","Data":"ed7f046b94088f39928621414ffa6ff68b53ca7319bd88c5e95db43a01eb9b79"} Mar 18 12:27:39 crc kubenswrapper[4975]: I0318 12:27:39.056064 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-57f86f9565-2kscf" podStartSLOduration=4.056032401 podStartE2EDuration="4.056032401s" podCreationTimestamp="2026-03-18 12:27:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 
12:27:37.035479356 +0000 UTC m=+1042.749879935" watchObservedRunningTime="2026-03-18 12:27:39.056032401 +0000 UTC m=+1044.770432980" Mar 18 12:27:39 crc kubenswrapper[4975]: I0318 12:27:39.062688 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-gv79c" podStartSLOduration=0.875544675 podStartE2EDuration="4.062667732s" podCreationTimestamp="2026-03-18 12:27:35 +0000 UTC" firstStartedPulling="2026-03-18 12:27:35.443154061 +0000 UTC m=+1041.157554640" lastFinishedPulling="2026-03-18 12:27:38.630277118 +0000 UTC m=+1044.344677697" observedRunningTime="2026-03-18 12:27:39.062337633 +0000 UTC m=+1044.776738232" watchObservedRunningTime="2026-03-18 12:27:39.062667732 +0000 UTC m=+1044.777068311" Mar 18 12:27:39 crc kubenswrapper[4975]: I0318 12:27:39.066627 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qv8h5" podStartSLOduration=2.779726722 podStartE2EDuration="5.06660751s" podCreationTimestamp="2026-03-18 12:27:34 +0000 UTC" firstStartedPulling="2026-03-18 12:27:36.344245473 +0000 UTC m=+1042.058646052" lastFinishedPulling="2026-03-18 12:27:38.631126261 +0000 UTC m=+1044.345526840" observedRunningTime="2026-03-18 12:27:39.043676423 +0000 UTC m=+1044.758077022" watchObservedRunningTime="2026-03-18 12:27:39.06660751 +0000 UTC m=+1044.781008099" Mar 18 12:27:42 crc kubenswrapper[4975]: I0318 12:27:42.056280 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-k77j9" event={"ID":"c7cf9899-3af1-426c-9f6d-8a157c4cdd02","Type":"ContainerStarted","Data":"586190c6a3fa4603ce5879ef3d06f17cd025dd0f1b740697993b206dffa04cf9"} Mar 18 12:27:42 crc kubenswrapper[4975]: I0318 12:27:42.072201 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-k77j9" podStartSLOduration=2.756083855 podStartE2EDuration="8.072177743s" 
podCreationTimestamp="2026-03-18 12:27:34 +0000 UTC" firstStartedPulling="2026-03-18 12:27:35.862748055 +0000 UTC m=+1041.577148634" lastFinishedPulling="2026-03-18 12:27:41.178841943 +0000 UTC m=+1046.893242522" observedRunningTime="2026-03-18 12:27:42.07136903 +0000 UTC m=+1047.785769619" watchObservedRunningTime="2026-03-18 12:27:42.072177743 +0000 UTC m=+1047.786578322" Mar 18 12:27:42 crc kubenswrapper[4975]: I0318 12:27:42.076265 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nxq2q" podStartSLOduration=4.221749503 podStartE2EDuration="7.076235674s" podCreationTimestamp="2026-03-18 12:27:35 +0000 UTC" firstStartedPulling="2026-03-18 12:27:35.770296877 +0000 UTC m=+1041.484697456" lastFinishedPulling="2026-03-18 12:27:38.624783048 +0000 UTC m=+1044.339183627" observedRunningTime="2026-03-18 12:27:39.076640254 +0000 UTC m=+1044.791040833" watchObservedRunningTime="2026-03-18 12:27:42.076235674 +0000 UTC m=+1047.790636253" Mar 18 12:27:45 crc kubenswrapper[4975]: I0318 12:27:45.452682 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-gv79c" Mar 18 12:27:45 crc kubenswrapper[4975]: I0318 12:27:45.820496 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:45 crc kubenswrapper[4975]: I0318 12:27:45.821466 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:45 crc kubenswrapper[4975]: I0318 12:27:45.827031 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:46 crc kubenswrapper[4975]: I0318 12:27:46.110161 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-57f86f9565-2kscf" Mar 18 12:27:46 crc kubenswrapper[4975]: I0318 
12:27:46.162353 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-z69nv"] Mar 18 12:27:55 crc kubenswrapper[4975]: I0318 12:27:55.933240 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qv8h5" Mar 18 12:28:00 crc kubenswrapper[4975]: I0318 12:28:00.135464 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563948-hnfx6"] Mar 18 12:28:00 crc kubenswrapper[4975]: I0318 12:28:00.138272 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563948-hnfx6" Mar 18 12:28:00 crc kubenswrapper[4975]: I0318 12:28:00.143538 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:28:00 crc kubenswrapper[4975]: I0318 12:28:00.143538 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:28:00 crc kubenswrapper[4975]: I0318 12:28:00.143841 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 12:28:00 crc kubenswrapper[4975]: I0318 12:28:00.145604 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563948-hnfx6"] Mar 18 12:28:00 crc kubenswrapper[4975]: I0318 12:28:00.325751 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8ctm\" (UniqueName: \"kubernetes.io/projected/d745b1ee-2087-4b83-a5ea-5519ef205da0-kube-api-access-v8ctm\") pod \"auto-csr-approver-29563948-hnfx6\" (UID: \"d745b1ee-2087-4b83-a5ea-5519ef205da0\") " pod="openshift-infra/auto-csr-approver-29563948-hnfx6" Mar 18 12:28:00 crc kubenswrapper[4975]: I0318 12:28:00.427371 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8ctm\" 
(UniqueName: \"kubernetes.io/projected/d745b1ee-2087-4b83-a5ea-5519ef205da0-kube-api-access-v8ctm\") pod \"auto-csr-approver-29563948-hnfx6\" (UID: \"d745b1ee-2087-4b83-a5ea-5519ef205da0\") " pod="openshift-infra/auto-csr-approver-29563948-hnfx6" Mar 18 12:28:00 crc kubenswrapper[4975]: I0318 12:28:00.461842 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8ctm\" (UniqueName: \"kubernetes.io/projected/d745b1ee-2087-4b83-a5ea-5519ef205da0-kube-api-access-v8ctm\") pod \"auto-csr-approver-29563948-hnfx6\" (UID: \"d745b1ee-2087-4b83-a5ea-5519ef205da0\") " pod="openshift-infra/auto-csr-approver-29563948-hnfx6" Mar 18 12:28:00 crc kubenswrapper[4975]: I0318 12:28:00.760939 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563948-hnfx6" Mar 18 12:28:01 crc kubenswrapper[4975]: I0318 12:28:01.146193 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563948-hnfx6"] Mar 18 12:28:01 crc kubenswrapper[4975]: W0318 12:28:01.152917 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd745b1ee_2087_4b83_a5ea_5519ef205da0.slice/crio-79e7b873ab31dfc46de330dd73b12422f5635d2c5f7d0e3056b618f82a9fc740 WatchSource:0}: Error finding container 79e7b873ab31dfc46de330dd73b12422f5635d2c5f7d0e3056b618f82a9fc740: Status 404 returned error can't find the container with id 79e7b873ab31dfc46de330dd73b12422f5635d2c5f7d0e3056b618f82a9fc740 Mar 18 12:28:01 crc kubenswrapper[4975]: I0318 12:28:01.188111 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563948-hnfx6" event={"ID":"d745b1ee-2087-4b83-a5ea-5519ef205da0","Type":"ContainerStarted","Data":"79e7b873ab31dfc46de330dd73b12422f5635d2c5f7d0e3056b618f82a9fc740"} Mar 18 12:28:03 crc kubenswrapper[4975]: I0318 12:28:03.213732 4975 generic.go:334] "Generic (PLEG): 
container finished" podID="d745b1ee-2087-4b83-a5ea-5519ef205da0" containerID="43d38ad3d8849c53de8ed8f05e835cd685ee0a1b677e65c47e94f8d76b14ed2e" exitCode=0 Mar 18 12:28:03 crc kubenswrapper[4975]: I0318 12:28:03.213801 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563948-hnfx6" event={"ID":"d745b1ee-2087-4b83-a5ea-5519ef205da0","Type":"ContainerDied","Data":"43d38ad3d8849c53de8ed8f05e835cd685ee0a1b677e65c47e94f8d76b14ed2e"} Mar 18 12:28:04 crc kubenswrapper[4975]: I0318 12:28:04.441170 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563948-hnfx6" Mar 18 12:28:04 crc kubenswrapper[4975]: I0318 12:28:04.585531 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8ctm\" (UniqueName: \"kubernetes.io/projected/d745b1ee-2087-4b83-a5ea-5519ef205da0-kube-api-access-v8ctm\") pod \"d745b1ee-2087-4b83-a5ea-5519ef205da0\" (UID: \"d745b1ee-2087-4b83-a5ea-5519ef205da0\") " Mar 18 12:28:04 crc kubenswrapper[4975]: I0318 12:28:04.591299 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d745b1ee-2087-4b83-a5ea-5519ef205da0-kube-api-access-v8ctm" (OuterVolumeSpecName: "kube-api-access-v8ctm") pod "d745b1ee-2087-4b83-a5ea-5519ef205da0" (UID: "d745b1ee-2087-4b83-a5ea-5519ef205da0"). InnerVolumeSpecName "kube-api-access-v8ctm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:28:04 crc kubenswrapper[4975]: I0318 12:28:04.686629 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8ctm\" (UniqueName: \"kubernetes.io/projected/d745b1ee-2087-4b83-a5ea-5519ef205da0-kube-api-access-v8ctm\") on node \"crc\" DevicePath \"\"" Mar 18 12:28:05 crc kubenswrapper[4975]: I0318 12:28:05.227446 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563948-hnfx6" event={"ID":"d745b1ee-2087-4b83-a5ea-5519ef205da0","Type":"ContainerDied","Data":"79e7b873ab31dfc46de330dd73b12422f5635d2c5f7d0e3056b618f82a9fc740"} Mar 18 12:28:05 crc kubenswrapper[4975]: I0318 12:28:05.227748 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79e7b873ab31dfc46de330dd73b12422f5635d2c5f7d0e3056b618f82a9fc740" Mar 18 12:28:05 crc kubenswrapper[4975]: I0318 12:28:05.227477 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563948-hnfx6" Mar 18 12:28:05 crc kubenswrapper[4975]: I0318 12:28:05.491019 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563942-nqstt"] Mar 18 12:28:05 crc kubenswrapper[4975]: I0318 12:28:05.493240 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563942-nqstt"] Mar 18 12:28:07 crc kubenswrapper[4975]: I0318 12:28:07.023838 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c77106cf-2987-45dc-ad75-98f5c4ae2fd7" path="/var/lib/kubelet/pods/c77106cf-2987-45dc-ad75-98f5c4ae2fd7/volumes" Mar 18 12:28:08 crc kubenswrapper[4975]: I0318 12:28:08.154148 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw"] Mar 18 12:28:08 crc kubenswrapper[4975]: E0318 12:28:08.154405 4975 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="d745b1ee-2087-4b83-a5ea-5519ef205da0" containerName="oc" Mar 18 12:28:08 crc kubenswrapper[4975]: I0318 12:28:08.154417 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="d745b1ee-2087-4b83-a5ea-5519ef205da0" containerName="oc" Mar 18 12:28:08 crc kubenswrapper[4975]: I0318 12:28:08.154540 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="d745b1ee-2087-4b83-a5ea-5519ef205da0" containerName="oc" Mar 18 12:28:08 crc kubenswrapper[4975]: I0318 12:28:08.155289 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw" Mar 18 12:28:08 crc kubenswrapper[4975]: I0318 12:28:08.157890 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 12:28:08 crc kubenswrapper[4975]: I0318 12:28:08.166030 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw"] Mar 18 12:28:08 crc kubenswrapper[4975]: I0318 12:28:08.334318 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0848ced-9548-4c63-826b-12e73deed42d-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw\" (UID: \"b0848ced-9548-4c63-826b-12e73deed42d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw" Mar 18 12:28:08 crc kubenswrapper[4975]: I0318 12:28:08.334366 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0848ced-9548-4c63-826b-12e73deed42d-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw\" (UID: \"b0848ced-9548-4c63-826b-12e73deed42d\") " 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw" Mar 18 12:28:08 crc kubenswrapper[4975]: I0318 12:28:08.334435 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v77f\" (UniqueName: \"kubernetes.io/projected/b0848ced-9548-4c63-826b-12e73deed42d-kube-api-access-5v77f\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw\" (UID: \"b0848ced-9548-4c63-826b-12e73deed42d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw" Mar 18 12:28:08 crc kubenswrapper[4975]: I0318 12:28:08.435585 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0848ced-9548-4c63-826b-12e73deed42d-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw\" (UID: \"b0848ced-9548-4c63-826b-12e73deed42d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw" Mar 18 12:28:08 crc kubenswrapper[4975]: I0318 12:28:08.435657 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0848ced-9548-4c63-826b-12e73deed42d-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw\" (UID: \"b0848ced-9548-4c63-826b-12e73deed42d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw" Mar 18 12:28:08 crc kubenswrapper[4975]: I0318 12:28:08.435698 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v77f\" (UniqueName: \"kubernetes.io/projected/b0848ced-9548-4c63-826b-12e73deed42d-kube-api-access-5v77f\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw\" (UID: \"b0848ced-9548-4c63-826b-12e73deed42d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw" Mar 18 
12:28:08 crc kubenswrapper[4975]: I0318 12:28:08.436257 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0848ced-9548-4c63-826b-12e73deed42d-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw\" (UID: \"b0848ced-9548-4c63-826b-12e73deed42d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw" Mar 18 12:28:08 crc kubenswrapper[4975]: I0318 12:28:08.436482 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0848ced-9548-4c63-826b-12e73deed42d-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw\" (UID: \"b0848ced-9548-4c63-826b-12e73deed42d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw" Mar 18 12:28:08 crc kubenswrapper[4975]: I0318 12:28:08.458048 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v77f\" (UniqueName: \"kubernetes.io/projected/b0848ced-9548-4c63-826b-12e73deed42d-kube-api-access-5v77f\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw\" (UID: \"b0848ced-9548-4c63-826b-12e73deed42d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw" Mar 18 12:28:08 crc kubenswrapper[4975]: I0318 12:28:08.474794 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw" Mar 18 12:28:08 crc kubenswrapper[4975]: I0318 12:28:08.888415 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw"] Mar 18 12:28:09 crc kubenswrapper[4975]: I0318 12:28:09.251368 4975 generic.go:334] "Generic (PLEG): container finished" podID="b0848ced-9548-4c63-826b-12e73deed42d" containerID="9e5fed0d5c5b76fdcbb2ebbe4a5375a85bcc98a38a0afbe99377b444eaa812e8" exitCode=0 Mar 18 12:28:09 crc kubenswrapper[4975]: I0318 12:28:09.251409 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw" event={"ID":"b0848ced-9548-4c63-826b-12e73deed42d","Type":"ContainerDied","Data":"9e5fed0d5c5b76fdcbb2ebbe4a5375a85bcc98a38a0afbe99377b444eaa812e8"} Mar 18 12:28:09 crc kubenswrapper[4975]: I0318 12:28:09.251431 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw" event={"ID":"b0848ced-9548-4c63-826b-12e73deed42d","Type":"ContainerStarted","Data":"b3e5fc9d1a7cf0a37ad5313b8ff63c37c779dca7319f31833defe61cb86c59d8"} Mar 18 12:28:11 crc kubenswrapper[4975]: I0318 12:28:11.202617 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-z69nv" podUID="311fa18b-fde1-4390-9682-75c836813f88" containerName="console" containerID="cri-o://3012d584f83eb6aafb3fef46800471cc0de67f79de71722efff466d66f5b5c5c" gracePeriod=15 Mar 18 12:28:11 crc kubenswrapper[4975]: I0318 12:28:11.267030 4975 generic.go:334] "Generic (PLEG): container finished" podID="b0848ced-9548-4c63-826b-12e73deed42d" containerID="f50d1d6cbb32cb13435c1fb6db8cf33ed43aace8cd04edb0fb3412063138444b" exitCode=0 Mar 18 12:28:11 crc kubenswrapper[4975]: I0318 12:28:11.267073 4975 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw" event={"ID":"b0848ced-9548-4c63-826b-12e73deed42d","Type":"ContainerDied","Data":"f50d1d6cbb32cb13435c1fb6db8cf33ed43aace8cd04edb0fb3412063138444b"} Mar 18 12:28:11 crc kubenswrapper[4975]: I0318 12:28:11.553060 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-z69nv_311fa18b-fde1-4390-9682-75c836813f88/console/0.log" Mar 18 12:28:11 crc kubenswrapper[4975]: I0318 12:28:11.553505 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:28:11 crc kubenswrapper[4975]: I0318 12:28:11.676404 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs9ss\" (UniqueName: \"kubernetes.io/projected/311fa18b-fde1-4390-9682-75c836813f88-kube-api-access-cs9ss\") pod \"311fa18b-fde1-4390-9682-75c836813f88\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " Mar 18 12:28:11 crc kubenswrapper[4975]: I0318 12:28:11.676470 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/311fa18b-fde1-4390-9682-75c836813f88-console-config\") pod \"311fa18b-fde1-4390-9682-75c836813f88\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " Mar 18 12:28:11 crc kubenswrapper[4975]: I0318 12:28:11.676491 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/311fa18b-fde1-4390-9682-75c836813f88-service-ca\") pod \"311fa18b-fde1-4390-9682-75c836813f88\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " Mar 18 12:28:11 crc kubenswrapper[4975]: I0318 12:28:11.676525 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/311fa18b-fde1-4390-9682-75c836813f88-oauth-serving-cert\") pod \"311fa18b-fde1-4390-9682-75c836813f88\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " Mar 18 12:28:11 crc kubenswrapper[4975]: I0318 12:28:11.676548 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/311fa18b-fde1-4390-9682-75c836813f88-trusted-ca-bundle\") pod \"311fa18b-fde1-4390-9682-75c836813f88\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " Mar 18 12:28:11 crc kubenswrapper[4975]: I0318 12:28:11.676577 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/311fa18b-fde1-4390-9682-75c836813f88-console-serving-cert\") pod \"311fa18b-fde1-4390-9682-75c836813f88\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " Mar 18 12:28:11 crc kubenswrapper[4975]: I0318 12:28:11.676629 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/311fa18b-fde1-4390-9682-75c836813f88-console-oauth-config\") pod \"311fa18b-fde1-4390-9682-75c836813f88\" (UID: \"311fa18b-fde1-4390-9682-75c836813f88\") " Mar 18 12:28:11 crc kubenswrapper[4975]: I0318 12:28:11.677299 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/311fa18b-fde1-4390-9682-75c836813f88-console-config" (OuterVolumeSpecName: "console-config") pod "311fa18b-fde1-4390-9682-75c836813f88" (UID: "311fa18b-fde1-4390-9682-75c836813f88"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:28:11 crc kubenswrapper[4975]: I0318 12:28:11.677310 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/311fa18b-fde1-4390-9682-75c836813f88-service-ca" (OuterVolumeSpecName: "service-ca") pod "311fa18b-fde1-4390-9682-75c836813f88" (UID: "311fa18b-fde1-4390-9682-75c836813f88"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:28:11 crc kubenswrapper[4975]: I0318 12:28:11.677355 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/311fa18b-fde1-4390-9682-75c836813f88-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "311fa18b-fde1-4390-9682-75c836813f88" (UID: "311fa18b-fde1-4390-9682-75c836813f88"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:28:11 crc kubenswrapper[4975]: I0318 12:28:11.677695 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/311fa18b-fde1-4390-9682-75c836813f88-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "311fa18b-fde1-4390-9682-75c836813f88" (UID: "311fa18b-fde1-4390-9682-75c836813f88"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:28:11 crc kubenswrapper[4975]: I0318 12:28:11.681880 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/311fa18b-fde1-4390-9682-75c836813f88-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "311fa18b-fde1-4390-9682-75c836813f88" (UID: "311fa18b-fde1-4390-9682-75c836813f88"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:28:11 crc kubenswrapper[4975]: I0318 12:28:11.681964 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/311fa18b-fde1-4390-9682-75c836813f88-kube-api-access-cs9ss" (OuterVolumeSpecName: "kube-api-access-cs9ss") pod "311fa18b-fde1-4390-9682-75c836813f88" (UID: "311fa18b-fde1-4390-9682-75c836813f88"). InnerVolumeSpecName "kube-api-access-cs9ss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:28:11 crc kubenswrapper[4975]: I0318 12:28:11.682376 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/311fa18b-fde1-4390-9682-75c836813f88-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "311fa18b-fde1-4390-9682-75c836813f88" (UID: "311fa18b-fde1-4390-9682-75c836813f88"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:28:11 crc kubenswrapper[4975]: I0318 12:28:11.778533 4975 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/311fa18b-fde1-4390-9682-75c836813f88-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:28:11 crc kubenswrapper[4975]: I0318 12:28:11.778594 4975 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/311fa18b-fde1-4390-9682-75c836813f88-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:28:11 crc kubenswrapper[4975]: I0318 12:28:11.778608 4975 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/311fa18b-fde1-4390-9682-75c836813f88-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:28:11 crc kubenswrapper[4975]: I0318 12:28:11.778619 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cs9ss\" (UniqueName: 
\"kubernetes.io/projected/311fa18b-fde1-4390-9682-75c836813f88-kube-api-access-cs9ss\") on node \"crc\" DevicePath \"\"" Mar 18 12:28:11 crc kubenswrapper[4975]: I0318 12:28:11.778631 4975 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/311fa18b-fde1-4390-9682-75c836813f88-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:28:11 crc kubenswrapper[4975]: I0318 12:28:11.778644 4975 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/311fa18b-fde1-4390-9682-75c836813f88-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 12:28:11 crc kubenswrapper[4975]: I0318 12:28:11.778654 4975 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/311fa18b-fde1-4390-9682-75c836813f88-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 12:28:12 crc kubenswrapper[4975]: I0318 12:28:12.274828 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-z69nv_311fa18b-fde1-4390-9682-75c836813f88/console/0.log" Mar 18 12:28:12 crc kubenswrapper[4975]: I0318 12:28:12.274901 4975 generic.go:334] "Generic (PLEG): container finished" podID="311fa18b-fde1-4390-9682-75c836813f88" containerID="3012d584f83eb6aafb3fef46800471cc0de67f79de71722efff466d66f5b5c5c" exitCode=2 Mar 18 12:28:12 crc kubenswrapper[4975]: I0318 12:28:12.274956 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-z69nv" event={"ID":"311fa18b-fde1-4390-9682-75c836813f88","Type":"ContainerDied","Data":"3012d584f83eb6aafb3fef46800471cc0de67f79de71722efff466d66f5b5c5c"} Mar 18 12:28:12 crc kubenswrapper[4975]: I0318 12:28:12.274982 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-z69nv" 
event={"ID":"311fa18b-fde1-4390-9682-75c836813f88","Type":"ContainerDied","Data":"6e9368fcebd49c4ca14fd16f23260321c87a809486ce39310f1efa5b61c5e119"} Mar 18 12:28:12 crc kubenswrapper[4975]: I0318 12:28:12.274998 4975 scope.go:117] "RemoveContainer" containerID="3012d584f83eb6aafb3fef46800471cc0de67f79de71722efff466d66f5b5c5c" Mar 18 12:28:12 crc kubenswrapper[4975]: I0318 12:28:12.275028 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-z69nv" Mar 18 12:28:12 crc kubenswrapper[4975]: I0318 12:28:12.277800 4975 generic.go:334] "Generic (PLEG): container finished" podID="b0848ced-9548-4c63-826b-12e73deed42d" containerID="504f50a233264c5bb26a4d4580f991babca2a695b99f967b7ec67b557a0e7b85" exitCode=0 Mar 18 12:28:12 crc kubenswrapper[4975]: I0318 12:28:12.277848 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw" event={"ID":"b0848ced-9548-4c63-826b-12e73deed42d","Type":"ContainerDied","Data":"504f50a233264c5bb26a4d4580f991babca2a695b99f967b7ec67b557a0e7b85"} Mar 18 12:28:12 crc kubenswrapper[4975]: I0318 12:28:12.291805 4975 scope.go:117] "RemoveContainer" containerID="3012d584f83eb6aafb3fef46800471cc0de67f79de71722efff466d66f5b5c5c" Mar 18 12:28:12 crc kubenswrapper[4975]: E0318 12:28:12.292353 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3012d584f83eb6aafb3fef46800471cc0de67f79de71722efff466d66f5b5c5c\": container with ID starting with 3012d584f83eb6aafb3fef46800471cc0de67f79de71722efff466d66f5b5c5c not found: ID does not exist" containerID="3012d584f83eb6aafb3fef46800471cc0de67f79de71722efff466d66f5b5c5c" Mar 18 12:28:12 crc kubenswrapper[4975]: I0318 12:28:12.292644 4975 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3012d584f83eb6aafb3fef46800471cc0de67f79de71722efff466d66f5b5c5c"} err="failed to get container status \"3012d584f83eb6aafb3fef46800471cc0de67f79de71722efff466d66f5b5c5c\": rpc error: code = NotFound desc = could not find container \"3012d584f83eb6aafb3fef46800471cc0de67f79de71722efff466d66f5b5c5c\": container with ID starting with 3012d584f83eb6aafb3fef46800471cc0de67f79de71722efff466d66f5b5c5c not found: ID does not exist" Mar 18 12:28:12 crc kubenswrapper[4975]: I0318 12:28:12.308915 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-z69nv"] Mar 18 12:28:12 crc kubenswrapper[4975]: I0318 12:28:12.312080 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-z69nv"] Mar 18 12:28:13 crc kubenswrapper[4975]: I0318 12:28:13.025253 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="311fa18b-fde1-4390-9682-75c836813f88" path="/var/lib/kubelet/pods/311fa18b-fde1-4390-9682-75c836813f88/volumes" Mar 18 12:28:13 crc kubenswrapper[4975]: I0318 12:28:13.494063 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw" Mar 18 12:28:13 crc kubenswrapper[4975]: I0318 12:28:13.609644 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v77f\" (UniqueName: \"kubernetes.io/projected/b0848ced-9548-4c63-826b-12e73deed42d-kube-api-access-5v77f\") pod \"b0848ced-9548-4c63-826b-12e73deed42d\" (UID: \"b0848ced-9548-4c63-826b-12e73deed42d\") " Mar 18 12:28:13 crc kubenswrapper[4975]: I0318 12:28:13.609723 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0848ced-9548-4c63-826b-12e73deed42d-util\") pod \"b0848ced-9548-4c63-826b-12e73deed42d\" (UID: \"b0848ced-9548-4c63-826b-12e73deed42d\") " Mar 18 12:28:13 crc kubenswrapper[4975]: I0318 12:28:13.609790 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0848ced-9548-4c63-826b-12e73deed42d-bundle\") pod \"b0848ced-9548-4c63-826b-12e73deed42d\" (UID: \"b0848ced-9548-4c63-826b-12e73deed42d\") " Mar 18 12:28:13 crc kubenswrapper[4975]: I0318 12:28:13.611113 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0848ced-9548-4c63-826b-12e73deed42d-bundle" (OuterVolumeSpecName: "bundle") pod "b0848ced-9548-4c63-826b-12e73deed42d" (UID: "b0848ced-9548-4c63-826b-12e73deed42d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:28:13 crc kubenswrapper[4975]: I0318 12:28:13.615126 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0848ced-9548-4c63-826b-12e73deed42d-kube-api-access-5v77f" (OuterVolumeSpecName: "kube-api-access-5v77f") pod "b0848ced-9548-4c63-826b-12e73deed42d" (UID: "b0848ced-9548-4c63-826b-12e73deed42d"). InnerVolumeSpecName "kube-api-access-5v77f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:28:13 crc kubenswrapper[4975]: I0318 12:28:13.623967 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0848ced-9548-4c63-826b-12e73deed42d-util" (OuterVolumeSpecName: "util") pod "b0848ced-9548-4c63-826b-12e73deed42d" (UID: "b0848ced-9548-4c63-826b-12e73deed42d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:28:13 crc kubenswrapper[4975]: I0318 12:28:13.711445 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5v77f\" (UniqueName: \"kubernetes.io/projected/b0848ced-9548-4c63-826b-12e73deed42d-kube-api-access-5v77f\") on node \"crc\" DevicePath \"\"" Mar 18 12:28:13 crc kubenswrapper[4975]: I0318 12:28:13.711494 4975 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0848ced-9548-4c63-826b-12e73deed42d-util\") on node \"crc\" DevicePath \"\"" Mar 18 12:28:13 crc kubenswrapper[4975]: I0318 12:28:13.711517 4975 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0848ced-9548-4c63-826b-12e73deed42d-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:28:14 crc kubenswrapper[4975]: I0318 12:28:14.296314 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw" event={"ID":"b0848ced-9548-4c63-826b-12e73deed42d","Type":"ContainerDied","Data":"b3e5fc9d1a7cf0a37ad5313b8ff63c37c779dca7319f31833defe61cb86c59d8"} Mar 18 12:28:14 crc kubenswrapper[4975]: I0318 12:28:14.296381 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3e5fc9d1a7cf0a37ad5313b8ff63c37c779dca7319f31833defe61cb86c59d8" Mar 18 12:28:14 crc kubenswrapper[4975]: I0318 12:28:14.296499 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.244183 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-86bbb6cbf8-c5s8m"] Mar 18 12:28:23 crc kubenswrapper[4975]: E0318 12:28:23.244749 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0848ced-9548-4c63-826b-12e73deed42d" containerName="util" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.244765 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0848ced-9548-4c63-826b-12e73deed42d" containerName="util" Mar 18 12:28:23 crc kubenswrapper[4975]: E0318 12:28:23.244786 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="311fa18b-fde1-4390-9682-75c836813f88" containerName="console" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.244794 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="311fa18b-fde1-4390-9682-75c836813f88" containerName="console" Mar 18 12:28:23 crc kubenswrapper[4975]: E0318 12:28:23.244814 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0848ced-9548-4c63-826b-12e73deed42d" containerName="extract" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.244822 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0848ced-9548-4c63-826b-12e73deed42d" containerName="extract" Mar 18 12:28:23 crc kubenswrapper[4975]: E0318 12:28:23.244834 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0848ced-9548-4c63-826b-12e73deed42d" containerName="pull" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.244841 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0848ced-9548-4c63-826b-12e73deed42d" containerName="pull" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.244982 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0848ced-9548-4c63-826b-12e73deed42d" 
containerName="extract" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.245001 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="311fa18b-fde1-4390-9682-75c836813f88" containerName="console" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.245486 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-86bbb6cbf8-c5s8m" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.247418 4975 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.248270 4975 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.249132 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.249465 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.249843 4975 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-qbvhd" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.256052 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-86bbb6cbf8-c5s8m"] Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.333359 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpkc7\" (UniqueName: \"kubernetes.io/projected/ac116d37-f5ed-40e6-b688-cdb1079a6727-kube-api-access-hpkc7\") pod \"metallb-operator-controller-manager-86bbb6cbf8-c5s8m\" (UID: \"ac116d37-f5ed-40e6-b688-cdb1079a6727\") " 
pod="metallb-system/metallb-operator-controller-manager-86bbb6cbf8-c5s8m" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.333491 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ac116d37-f5ed-40e6-b688-cdb1079a6727-webhook-cert\") pod \"metallb-operator-controller-manager-86bbb6cbf8-c5s8m\" (UID: \"ac116d37-f5ed-40e6-b688-cdb1079a6727\") " pod="metallb-system/metallb-operator-controller-manager-86bbb6cbf8-c5s8m" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.333539 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ac116d37-f5ed-40e6-b688-cdb1079a6727-apiservice-cert\") pod \"metallb-operator-controller-manager-86bbb6cbf8-c5s8m\" (UID: \"ac116d37-f5ed-40e6-b688-cdb1079a6727\") " pod="metallb-system/metallb-operator-controller-manager-86bbb6cbf8-c5s8m" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.434412 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpkc7\" (UniqueName: \"kubernetes.io/projected/ac116d37-f5ed-40e6-b688-cdb1079a6727-kube-api-access-hpkc7\") pod \"metallb-operator-controller-manager-86bbb6cbf8-c5s8m\" (UID: \"ac116d37-f5ed-40e6-b688-cdb1079a6727\") " pod="metallb-system/metallb-operator-controller-manager-86bbb6cbf8-c5s8m" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.434512 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ac116d37-f5ed-40e6-b688-cdb1079a6727-webhook-cert\") pod \"metallb-operator-controller-manager-86bbb6cbf8-c5s8m\" (UID: \"ac116d37-f5ed-40e6-b688-cdb1079a6727\") " pod="metallb-system/metallb-operator-controller-manager-86bbb6cbf8-c5s8m" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.434542 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ac116d37-f5ed-40e6-b688-cdb1079a6727-apiservice-cert\") pod \"metallb-operator-controller-manager-86bbb6cbf8-c5s8m\" (UID: \"ac116d37-f5ed-40e6-b688-cdb1079a6727\") " pod="metallb-system/metallb-operator-controller-manager-86bbb6cbf8-c5s8m" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.442663 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ac116d37-f5ed-40e6-b688-cdb1079a6727-webhook-cert\") pod \"metallb-operator-controller-manager-86bbb6cbf8-c5s8m\" (UID: \"ac116d37-f5ed-40e6-b688-cdb1079a6727\") " pod="metallb-system/metallb-operator-controller-manager-86bbb6cbf8-c5s8m" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.447611 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ac116d37-f5ed-40e6-b688-cdb1079a6727-apiservice-cert\") pod \"metallb-operator-controller-manager-86bbb6cbf8-c5s8m\" (UID: \"ac116d37-f5ed-40e6-b688-cdb1079a6727\") " pod="metallb-system/metallb-operator-controller-manager-86bbb6cbf8-c5s8m" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.457825 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpkc7\" (UniqueName: \"kubernetes.io/projected/ac116d37-f5ed-40e6-b688-cdb1079a6727-kube-api-access-hpkc7\") pod \"metallb-operator-controller-manager-86bbb6cbf8-c5s8m\" (UID: \"ac116d37-f5ed-40e6-b688-cdb1079a6727\") " pod="metallb-system/metallb-operator-controller-manager-86bbb6cbf8-c5s8m" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.563783 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-86bbb6cbf8-c5s8m" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.594641 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-598b644cc-hmnjm"] Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.597600 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-598b644cc-hmnjm" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.599702 4975 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-6fqxz" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.600166 4975 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.600712 4975 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.622495 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-598b644cc-hmnjm"] Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.745016 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f2nv\" (UniqueName: \"kubernetes.io/projected/c3919877-0c39-4a6c-862f-5e448870427f-kube-api-access-2f2nv\") pod \"metallb-operator-webhook-server-598b644cc-hmnjm\" (UID: \"c3919877-0c39-4a6c-862f-5e448870427f\") " pod="metallb-system/metallb-operator-webhook-server-598b644cc-hmnjm" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.745977 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c3919877-0c39-4a6c-862f-5e448870427f-webhook-cert\") pod 
\"metallb-operator-webhook-server-598b644cc-hmnjm\" (UID: \"c3919877-0c39-4a6c-862f-5e448870427f\") " pod="metallb-system/metallb-operator-webhook-server-598b644cc-hmnjm" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.746135 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c3919877-0c39-4a6c-862f-5e448870427f-apiservice-cert\") pod \"metallb-operator-webhook-server-598b644cc-hmnjm\" (UID: \"c3919877-0c39-4a6c-862f-5e448870427f\") " pod="metallb-system/metallb-operator-webhook-server-598b644cc-hmnjm" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.839412 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-86bbb6cbf8-c5s8m"] Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.848095 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c3919877-0c39-4a6c-862f-5e448870427f-apiservice-cert\") pod \"metallb-operator-webhook-server-598b644cc-hmnjm\" (UID: \"c3919877-0c39-4a6c-862f-5e448870427f\") " pod="metallb-system/metallb-operator-webhook-server-598b644cc-hmnjm" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.848151 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f2nv\" (UniqueName: \"kubernetes.io/projected/c3919877-0c39-4a6c-862f-5e448870427f-kube-api-access-2f2nv\") pod \"metallb-operator-webhook-server-598b644cc-hmnjm\" (UID: \"c3919877-0c39-4a6c-862f-5e448870427f\") " pod="metallb-system/metallb-operator-webhook-server-598b644cc-hmnjm" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.848191 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c3919877-0c39-4a6c-862f-5e448870427f-webhook-cert\") pod \"metallb-operator-webhook-server-598b644cc-hmnjm\" 
(UID: \"c3919877-0c39-4a6c-862f-5e448870427f\") " pod="metallb-system/metallb-operator-webhook-server-598b644cc-hmnjm" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.853720 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c3919877-0c39-4a6c-862f-5e448870427f-apiservice-cert\") pod \"metallb-operator-webhook-server-598b644cc-hmnjm\" (UID: \"c3919877-0c39-4a6c-862f-5e448870427f\") " pod="metallb-system/metallb-operator-webhook-server-598b644cc-hmnjm" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.853842 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c3919877-0c39-4a6c-862f-5e448870427f-webhook-cert\") pod \"metallb-operator-webhook-server-598b644cc-hmnjm\" (UID: \"c3919877-0c39-4a6c-862f-5e448870427f\") " pod="metallb-system/metallb-operator-webhook-server-598b644cc-hmnjm" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.869002 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f2nv\" (UniqueName: \"kubernetes.io/projected/c3919877-0c39-4a6c-862f-5e448870427f-kube-api-access-2f2nv\") pod \"metallb-operator-webhook-server-598b644cc-hmnjm\" (UID: \"c3919877-0c39-4a6c-862f-5e448870427f\") " pod="metallb-system/metallb-operator-webhook-server-598b644cc-hmnjm" Mar 18 12:28:23 crc kubenswrapper[4975]: I0318 12:28:23.947952 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-598b644cc-hmnjm" Mar 18 12:28:24 crc kubenswrapper[4975]: I0318 12:28:24.190306 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-598b644cc-hmnjm"] Mar 18 12:28:24 crc kubenswrapper[4975]: W0318 12:28:24.195974 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3919877_0c39_4a6c_862f_5e448870427f.slice/crio-3eb9609a6b15c1a1b9f616408154b83a398570e16fd6c554cf2b26326157889a WatchSource:0}: Error finding container 3eb9609a6b15c1a1b9f616408154b83a398570e16fd6c554cf2b26326157889a: Status 404 returned error can't find the container with id 3eb9609a6b15c1a1b9f616408154b83a398570e16fd6c554cf2b26326157889a Mar 18 12:28:24 crc kubenswrapper[4975]: I0318 12:28:24.359594 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-598b644cc-hmnjm" event={"ID":"c3919877-0c39-4a6c-862f-5e448870427f","Type":"ContainerStarted","Data":"3eb9609a6b15c1a1b9f616408154b83a398570e16fd6c554cf2b26326157889a"} Mar 18 12:28:24 crc kubenswrapper[4975]: I0318 12:28:24.360940 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-86bbb6cbf8-c5s8m" event={"ID":"ac116d37-f5ed-40e6-b688-cdb1079a6727","Type":"ContainerStarted","Data":"e4d79a23b0a7478b735de2b63034446f8029614d0955df4361a13c146055825b"} Mar 18 12:28:31 crc kubenswrapper[4975]: I0318 12:28:31.424615 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-598b644cc-hmnjm" event={"ID":"c3919877-0c39-4a6c-862f-5e448870427f","Type":"ContainerStarted","Data":"e5e64dbccc872d358dc9f44dcc6ed4b341b0f370c414b914194610bfa7575dc0"} Mar 18 12:28:31 crc kubenswrapper[4975]: I0318 12:28:31.425226 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-webhook-server-598b644cc-hmnjm" Mar 18 12:28:31 crc kubenswrapper[4975]: I0318 12:28:31.427525 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-86bbb6cbf8-c5s8m" event={"ID":"ac116d37-f5ed-40e6-b688-cdb1079a6727","Type":"ContainerStarted","Data":"5a6ca2e598e2ec0bbe9fb64224d23961c65bfb9b1bc0ee595b54f5233e204f88"} Mar 18 12:28:31 crc kubenswrapper[4975]: I0318 12:28:31.428118 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-86bbb6cbf8-c5s8m" Mar 18 12:28:31 crc kubenswrapper[4975]: I0318 12:28:31.445488 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-598b644cc-hmnjm" podStartSLOduration=2.065998193 podStartE2EDuration="8.445467689s" podCreationTimestamp="2026-03-18 12:28:23 +0000 UTC" firstStartedPulling="2026-03-18 12:28:24.199130727 +0000 UTC m=+1089.913531306" lastFinishedPulling="2026-03-18 12:28:30.578600223 +0000 UTC m=+1096.293000802" observedRunningTime="2026-03-18 12:28:31.440715609 +0000 UTC m=+1097.155116198" watchObservedRunningTime="2026-03-18 12:28:31.445467689 +0000 UTC m=+1097.159868268" Mar 18 12:28:31 crc kubenswrapper[4975]: I0318 12:28:31.464168 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-86bbb6cbf8-c5s8m" podStartSLOduration=1.774770288 podStartE2EDuration="8.46414679s" podCreationTimestamp="2026-03-18 12:28:23 +0000 UTC" firstStartedPulling="2026-03-18 12:28:23.845545037 +0000 UTC m=+1089.559945616" lastFinishedPulling="2026-03-18 12:28:30.534921529 +0000 UTC m=+1096.249322118" observedRunningTime="2026-03-18 12:28:31.462767692 +0000 UTC m=+1097.177168281" watchObservedRunningTime="2026-03-18 12:28:31.46414679 +0000 UTC m=+1097.178547369" Mar 18 12:28:33 crc kubenswrapper[4975]: I0318 12:28:33.829577 4975 
scope.go:117] "RemoveContainer" containerID="a098c1347c1b344b05abafccb65e4ef8704765d31e25534b2257fdd72cd1b93e" Mar 18 12:28:43 crc kubenswrapper[4975]: I0318 12:28:43.951839 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-598b644cc-hmnjm" Mar 18 12:28:55 crc kubenswrapper[4975]: I0318 12:28:55.539107 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:28:55 crc kubenswrapper[4975]: I0318 12:28:55.539650 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:29:03 crc kubenswrapper[4975]: I0318 12:29:03.567044 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-86bbb6cbf8-c5s8m" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.494748 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-vqh9l"] Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.495442 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vqh9l" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.497121 4975 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.498084 4975 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-577pw" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.498438 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-pslqs"] Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.500475 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.503638 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.503888 4975 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.512905 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-vqh9l"] Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.585262 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-nngfh"] Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.586196 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-nngfh" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.589607 4975 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.590981 4975 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.591183 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.596664 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-kz6mt"] Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.597469 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-kz6mt" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.598244 4975 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-bzl75" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.601426 4975 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.606926 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-kz6mt"] Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.762776 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzhlj\" (UniqueName: \"kubernetes.io/projected/eda60759-e685-41fd-9d34-6b1afdc1a8b9-kube-api-access-xzhlj\") pod \"frr-k8s-pslqs\" (UID: \"eda60759-e685-41fd-9d34-6b1afdc1a8b9\") " pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.762838 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/2a6d86bf-7828-418f-91f4-41df21916eb4-cert\") pod \"controller-7bb4cc7c98-kz6mt\" (UID: \"2a6d86bf-7828-418f-91f4-41df21916eb4\") " pod="metallb-system/controller-7bb4cc7c98-kz6mt" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.762911 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfklk\" (UniqueName: \"kubernetes.io/projected/d9770b1b-8549-4f2a-967f-e2e3e36f9c6c-kube-api-access-cfklk\") pod \"frr-k8s-webhook-server-bcc4b6f68-vqh9l\" (UID: \"d9770b1b-8549-4f2a-967f-e2e3e36f9c6c\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vqh9l" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.762960 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2a6d86bf-7828-418f-91f4-41df21916eb4-metrics-certs\") pod \"controller-7bb4cc7c98-kz6mt\" (UID: \"2a6d86bf-7828-418f-91f4-41df21916eb4\") " pod="metallb-system/controller-7bb4cc7c98-kz6mt" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.763000 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/38873f7c-5cbd-48c0-ba83-0d479218b7ac-metallb-excludel2\") pod \"speaker-nngfh\" (UID: \"38873f7c-5cbd-48c0-ba83-0d479218b7ac\") " pod="metallb-system/speaker-nngfh" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.763095 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eda60759-e685-41fd-9d34-6b1afdc1a8b9-metrics-certs\") pod \"frr-k8s-pslqs\" (UID: \"eda60759-e685-41fd-9d34-6b1afdc1a8b9\") " pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.763132 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/eda60759-e685-41fd-9d34-6b1afdc1a8b9-frr-startup\") pod \"frr-k8s-pslqs\" (UID: \"eda60759-e685-41fd-9d34-6b1afdc1a8b9\") " pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.763163 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/38873f7c-5cbd-48c0-ba83-0d479218b7ac-memberlist\") pod \"speaker-nngfh\" (UID: \"38873f7c-5cbd-48c0-ba83-0d479218b7ac\") " pod="metallb-system/speaker-nngfh" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.763191 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/eda60759-e685-41fd-9d34-6b1afdc1a8b9-metrics\") pod \"frr-k8s-pslqs\" (UID: \"eda60759-e685-41fd-9d34-6b1afdc1a8b9\") " pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.763334 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38873f7c-5cbd-48c0-ba83-0d479218b7ac-metrics-certs\") pod \"speaker-nngfh\" (UID: \"38873f7c-5cbd-48c0-ba83-0d479218b7ac\") " pod="metallb-system/speaker-nngfh" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.763467 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d9770b1b-8549-4f2a-967f-e2e3e36f9c6c-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-vqh9l\" (UID: \"d9770b1b-8549-4f2a-967f-e2e3e36f9c6c\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vqh9l" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.763684 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dfgt\" (UniqueName: 
\"kubernetes.io/projected/2a6d86bf-7828-418f-91f4-41df21916eb4-kube-api-access-6dfgt\") pod \"controller-7bb4cc7c98-kz6mt\" (UID: \"2a6d86bf-7828-418f-91f4-41df21916eb4\") " pod="metallb-system/controller-7bb4cc7c98-kz6mt" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.763744 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/eda60759-e685-41fd-9d34-6b1afdc1a8b9-reloader\") pod \"frr-k8s-pslqs\" (UID: \"eda60759-e685-41fd-9d34-6b1afdc1a8b9\") " pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.763775 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/eda60759-e685-41fd-9d34-6b1afdc1a8b9-frr-sockets\") pod \"frr-k8s-pslqs\" (UID: \"eda60759-e685-41fd-9d34-6b1afdc1a8b9\") " pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.763812 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf9xz\" (UniqueName: \"kubernetes.io/projected/38873f7c-5cbd-48c0-ba83-0d479218b7ac-kube-api-access-zf9xz\") pod \"speaker-nngfh\" (UID: \"38873f7c-5cbd-48c0-ba83-0d479218b7ac\") " pod="metallb-system/speaker-nngfh" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.763886 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/eda60759-e685-41fd-9d34-6b1afdc1a8b9-frr-conf\") pod \"frr-k8s-pslqs\" (UID: \"eda60759-e685-41fd-9d34-6b1afdc1a8b9\") " pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.870096 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/38873f7c-5cbd-48c0-ba83-0d479218b7ac-metallb-excludel2\") pod \"speaker-nngfh\" (UID: \"38873f7c-5cbd-48c0-ba83-0d479218b7ac\") " pod="metallb-system/speaker-nngfh" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.870159 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eda60759-e685-41fd-9d34-6b1afdc1a8b9-metrics-certs\") pod \"frr-k8s-pslqs\" (UID: \"eda60759-e685-41fd-9d34-6b1afdc1a8b9\") " pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.870182 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/eda60759-e685-41fd-9d34-6b1afdc1a8b9-frr-startup\") pod \"frr-k8s-pslqs\" (UID: \"eda60759-e685-41fd-9d34-6b1afdc1a8b9\") " pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.870213 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/38873f7c-5cbd-48c0-ba83-0d479218b7ac-memberlist\") pod \"speaker-nngfh\" (UID: \"38873f7c-5cbd-48c0-ba83-0d479218b7ac\") " pod="metallb-system/speaker-nngfh" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.870233 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/eda60759-e685-41fd-9d34-6b1afdc1a8b9-metrics\") pod \"frr-k8s-pslqs\" (UID: \"eda60759-e685-41fd-9d34-6b1afdc1a8b9\") " pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.870267 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38873f7c-5cbd-48c0-ba83-0d479218b7ac-metrics-certs\") pod \"speaker-nngfh\" (UID: \"38873f7c-5cbd-48c0-ba83-0d479218b7ac\") " pod="metallb-system/speaker-nngfh" Mar 18 12:29:04 crc 
kubenswrapper[4975]: I0318 12:29:04.870289 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d9770b1b-8549-4f2a-967f-e2e3e36f9c6c-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-vqh9l\" (UID: \"d9770b1b-8549-4f2a-967f-e2e3e36f9c6c\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vqh9l" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.870317 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dfgt\" (UniqueName: \"kubernetes.io/projected/2a6d86bf-7828-418f-91f4-41df21916eb4-kube-api-access-6dfgt\") pod \"controller-7bb4cc7c98-kz6mt\" (UID: \"2a6d86bf-7828-418f-91f4-41df21916eb4\") " pod="metallb-system/controller-7bb4cc7c98-kz6mt" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.870353 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/eda60759-e685-41fd-9d34-6b1afdc1a8b9-reloader\") pod \"frr-k8s-pslqs\" (UID: \"eda60759-e685-41fd-9d34-6b1afdc1a8b9\") " pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.870376 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/eda60759-e685-41fd-9d34-6b1afdc1a8b9-frr-sockets\") pod \"frr-k8s-pslqs\" (UID: \"eda60759-e685-41fd-9d34-6b1afdc1a8b9\") " pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:04 crc kubenswrapper[4975]: E0318 12:29:04.870389 4975 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 18 12:29:04 crc kubenswrapper[4975]: E0318 12:29:04.870455 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38873f7c-5cbd-48c0-ba83-0d479218b7ac-memberlist podName:38873f7c-5cbd-48c0-ba83-0d479218b7ac nodeName:}" failed. 
No retries permitted until 2026-03-18 12:29:05.37043351 +0000 UTC m=+1131.084834089 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/38873f7c-5cbd-48c0-ba83-0d479218b7ac-memberlist") pod "speaker-nngfh" (UID: "38873f7c-5cbd-48c0-ba83-0d479218b7ac") : secret "metallb-memberlist" not found Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.870398 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf9xz\" (UniqueName: \"kubernetes.io/projected/38873f7c-5cbd-48c0-ba83-0d479218b7ac-kube-api-access-zf9xz\") pod \"speaker-nngfh\" (UID: \"38873f7c-5cbd-48c0-ba83-0d479218b7ac\") " pod="metallb-system/speaker-nngfh" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.870797 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/eda60759-e685-41fd-9d34-6b1afdc1a8b9-frr-conf\") pod \"frr-k8s-pslqs\" (UID: \"eda60759-e685-41fd-9d34-6b1afdc1a8b9\") " pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.870829 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzhlj\" (UniqueName: \"kubernetes.io/projected/eda60759-e685-41fd-9d34-6b1afdc1a8b9-kube-api-access-xzhlj\") pod \"frr-k8s-pslqs\" (UID: \"eda60759-e685-41fd-9d34-6b1afdc1a8b9\") " pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.870830 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/eda60759-e685-41fd-9d34-6b1afdc1a8b9-reloader\") pod \"frr-k8s-pslqs\" (UID: \"eda60759-e685-41fd-9d34-6b1afdc1a8b9\") " pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.870852 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/2a6d86bf-7828-418f-91f4-41df21916eb4-cert\") pod \"controller-7bb4cc7c98-kz6mt\" (UID: \"2a6d86bf-7828-418f-91f4-41df21916eb4\") " pod="metallb-system/controller-7bb4cc7c98-kz6mt" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.870859 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/eda60759-e685-41fd-9d34-6b1afdc1a8b9-metrics\") pod \"frr-k8s-pslqs\" (UID: \"eda60759-e685-41fd-9d34-6b1afdc1a8b9\") " pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.871113 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfklk\" (UniqueName: \"kubernetes.io/projected/d9770b1b-8549-4f2a-967f-e2e3e36f9c6c-kube-api-access-cfklk\") pod \"frr-k8s-webhook-server-bcc4b6f68-vqh9l\" (UID: \"d9770b1b-8549-4f2a-967f-e2e3e36f9c6c\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vqh9l" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.871152 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2a6d86bf-7828-418f-91f4-41df21916eb4-metrics-certs\") pod \"controller-7bb4cc7c98-kz6mt\" (UID: \"2a6d86bf-7828-418f-91f4-41df21916eb4\") " pod="metallb-system/controller-7bb4cc7c98-kz6mt" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.871181 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/eda60759-e685-41fd-9d34-6b1afdc1a8b9-frr-conf\") pod \"frr-k8s-pslqs\" (UID: \"eda60759-e685-41fd-9d34-6b1afdc1a8b9\") " pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.871191 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/eda60759-e685-41fd-9d34-6b1afdc1a8b9-frr-startup\") pod \"frr-k8s-pslqs\" (UID: 
\"eda60759-e685-41fd-9d34-6b1afdc1a8b9\") " pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.871371 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/eda60759-e685-41fd-9d34-6b1afdc1a8b9-frr-sockets\") pod \"frr-k8s-pslqs\" (UID: \"eda60759-e685-41fd-9d34-6b1afdc1a8b9\") " pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.872507 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/38873f7c-5cbd-48c0-ba83-0d479218b7ac-metallb-excludel2\") pod \"speaker-nngfh\" (UID: \"38873f7c-5cbd-48c0-ba83-0d479218b7ac\") " pod="metallb-system/speaker-nngfh" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.876169 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eda60759-e685-41fd-9d34-6b1afdc1a8b9-metrics-certs\") pod \"frr-k8s-pslqs\" (UID: \"eda60759-e685-41fd-9d34-6b1afdc1a8b9\") " pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.876224 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a6d86bf-7828-418f-91f4-41df21916eb4-cert\") pod \"controller-7bb4cc7c98-kz6mt\" (UID: \"2a6d86bf-7828-418f-91f4-41df21916eb4\") " pod="metallb-system/controller-7bb4cc7c98-kz6mt" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.876252 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d9770b1b-8549-4f2a-967f-e2e3e36f9c6c-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-vqh9l\" (UID: \"d9770b1b-8549-4f2a-967f-e2e3e36f9c6c\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vqh9l" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.877356 4975 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2a6d86bf-7828-418f-91f4-41df21916eb4-metrics-certs\") pod \"controller-7bb4cc7c98-kz6mt\" (UID: \"2a6d86bf-7828-418f-91f4-41df21916eb4\") " pod="metallb-system/controller-7bb4cc7c98-kz6mt" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.888630 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38873f7c-5cbd-48c0-ba83-0d479218b7ac-metrics-certs\") pod \"speaker-nngfh\" (UID: \"38873f7c-5cbd-48c0-ba83-0d479218b7ac\") " pod="metallb-system/speaker-nngfh" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.890701 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf9xz\" (UniqueName: \"kubernetes.io/projected/38873f7c-5cbd-48c0-ba83-0d479218b7ac-kube-api-access-zf9xz\") pod \"speaker-nngfh\" (UID: \"38873f7c-5cbd-48c0-ba83-0d479218b7ac\") " pod="metallb-system/speaker-nngfh" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.893559 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dfgt\" (UniqueName: \"kubernetes.io/projected/2a6d86bf-7828-418f-91f4-41df21916eb4-kube-api-access-6dfgt\") pod \"controller-7bb4cc7c98-kz6mt\" (UID: \"2a6d86bf-7828-418f-91f4-41df21916eb4\") " pod="metallb-system/controller-7bb4cc7c98-kz6mt" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.895721 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfklk\" (UniqueName: \"kubernetes.io/projected/d9770b1b-8549-4f2a-967f-e2e3e36f9c6c-kube-api-access-cfklk\") pod \"frr-k8s-webhook-server-bcc4b6f68-vqh9l\" (UID: \"d9770b1b-8549-4f2a-967f-e2e3e36f9c6c\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vqh9l" Mar 18 12:29:04 crc kubenswrapper[4975]: I0318 12:29:04.898995 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzhlj\" (UniqueName: 
\"kubernetes.io/projected/eda60759-e685-41fd-9d34-6b1afdc1a8b9-kube-api-access-xzhlj\") pod \"frr-k8s-pslqs\" (UID: \"eda60759-e685-41fd-9d34-6b1afdc1a8b9\") " pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:05 crc kubenswrapper[4975]: I0318 12:29:05.068633 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vqh9l" Mar 18 12:29:05 crc kubenswrapper[4975]: I0318 12:29:05.082177 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:05 crc kubenswrapper[4975]: I0318 12:29:05.099231 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-kz6mt" Mar 18 12:29:05 crc kubenswrapper[4975]: I0318 12:29:05.378468 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/38873f7c-5cbd-48c0-ba83-0d479218b7ac-memberlist\") pod \"speaker-nngfh\" (UID: \"38873f7c-5cbd-48c0-ba83-0d479218b7ac\") " pod="metallb-system/speaker-nngfh" Mar 18 12:29:05 crc kubenswrapper[4975]: E0318 12:29:05.378619 4975 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 18 12:29:05 crc kubenswrapper[4975]: E0318 12:29:05.378945 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38873f7c-5cbd-48c0-ba83-0d479218b7ac-memberlist podName:38873f7c-5cbd-48c0-ba83-0d479218b7ac nodeName:}" failed. No retries permitted until 2026-03-18 12:29:06.378905591 +0000 UTC m=+1132.093306170 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/38873f7c-5cbd-48c0-ba83-0d479218b7ac-memberlist") pod "speaker-nngfh" (UID: "38873f7c-5cbd-48c0-ba83-0d479218b7ac") : secret "metallb-memberlist" not found Mar 18 12:29:05 crc kubenswrapper[4975]: I0318 12:29:05.556704 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-vqh9l"] Mar 18 12:29:05 crc kubenswrapper[4975]: I0318 12:29:05.633928 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-kz6mt"] Mar 18 12:29:05 crc kubenswrapper[4975]: W0318 12:29:05.636520 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a6d86bf_7828_418f_91f4_41df21916eb4.slice/crio-a98023d2ccff226e5351bf4a35a7fee573cc2ea036017cf54b65a2f1aaca595f WatchSource:0}: Error finding container a98023d2ccff226e5351bf4a35a7fee573cc2ea036017cf54b65a2f1aaca595f: Status 404 returned error can't find the container with id a98023d2ccff226e5351bf4a35a7fee573cc2ea036017cf54b65a2f1aaca595f Mar 18 12:29:05 crc kubenswrapper[4975]: I0318 12:29:05.782770 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-kz6mt" event={"ID":"2a6d86bf-7828-418f-91f4-41df21916eb4","Type":"ContainerStarted","Data":"39bb7f8bc7cbf7b87bc9e743251ecdb3d736e2778a5c2f68c2b37745458f578a"} Mar 18 12:29:05 crc kubenswrapper[4975]: I0318 12:29:05.783137 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-kz6mt" event={"ID":"2a6d86bf-7828-418f-91f4-41df21916eb4","Type":"ContainerStarted","Data":"a98023d2ccff226e5351bf4a35a7fee573cc2ea036017cf54b65a2f1aaca595f"} Mar 18 12:29:05 crc kubenswrapper[4975]: I0318 12:29:05.785040 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vqh9l" 
event={"ID":"d9770b1b-8549-4f2a-967f-e2e3e36f9c6c","Type":"ContainerStarted","Data":"9264c4248835bf79af66b8e23a647f6ca4d0695c6001bb5d0b24192bad196698"} Mar 18 12:29:05 crc kubenswrapper[4975]: I0318 12:29:05.786081 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pslqs" event={"ID":"eda60759-e685-41fd-9d34-6b1afdc1a8b9","Type":"ContainerStarted","Data":"4cce0365fa3bcf96612dfa5e901fde0d54556ecf907d97f79820a52aa5609aee"} Mar 18 12:29:06 crc kubenswrapper[4975]: I0318 12:29:06.392390 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/38873f7c-5cbd-48c0-ba83-0d479218b7ac-memberlist\") pod \"speaker-nngfh\" (UID: \"38873f7c-5cbd-48c0-ba83-0d479218b7ac\") " pod="metallb-system/speaker-nngfh" Mar 18 12:29:06 crc kubenswrapper[4975]: I0318 12:29:06.398755 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/38873f7c-5cbd-48c0-ba83-0d479218b7ac-memberlist\") pod \"speaker-nngfh\" (UID: \"38873f7c-5cbd-48c0-ba83-0d479218b7ac\") " pod="metallb-system/speaker-nngfh" Mar 18 12:29:06 crc kubenswrapper[4975]: I0318 12:29:06.591680 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-nngfh" Mar 18 12:29:06 crc kubenswrapper[4975]: W0318 12:29:06.626057 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38873f7c_5cbd_48c0_ba83_0d479218b7ac.slice/crio-a52490d16bf15c530ec75dad16e2750b1e4a5019f087ea807a1c2670f3baf8b8 WatchSource:0}: Error finding container a52490d16bf15c530ec75dad16e2750b1e4a5019f087ea807a1c2670f3baf8b8: Status 404 returned error can't find the container with id a52490d16bf15c530ec75dad16e2750b1e4a5019f087ea807a1c2670f3baf8b8 Mar 18 12:29:06 crc kubenswrapper[4975]: I0318 12:29:06.795907 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-kz6mt" event={"ID":"2a6d86bf-7828-418f-91f4-41df21916eb4","Type":"ContainerStarted","Data":"3fd21d8d4fca3f2f3281911ccc55bc51c905462de6d17f84d67bb6b1072dfc99"} Mar 18 12:29:06 crc kubenswrapper[4975]: I0318 12:29:06.796664 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-kz6mt" Mar 18 12:29:06 crc kubenswrapper[4975]: I0318 12:29:06.799103 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-nngfh" event={"ID":"38873f7c-5cbd-48c0-ba83-0d479218b7ac","Type":"ContainerStarted","Data":"a52490d16bf15c530ec75dad16e2750b1e4a5019f087ea807a1c2670f3baf8b8"} Mar 18 12:29:06 crc kubenswrapper[4975]: I0318 12:29:06.839082 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-kz6mt" podStartSLOduration=2.839058473 podStartE2EDuration="2.839058473s" podCreationTimestamp="2026-03-18 12:29:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:29:06.838366284 +0000 UTC m=+1132.552766863" watchObservedRunningTime="2026-03-18 12:29:06.839058473 +0000 UTC m=+1132.553459052" Mar 18 12:29:07 crc 
kubenswrapper[4975]: I0318 12:29:07.810074 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-nngfh" event={"ID":"38873f7c-5cbd-48c0-ba83-0d479218b7ac","Type":"ContainerStarted","Data":"b3630a7a8822784684265e0daff73c25cbbaae661df02f299a48fc3c9a7d3281"} Mar 18 12:29:07 crc kubenswrapper[4975]: I0318 12:29:07.810416 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-nngfh" event={"ID":"38873f7c-5cbd-48c0-ba83-0d479218b7ac","Type":"ContainerStarted","Data":"fa443c1cc20f1ada49aa344dfd90f87da859798b4e26ae7b559e5d0bcef3095f"} Mar 18 12:29:08 crc kubenswrapper[4975]: I0318 12:29:08.819685 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-nngfh" Mar 18 12:29:14 crc kubenswrapper[4975]: I0318 12:29:14.977237 4975 generic.go:334] "Generic (PLEG): container finished" podID="eda60759-e685-41fd-9d34-6b1afdc1a8b9" containerID="3bcc25f7b13d96c6e2e71e3455ecd1103d348e03ab439e4e3249424b8669f097" exitCode=0 Mar 18 12:29:14 crc kubenswrapper[4975]: I0318 12:29:14.977700 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pslqs" event={"ID":"eda60759-e685-41fd-9d34-6b1afdc1a8b9","Type":"ContainerDied","Data":"3bcc25f7b13d96c6e2e71e3455ecd1103d348e03ab439e4e3249424b8669f097"} Mar 18 12:29:14 crc kubenswrapper[4975]: I0318 12:29:14.980109 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vqh9l" event={"ID":"d9770b1b-8549-4f2a-967f-e2e3e36f9c6c","Type":"ContainerStarted","Data":"f3cbe7154627d5138efdd79b3ac73491b39f8d249ee802cc44620afaf682e13f"} Mar 18 12:29:14 crc kubenswrapper[4975]: I0318 12:29:14.980480 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vqh9l" Mar 18 12:29:15 crc kubenswrapper[4975]: I0318 12:29:15.006751 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/speaker-nngfh" podStartSLOduration=11.006734598 podStartE2EDuration="11.006734598s" podCreationTimestamp="2026-03-18 12:29:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:29:07.841943251 +0000 UTC m=+1133.556343820" watchObservedRunningTime="2026-03-18 12:29:15.006734598 +0000 UTC m=+1140.721135177" Mar 18 12:29:15 crc kubenswrapper[4975]: I0318 12:29:15.104787 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-kz6mt" Mar 18 12:29:15 crc kubenswrapper[4975]: I0318 12:29:15.129549 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vqh9l" podStartSLOduration=1.935438363 podStartE2EDuration="11.12953131s" podCreationTimestamp="2026-03-18 12:29:04 +0000 UTC" firstStartedPulling="2026-03-18 12:29:05.567423737 +0000 UTC m=+1131.281824316" lastFinishedPulling="2026-03-18 12:29:14.761516694 +0000 UTC m=+1140.475917263" observedRunningTime="2026-03-18 12:29:15.020289148 +0000 UTC m=+1140.734689737" watchObservedRunningTime="2026-03-18 12:29:15.12953131 +0000 UTC m=+1140.843931889" Mar 18 12:29:15 crc kubenswrapper[4975]: I0318 12:29:15.987396 4975 generic.go:334] "Generic (PLEG): container finished" podID="eda60759-e685-41fd-9d34-6b1afdc1a8b9" containerID="f276e1843ad4b8a97c08350d8017609f34c41dd7e4654004ad8ad0ec56bbb71a" exitCode=0 Mar 18 12:29:15 crc kubenswrapper[4975]: I0318 12:29:15.988787 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pslqs" event={"ID":"eda60759-e685-41fd-9d34-6b1afdc1a8b9","Type":"ContainerDied","Data":"f276e1843ad4b8a97c08350d8017609f34c41dd7e4654004ad8ad0ec56bbb71a"} Mar 18 12:29:16 crc kubenswrapper[4975]: I0318 12:29:16.596090 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-nngfh" Mar 18 12:29:17 
crc kubenswrapper[4975]: I0318 12:29:17.003390 4975 generic.go:334] "Generic (PLEG): container finished" podID="eda60759-e685-41fd-9d34-6b1afdc1a8b9" containerID="f075d729d20cdea6a4469747d56a2155332f4cf42a3d1b137b28a127344f799a" exitCode=0 Mar 18 12:29:17 crc kubenswrapper[4975]: I0318 12:29:17.003649 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pslqs" event={"ID":"eda60759-e685-41fd-9d34-6b1afdc1a8b9","Type":"ContainerDied","Data":"f075d729d20cdea6a4469747d56a2155332f4cf42a3d1b137b28a127344f799a"} Mar 18 12:29:18 crc kubenswrapper[4975]: I0318 12:29:18.019770 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pslqs" event={"ID":"eda60759-e685-41fd-9d34-6b1afdc1a8b9","Type":"ContainerStarted","Data":"4ef70e720643da4ce824b857519cb6a836eb95e7d397eaafad4403ae59af8c79"} Mar 18 12:29:18 crc kubenswrapper[4975]: I0318 12:29:18.020174 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pslqs" event={"ID":"eda60759-e685-41fd-9d34-6b1afdc1a8b9","Type":"ContainerStarted","Data":"97ee72e36ac65801ddd61c65dd4fe16c2d449837fbf4fc5260e7f9efa8ff7a6b"} Mar 18 12:29:18 crc kubenswrapper[4975]: I0318 12:29:18.020191 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pslqs" event={"ID":"eda60759-e685-41fd-9d34-6b1afdc1a8b9","Type":"ContainerStarted","Data":"c2633157b132c68c117a5ab66a1ba8c43d1177c6206170868d69db10ba998d35"} Mar 18 12:29:18 crc kubenswrapper[4975]: I0318 12:29:18.020203 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pslqs" event={"ID":"eda60759-e685-41fd-9d34-6b1afdc1a8b9","Type":"ContainerStarted","Data":"39326c19600e0ade2bbe07624d2619c2099ee510394b37419fc30637d699f19a"} Mar 18 12:29:18 crc kubenswrapper[4975]: I0318 12:29:18.020214 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pslqs" 
event={"ID":"eda60759-e685-41fd-9d34-6b1afdc1a8b9","Type":"ContainerStarted","Data":"f881c0e013c5fd93b43aa47b5f6815be59cc69d2c4fae172fd3c606d501f53b3"} Mar 18 12:29:19 crc kubenswrapper[4975]: I0318 12:29:19.029333 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pslqs" event={"ID":"eda60759-e685-41fd-9d34-6b1afdc1a8b9","Type":"ContainerStarted","Data":"c28ac2b85961c25e3cf2575d7faf0ffd0db2894871ca725e46c2739577619107"} Mar 18 12:29:19 crc kubenswrapper[4975]: I0318 12:29:19.030399 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:19 crc kubenswrapper[4975]: I0318 12:29:19.058803 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-pslqs" podStartSLOduration=5.604324943 podStartE2EDuration="15.058786658s" podCreationTimestamp="2026-03-18 12:29:04 +0000 UTC" firstStartedPulling="2026-03-18 12:29:05.301097407 +0000 UTC m=+1131.015497996" lastFinishedPulling="2026-03-18 12:29:14.755559132 +0000 UTC m=+1140.469959711" observedRunningTime="2026-03-18 12:29:19.056848985 +0000 UTC m=+1144.771249574" watchObservedRunningTime="2026-03-18 12:29:19.058786658 +0000 UTC m=+1144.773187237" Mar 18 12:29:19 crc kubenswrapper[4975]: I0318 12:29:19.626447 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-jj6vl"] Mar 18 12:29:19 crc kubenswrapper[4975]: I0318 12:29:19.627309 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jj6vl" Mar 18 12:29:19 crc kubenswrapper[4975]: I0318 12:29:19.628994 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-xvsbd" Mar 18 12:29:19 crc kubenswrapper[4975]: I0318 12:29:19.629157 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 18 12:29:19 crc kubenswrapper[4975]: I0318 12:29:19.629205 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 18 12:29:19 crc kubenswrapper[4975]: I0318 12:29:19.648497 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jj6vl"] Mar 18 12:29:19 crc kubenswrapper[4975]: I0318 12:29:19.773520 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tmtz\" (UniqueName: \"kubernetes.io/projected/fcac19af-9e0b-4294-b0f7-2037471e883a-kube-api-access-6tmtz\") pod \"openstack-operator-index-jj6vl\" (UID: \"fcac19af-9e0b-4294-b0f7-2037471e883a\") " pod="openstack-operators/openstack-operator-index-jj6vl" Mar 18 12:29:19 crc kubenswrapper[4975]: I0318 12:29:19.875142 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tmtz\" (UniqueName: \"kubernetes.io/projected/fcac19af-9e0b-4294-b0f7-2037471e883a-kube-api-access-6tmtz\") pod \"openstack-operator-index-jj6vl\" (UID: \"fcac19af-9e0b-4294-b0f7-2037471e883a\") " pod="openstack-operators/openstack-operator-index-jj6vl" Mar 18 12:29:19 crc kubenswrapper[4975]: I0318 12:29:19.900119 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tmtz\" (UniqueName: \"kubernetes.io/projected/fcac19af-9e0b-4294-b0f7-2037471e883a-kube-api-access-6tmtz\") pod \"openstack-operator-index-jj6vl\" (UID: 
\"fcac19af-9e0b-4294-b0f7-2037471e883a\") " pod="openstack-operators/openstack-operator-index-jj6vl" Mar 18 12:29:19 crc kubenswrapper[4975]: I0318 12:29:19.944971 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jj6vl" Mar 18 12:29:20 crc kubenswrapper[4975]: I0318 12:29:20.082490 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:20 crc kubenswrapper[4975]: I0318 12:29:20.162201 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:20 crc kubenswrapper[4975]: I0318 12:29:20.372679 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jj6vl"] Mar 18 12:29:20 crc kubenswrapper[4975]: W0318 12:29:20.382238 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcac19af_9e0b_4294_b0f7_2037471e883a.slice/crio-3c26015132c6a63265e5f2b008910bd64b5a16f638705a243ada70e4b0dc161e WatchSource:0}: Error finding container 3c26015132c6a63265e5f2b008910bd64b5a16f638705a243ada70e4b0dc161e: Status 404 returned error can't find the container with id 3c26015132c6a63265e5f2b008910bd64b5a16f638705a243ada70e4b0dc161e Mar 18 12:29:21 crc kubenswrapper[4975]: I0318 12:29:21.043240 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jj6vl" event={"ID":"fcac19af-9e0b-4294-b0f7-2037471e883a","Type":"ContainerStarted","Data":"3c26015132c6a63265e5f2b008910bd64b5a16f638705a243ada70e4b0dc161e"} Mar 18 12:29:23 crc kubenswrapper[4975]: I0318 12:29:23.010136 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-jj6vl"] Mar 18 12:29:23 crc kubenswrapper[4975]: I0318 12:29:23.613227 4975 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-index-s7jn8"] Mar 18 12:29:23 crc kubenswrapper[4975]: I0318 12:29:23.614347 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-s7jn8" Mar 18 12:29:23 crc kubenswrapper[4975]: I0318 12:29:23.628692 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-s7jn8"] Mar 18 12:29:23 crc kubenswrapper[4975]: I0318 12:29:23.664683 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2c72\" (UniqueName: \"kubernetes.io/projected/3b9db44a-8c71-40d7-b3b5-b70cd7b32bdf-kube-api-access-q2c72\") pod \"openstack-operator-index-s7jn8\" (UID: \"3b9db44a-8c71-40d7-b3b5-b70cd7b32bdf\") " pod="openstack-operators/openstack-operator-index-s7jn8" Mar 18 12:29:23 crc kubenswrapper[4975]: I0318 12:29:23.765624 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2c72\" (UniqueName: \"kubernetes.io/projected/3b9db44a-8c71-40d7-b3b5-b70cd7b32bdf-kube-api-access-q2c72\") pod \"openstack-operator-index-s7jn8\" (UID: \"3b9db44a-8c71-40d7-b3b5-b70cd7b32bdf\") " pod="openstack-operators/openstack-operator-index-s7jn8" Mar 18 12:29:23 crc kubenswrapper[4975]: I0318 12:29:23.785420 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2c72\" (UniqueName: \"kubernetes.io/projected/3b9db44a-8c71-40d7-b3b5-b70cd7b32bdf-kube-api-access-q2c72\") pod \"openstack-operator-index-s7jn8\" (UID: \"3b9db44a-8c71-40d7-b3b5-b70cd7b32bdf\") " pod="openstack-operators/openstack-operator-index-s7jn8" Mar 18 12:29:23 crc kubenswrapper[4975]: I0318 12:29:23.973548 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-s7jn8" Mar 18 12:29:24 crc kubenswrapper[4975]: I0318 12:29:24.079041 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jj6vl" event={"ID":"fcac19af-9e0b-4294-b0f7-2037471e883a","Type":"ContainerStarted","Data":"56324af2822271b0716a6eb13a14ea02eb38b247c59429ea5cb8c4efd2ecd781"} Mar 18 12:29:24 crc kubenswrapper[4975]: I0318 12:29:24.079160 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-jj6vl" podUID="fcac19af-9e0b-4294-b0f7-2037471e883a" containerName="registry-server" containerID="cri-o://56324af2822271b0716a6eb13a14ea02eb38b247c59429ea5cb8c4efd2ecd781" gracePeriod=2 Mar 18 12:29:24 crc kubenswrapper[4975]: I0318 12:29:24.095946 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-jj6vl" podStartSLOduration=2.409010819 podStartE2EDuration="5.095929031s" podCreationTimestamp="2026-03-18 12:29:19 +0000 UTC" firstStartedPulling="2026-03-18 12:29:20.38449162 +0000 UTC m=+1146.098892199" lastFinishedPulling="2026-03-18 12:29:23.071409832 +0000 UTC m=+1148.785810411" observedRunningTime="2026-03-18 12:29:24.093859534 +0000 UTC m=+1149.808260113" watchObservedRunningTime="2026-03-18 12:29:24.095929031 +0000 UTC m=+1149.810329610" Mar 18 12:29:24 crc kubenswrapper[4975]: I0318 12:29:24.189648 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-s7jn8"] Mar 18 12:29:24 crc kubenswrapper[4975]: I0318 12:29:24.443287 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jj6vl" Mar 18 12:29:24 crc kubenswrapper[4975]: I0318 12:29:24.577273 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tmtz\" (UniqueName: \"kubernetes.io/projected/fcac19af-9e0b-4294-b0f7-2037471e883a-kube-api-access-6tmtz\") pod \"fcac19af-9e0b-4294-b0f7-2037471e883a\" (UID: \"fcac19af-9e0b-4294-b0f7-2037471e883a\") " Mar 18 12:29:24 crc kubenswrapper[4975]: I0318 12:29:24.581732 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcac19af-9e0b-4294-b0f7-2037471e883a-kube-api-access-6tmtz" (OuterVolumeSpecName: "kube-api-access-6tmtz") pod "fcac19af-9e0b-4294-b0f7-2037471e883a" (UID: "fcac19af-9e0b-4294-b0f7-2037471e883a"). InnerVolumeSpecName "kube-api-access-6tmtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:29:24 crc kubenswrapper[4975]: I0318 12:29:24.679114 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tmtz\" (UniqueName: \"kubernetes.io/projected/fcac19af-9e0b-4294-b0f7-2037471e883a-kube-api-access-6tmtz\") on node \"crc\" DevicePath \"\"" Mar 18 12:29:25 crc kubenswrapper[4975]: I0318 12:29:25.075309 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vqh9l" Mar 18 12:29:25 crc kubenswrapper[4975]: I0318 12:29:25.086496 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-s7jn8" event={"ID":"3b9db44a-8c71-40d7-b3b5-b70cd7b32bdf","Type":"ContainerStarted","Data":"552ece88d7f3eebe77ed4c776a3d90e5b2b1ce4d9694d6adc22763bce77e3837"} Mar 18 12:29:25 crc kubenswrapper[4975]: I0318 12:29:25.086544 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-s7jn8" 
event={"ID":"3b9db44a-8c71-40d7-b3b5-b70cd7b32bdf","Type":"ContainerStarted","Data":"8185c9ca9e43331d03ec5f497e5994aca9285b98b6a4ede4881dc324fbba6b4c"} Mar 18 12:29:25 crc kubenswrapper[4975]: I0318 12:29:25.089392 4975 generic.go:334] "Generic (PLEG): container finished" podID="fcac19af-9e0b-4294-b0f7-2037471e883a" containerID="56324af2822271b0716a6eb13a14ea02eb38b247c59429ea5cb8c4efd2ecd781" exitCode=0 Mar 18 12:29:25 crc kubenswrapper[4975]: I0318 12:29:25.089426 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jj6vl" Mar 18 12:29:25 crc kubenswrapper[4975]: I0318 12:29:25.089472 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jj6vl" event={"ID":"fcac19af-9e0b-4294-b0f7-2037471e883a","Type":"ContainerDied","Data":"56324af2822271b0716a6eb13a14ea02eb38b247c59429ea5cb8c4efd2ecd781"} Mar 18 12:29:25 crc kubenswrapper[4975]: I0318 12:29:25.089526 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jj6vl" event={"ID":"fcac19af-9e0b-4294-b0f7-2037471e883a","Type":"ContainerDied","Data":"3c26015132c6a63265e5f2b008910bd64b5a16f638705a243ada70e4b0dc161e"} Mar 18 12:29:25 crc kubenswrapper[4975]: I0318 12:29:25.089546 4975 scope.go:117] "RemoveContainer" containerID="56324af2822271b0716a6eb13a14ea02eb38b247c59429ea5cb8c4efd2ecd781" Mar 18 12:29:25 crc kubenswrapper[4975]: I0318 12:29:25.124667 4975 scope.go:117] "RemoveContainer" containerID="56324af2822271b0716a6eb13a14ea02eb38b247c59429ea5cb8c4efd2ecd781" Mar 18 12:29:25 crc kubenswrapper[4975]: E0318 12:29:25.131432 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56324af2822271b0716a6eb13a14ea02eb38b247c59429ea5cb8c4efd2ecd781\": container with ID starting with 56324af2822271b0716a6eb13a14ea02eb38b247c59429ea5cb8c4efd2ecd781 not found: ID does not 
exist" containerID="56324af2822271b0716a6eb13a14ea02eb38b247c59429ea5cb8c4efd2ecd781" Mar 18 12:29:25 crc kubenswrapper[4975]: I0318 12:29:25.131488 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56324af2822271b0716a6eb13a14ea02eb38b247c59429ea5cb8c4efd2ecd781"} err="failed to get container status \"56324af2822271b0716a6eb13a14ea02eb38b247c59429ea5cb8c4efd2ecd781\": rpc error: code = NotFound desc = could not find container \"56324af2822271b0716a6eb13a14ea02eb38b247c59429ea5cb8c4efd2ecd781\": container with ID starting with 56324af2822271b0716a6eb13a14ea02eb38b247c59429ea5cb8c4efd2ecd781 not found: ID does not exist" Mar 18 12:29:25 crc kubenswrapper[4975]: I0318 12:29:25.135491 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-s7jn8" podStartSLOduration=2.061707177 podStartE2EDuration="2.13546483s" podCreationTimestamp="2026-03-18 12:29:23 +0000 UTC" firstStartedPulling="2026-03-18 12:29:24.195812078 +0000 UTC m=+1149.910212657" lastFinishedPulling="2026-03-18 12:29:24.269569731 +0000 UTC m=+1149.983970310" observedRunningTime="2026-03-18 12:29:25.125370205 +0000 UTC m=+1150.839770804" watchObservedRunningTime="2026-03-18 12:29:25.13546483 +0000 UTC m=+1150.849865409" Mar 18 12:29:25 crc kubenswrapper[4975]: I0318 12:29:25.140485 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-jj6vl"] Mar 18 12:29:25 crc kubenswrapper[4975]: I0318 12:29:25.144128 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-jj6vl"] Mar 18 12:29:25 crc kubenswrapper[4975]: I0318 12:29:25.538998 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Mar 18 12:29:25 crc kubenswrapper[4975]: I0318 12:29:25.539056 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:29:27 crc kubenswrapper[4975]: I0318 12:29:27.023922 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcac19af-9e0b-4294-b0f7-2037471e883a" path="/var/lib/kubelet/pods/fcac19af-9e0b-4294-b0f7-2037471e883a/volumes" Mar 18 12:29:33 crc kubenswrapper[4975]: I0318 12:29:33.974186 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-s7jn8" Mar 18 12:29:33 crc kubenswrapper[4975]: I0318 12:29:33.974525 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-s7jn8" Mar 18 12:29:34 crc kubenswrapper[4975]: I0318 12:29:34.006805 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-s7jn8" Mar 18 12:29:34 crc kubenswrapper[4975]: I0318 12:29:34.170218 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-s7jn8" Mar 18 12:29:35 crc kubenswrapper[4975]: I0318 12:29:35.084608 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-pslqs" Mar 18 12:29:41 crc kubenswrapper[4975]: I0318 12:29:41.656407 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg"] Mar 18 12:29:41 crc kubenswrapper[4975]: E0318 12:29:41.657838 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcac19af-9e0b-4294-b0f7-2037471e883a" 
containerName="registry-server" Mar 18 12:29:41 crc kubenswrapper[4975]: I0318 12:29:41.657857 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcac19af-9e0b-4294-b0f7-2037471e883a" containerName="registry-server" Mar 18 12:29:41 crc kubenswrapper[4975]: I0318 12:29:41.658032 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcac19af-9e0b-4294-b0f7-2037471e883a" containerName="registry-server" Mar 18 12:29:41 crc kubenswrapper[4975]: I0318 12:29:41.659329 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg" Mar 18 12:29:41 crc kubenswrapper[4975]: I0318 12:29:41.662338 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-l5jhn" Mar 18 12:29:41 crc kubenswrapper[4975]: I0318 12:29:41.664109 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg"] Mar 18 12:29:41 crc kubenswrapper[4975]: I0318 12:29:41.814412 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38e210dc-b7c2-447c-9c0c-324bc2c5176a-util\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg\" (UID: \"38e210dc-b7c2-447c-9c0c-324bc2c5176a\") " pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg" Mar 18 12:29:41 crc kubenswrapper[4975]: I0318 12:29:41.814460 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r965f\" (UniqueName: \"kubernetes.io/projected/38e210dc-b7c2-447c-9c0c-324bc2c5176a-kube-api-access-r965f\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg\" (UID: \"38e210dc-b7c2-447c-9c0c-324bc2c5176a\") " 
pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg" Mar 18 12:29:41 crc kubenswrapper[4975]: I0318 12:29:41.814497 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38e210dc-b7c2-447c-9c0c-324bc2c5176a-bundle\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg\" (UID: \"38e210dc-b7c2-447c-9c0c-324bc2c5176a\") " pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg" Mar 18 12:29:41 crc kubenswrapper[4975]: I0318 12:29:41.916304 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38e210dc-b7c2-447c-9c0c-324bc2c5176a-util\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg\" (UID: \"38e210dc-b7c2-447c-9c0c-324bc2c5176a\") " pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg" Mar 18 12:29:41 crc kubenswrapper[4975]: I0318 12:29:41.916409 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r965f\" (UniqueName: \"kubernetes.io/projected/38e210dc-b7c2-447c-9c0c-324bc2c5176a-kube-api-access-r965f\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg\" (UID: \"38e210dc-b7c2-447c-9c0c-324bc2c5176a\") " pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg" Mar 18 12:29:41 crc kubenswrapper[4975]: I0318 12:29:41.916508 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38e210dc-b7c2-447c-9c0c-324bc2c5176a-bundle\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg\" (UID: \"38e210dc-b7c2-447c-9c0c-324bc2c5176a\") " pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg" Mar 18 12:29:41 crc kubenswrapper[4975]: I0318 
12:29:41.917317 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38e210dc-b7c2-447c-9c0c-324bc2c5176a-util\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg\" (UID: \"38e210dc-b7c2-447c-9c0c-324bc2c5176a\") " pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg" Mar 18 12:29:41 crc kubenswrapper[4975]: I0318 12:29:41.917485 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38e210dc-b7c2-447c-9c0c-324bc2c5176a-bundle\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg\" (UID: \"38e210dc-b7c2-447c-9c0c-324bc2c5176a\") " pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg" Mar 18 12:29:41 crc kubenswrapper[4975]: I0318 12:29:41.943179 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r965f\" (UniqueName: \"kubernetes.io/projected/38e210dc-b7c2-447c-9c0c-324bc2c5176a-kube-api-access-r965f\") pod \"0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg\" (UID: \"38e210dc-b7c2-447c-9c0c-324bc2c5176a\") " pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg" Mar 18 12:29:41 crc kubenswrapper[4975]: I0318 12:29:41.975460 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg" Mar 18 12:29:42 crc kubenswrapper[4975]: I0318 12:29:42.431911 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg"] Mar 18 12:29:43 crc kubenswrapper[4975]: I0318 12:29:43.240368 4975 generic.go:334] "Generic (PLEG): container finished" podID="38e210dc-b7c2-447c-9c0c-324bc2c5176a" containerID="6e948b33ac9a2580c44a79a13553107824b8b9bc6ba19b47024e28fe7d6e294a" exitCode=0 Mar 18 12:29:43 crc kubenswrapper[4975]: I0318 12:29:43.240426 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg" event={"ID":"38e210dc-b7c2-447c-9c0c-324bc2c5176a","Type":"ContainerDied","Data":"6e948b33ac9a2580c44a79a13553107824b8b9bc6ba19b47024e28fe7d6e294a"} Mar 18 12:29:43 crc kubenswrapper[4975]: I0318 12:29:43.240640 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg" event={"ID":"38e210dc-b7c2-447c-9c0c-324bc2c5176a","Type":"ContainerStarted","Data":"d015b425f6d4633798b923004f2909277fdc1e2d3ee21b65601dd593587999f5"} Mar 18 12:29:44 crc kubenswrapper[4975]: I0318 12:29:44.251411 4975 generic.go:334] "Generic (PLEG): container finished" podID="38e210dc-b7c2-447c-9c0c-324bc2c5176a" containerID="bbd79239b752079f80f6e3ae2f85b6c23fe56faa6f7f6c4a1ddd86e08a48e4cb" exitCode=0 Mar 18 12:29:44 crc kubenswrapper[4975]: I0318 12:29:44.251515 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg" event={"ID":"38e210dc-b7c2-447c-9c0c-324bc2c5176a","Type":"ContainerDied","Data":"bbd79239b752079f80f6e3ae2f85b6c23fe56faa6f7f6c4a1ddd86e08a48e4cb"} Mar 18 12:29:45 crc kubenswrapper[4975]: I0318 12:29:45.258907 4975 generic.go:334] 
"Generic (PLEG): container finished" podID="38e210dc-b7c2-447c-9c0c-324bc2c5176a" containerID="2da12f4f20a38f476f5ed7b3e2af34b9b7eee559ee5831d33045bb0ed16a1157" exitCode=0 Mar 18 12:29:45 crc kubenswrapper[4975]: I0318 12:29:45.258960 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg" event={"ID":"38e210dc-b7c2-447c-9c0c-324bc2c5176a","Type":"ContainerDied","Data":"2da12f4f20a38f476f5ed7b3e2af34b9b7eee559ee5831d33045bb0ed16a1157"} Mar 18 12:29:46 crc kubenswrapper[4975]: I0318 12:29:46.505768 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg" Mar 18 12:29:46 crc kubenswrapper[4975]: I0318 12:29:46.590741 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38e210dc-b7c2-447c-9c0c-324bc2c5176a-bundle\") pod \"38e210dc-b7c2-447c-9c0c-324bc2c5176a\" (UID: \"38e210dc-b7c2-447c-9c0c-324bc2c5176a\") " Mar 18 12:29:46 crc kubenswrapper[4975]: I0318 12:29:46.591112 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r965f\" (UniqueName: \"kubernetes.io/projected/38e210dc-b7c2-447c-9c0c-324bc2c5176a-kube-api-access-r965f\") pod \"38e210dc-b7c2-447c-9c0c-324bc2c5176a\" (UID: \"38e210dc-b7c2-447c-9c0c-324bc2c5176a\") " Mar 18 12:29:46 crc kubenswrapper[4975]: I0318 12:29:46.591159 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38e210dc-b7c2-447c-9c0c-324bc2c5176a-util\") pod \"38e210dc-b7c2-447c-9c0c-324bc2c5176a\" (UID: \"38e210dc-b7c2-447c-9c0c-324bc2c5176a\") " Mar 18 12:29:46 crc kubenswrapper[4975]: I0318 12:29:46.591993 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/38e210dc-b7c2-447c-9c0c-324bc2c5176a-bundle" (OuterVolumeSpecName: "bundle") pod "38e210dc-b7c2-447c-9c0c-324bc2c5176a" (UID: "38e210dc-b7c2-447c-9c0c-324bc2c5176a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:29:46 crc kubenswrapper[4975]: I0318 12:29:46.602049 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38e210dc-b7c2-447c-9c0c-324bc2c5176a-kube-api-access-r965f" (OuterVolumeSpecName: "kube-api-access-r965f") pod "38e210dc-b7c2-447c-9c0c-324bc2c5176a" (UID: "38e210dc-b7c2-447c-9c0c-324bc2c5176a"). InnerVolumeSpecName "kube-api-access-r965f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:29:46 crc kubenswrapper[4975]: I0318 12:29:46.605238 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38e210dc-b7c2-447c-9c0c-324bc2c5176a-util" (OuterVolumeSpecName: "util") pod "38e210dc-b7c2-447c-9c0c-324bc2c5176a" (UID: "38e210dc-b7c2-447c-9c0c-324bc2c5176a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:29:46 crc kubenswrapper[4975]: I0318 12:29:46.692470 4975 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38e210dc-b7c2-447c-9c0c-324bc2c5176a-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:29:46 crc kubenswrapper[4975]: I0318 12:29:46.692514 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r965f\" (UniqueName: \"kubernetes.io/projected/38e210dc-b7c2-447c-9c0c-324bc2c5176a-kube-api-access-r965f\") on node \"crc\" DevicePath \"\"" Mar 18 12:29:46 crc kubenswrapper[4975]: I0318 12:29:46.692529 4975 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38e210dc-b7c2-447c-9c0c-324bc2c5176a-util\") on node \"crc\" DevicePath \"\"" Mar 18 12:29:47 crc kubenswrapper[4975]: I0318 12:29:47.272994 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg" event={"ID":"38e210dc-b7c2-447c-9c0c-324bc2c5176a","Type":"ContainerDied","Data":"d015b425f6d4633798b923004f2909277fdc1e2d3ee21b65601dd593587999f5"} Mar 18 12:29:47 crc kubenswrapper[4975]: I0318 12:29:47.273072 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d015b425f6d4633798b923004f2909277fdc1e2d3ee21b65601dd593587999f5" Mar 18 12:29:47 crc kubenswrapper[4975]: I0318 12:29:47.273110 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg" Mar 18 12:29:53 crc kubenswrapper[4975]: I0318 12:29:53.202434 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-68ccf9867-9mxng"] Mar 18 12:29:53 crc kubenswrapper[4975]: E0318 12:29:53.202984 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38e210dc-b7c2-447c-9c0c-324bc2c5176a" containerName="util" Mar 18 12:29:53 crc kubenswrapper[4975]: I0318 12:29:53.202995 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="38e210dc-b7c2-447c-9c0c-324bc2c5176a" containerName="util" Mar 18 12:29:53 crc kubenswrapper[4975]: E0318 12:29:53.203005 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38e210dc-b7c2-447c-9c0c-324bc2c5176a" containerName="pull" Mar 18 12:29:53 crc kubenswrapper[4975]: I0318 12:29:53.203010 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="38e210dc-b7c2-447c-9c0c-324bc2c5176a" containerName="pull" Mar 18 12:29:53 crc kubenswrapper[4975]: E0318 12:29:53.203026 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38e210dc-b7c2-447c-9c0c-324bc2c5176a" containerName="extract" Mar 18 12:29:53 crc kubenswrapper[4975]: I0318 12:29:53.203032 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="38e210dc-b7c2-447c-9c0c-324bc2c5176a" containerName="extract" Mar 18 12:29:53 crc kubenswrapper[4975]: I0318 12:29:53.203143 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="38e210dc-b7c2-447c-9c0c-324bc2c5176a" containerName="extract" Mar 18 12:29:53 crc kubenswrapper[4975]: I0318 12:29:53.203506 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-68ccf9867-9mxng" Mar 18 12:29:53 crc kubenswrapper[4975]: I0318 12:29:53.205798 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-x9wxb" Mar 18 12:29:53 crc kubenswrapper[4975]: I0318 12:29:53.230495 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-68ccf9867-9mxng"] Mar 18 12:29:53 crc kubenswrapper[4975]: I0318 12:29:53.376752 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbl68\" (UniqueName: \"kubernetes.io/projected/a4d01e56-0d21-4c32-9f31-a1adc02598db-kube-api-access-rbl68\") pod \"openstack-operator-controller-init-68ccf9867-9mxng\" (UID: \"a4d01e56-0d21-4c32-9f31-a1adc02598db\") " pod="openstack-operators/openstack-operator-controller-init-68ccf9867-9mxng" Mar 18 12:29:53 crc kubenswrapper[4975]: I0318 12:29:53.477720 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbl68\" (UniqueName: \"kubernetes.io/projected/a4d01e56-0d21-4c32-9f31-a1adc02598db-kube-api-access-rbl68\") pod \"openstack-operator-controller-init-68ccf9867-9mxng\" (UID: \"a4d01e56-0d21-4c32-9f31-a1adc02598db\") " pod="openstack-operators/openstack-operator-controller-init-68ccf9867-9mxng" Mar 18 12:29:53 crc kubenswrapper[4975]: I0318 12:29:53.496523 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbl68\" (UniqueName: \"kubernetes.io/projected/a4d01e56-0d21-4c32-9f31-a1adc02598db-kube-api-access-rbl68\") pod \"openstack-operator-controller-init-68ccf9867-9mxng\" (UID: \"a4d01e56-0d21-4c32-9f31-a1adc02598db\") " pod="openstack-operators/openstack-operator-controller-init-68ccf9867-9mxng" Mar 18 12:29:53 crc kubenswrapper[4975]: I0318 12:29:53.523081 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-68ccf9867-9mxng" Mar 18 12:29:53 crc kubenswrapper[4975]: I0318 12:29:53.837582 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-68ccf9867-9mxng"] Mar 18 12:29:53 crc kubenswrapper[4975]: W0318 12:29:53.843670 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4d01e56_0d21_4c32_9f31_a1adc02598db.slice/crio-d544f2325a6fbea15524f4a6cfefe4fa73d8028703f8551e5161ea7fc2a5206e WatchSource:0}: Error finding container d544f2325a6fbea15524f4a6cfefe4fa73d8028703f8551e5161ea7fc2a5206e: Status 404 returned error can't find the container with id d544f2325a6fbea15524f4a6cfefe4fa73d8028703f8551e5161ea7fc2a5206e Mar 18 12:29:54 crc kubenswrapper[4975]: I0318 12:29:54.326931 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-68ccf9867-9mxng" event={"ID":"a4d01e56-0d21-4c32-9f31-a1adc02598db","Type":"ContainerStarted","Data":"d544f2325a6fbea15524f4a6cfefe4fa73d8028703f8551e5161ea7fc2a5206e"} Mar 18 12:29:55 crc kubenswrapper[4975]: I0318 12:29:55.544669 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:29:55 crc kubenswrapper[4975]: I0318 12:29:55.545016 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:29:55 crc kubenswrapper[4975]: I0318 12:29:55.545068 4975 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 12:29:55 crc kubenswrapper[4975]: I0318 12:29:55.545880 4975 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b3ad49f3300a39909733b143700abc28ad83ea2ad2f5fc6a9b69e95819adb98f"} pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:29:55 crc kubenswrapper[4975]: I0318 12:29:55.545959 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" containerID="cri-o://b3ad49f3300a39909733b143700abc28ad83ea2ad2f5fc6a9b69e95819adb98f" gracePeriod=600 Mar 18 12:29:56 crc kubenswrapper[4975]: I0318 12:29:56.566310 4975 generic.go:334] "Generic (PLEG): container finished" podID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerID="b3ad49f3300a39909733b143700abc28ad83ea2ad2f5fc6a9b69e95819adb98f" exitCode=0 Mar 18 12:29:56 crc kubenswrapper[4975]: I0318 12:29:56.566392 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerDied","Data":"b3ad49f3300a39909733b143700abc28ad83ea2ad2f5fc6a9b69e95819adb98f"} Mar 18 12:29:56 crc kubenswrapper[4975]: I0318 12:29:56.566489 4975 scope.go:117] "RemoveContainer" containerID="142b541a7de05fd46269c85cd3392d764eb097aeaa954b82530c1118b45a05b8" Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.150998 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563950-4n242"] Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.153303 4975 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563950-4n242" Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.162016 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563950-26v6d"] Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.162516 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.162812 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.163094 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-26v6d" Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.163488 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.171770 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.171932 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.178231 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563950-4n242"] Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.188950 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563950-26v6d"] Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.205593 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/565e8dc1-44b2-4dd9-9653-950d50c6d914-secret-volume\") pod \"collect-profiles-29563950-26v6d\" (UID: \"565e8dc1-44b2-4dd9-9653-950d50c6d914\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-26v6d" Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.205670 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/565e8dc1-44b2-4dd9-9653-950d50c6d914-config-volume\") pod \"collect-profiles-29563950-26v6d\" (UID: \"565e8dc1-44b2-4dd9-9653-950d50c6d914\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-26v6d" Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.205723 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v247\" (UniqueName: \"kubernetes.io/projected/565e8dc1-44b2-4dd9-9653-950d50c6d914-kube-api-access-8v247\") pod \"collect-profiles-29563950-26v6d\" (UID: \"565e8dc1-44b2-4dd9-9653-950d50c6d914\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-26v6d" Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.205754 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np2hw\" (UniqueName: \"kubernetes.io/projected/28462947-5e88-43df-a70b-1c4e5b99215c-kube-api-access-np2hw\") pod \"auto-csr-approver-29563950-4n242\" (UID: \"28462947-5e88-43df-a70b-1c4e5b99215c\") " pod="openshift-infra/auto-csr-approver-29563950-4n242" Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.306615 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/565e8dc1-44b2-4dd9-9653-950d50c6d914-secret-volume\") pod \"collect-profiles-29563950-26v6d\" (UID: \"565e8dc1-44b2-4dd9-9653-950d50c6d914\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-26v6d" Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.306681 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/565e8dc1-44b2-4dd9-9653-950d50c6d914-config-volume\") pod \"collect-profiles-29563950-26v6d\" (UID: \"565e8dc1-44b2-4dd9-9653-950d50c6d914\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-26v6d" Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.306726 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v247\" (UniqueName: \"kubernetes.io/projected/565e8dc1-44b2-4dd9-9653-950d50c6d914-kube-api-access-8v247\") pod \"collect-profiles-29563950-26v6d\" (UID: \"565e8dc1-44b2-4dd9-9653-950d50c6d914\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-26v6d" Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.306762 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np2hw\" (UniqueName: \"kubernetes.io/projected/28462947-5e88-43df-a70b-1c4e5b99215c-kube-api-access-np2hw\") pod \"auto-csr-approver-29563950-4n242\" (UID: \"28462947-5e88-43df-a70b-1c4e5b99215c\") " pod="openshift-infra/auto-csr-approver-29563950-4n242" Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.307858 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/565e8dc1-44b2-4dd9-9653-950d50c6d914-config-volume\") pod \"collect-profiles-29563950-26v6d\" (UID: \"565e8dc1-44b2-4dd9-9653-950d50c6d914\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-26v6d" Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.318231 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/565e8dc1-44b2-4dd9-9653-950d50c6d914-secret-volume\") pod \"collect-profiles-29563950-26v6d\" (UID: \"565e8dc1-44b2-4dd9-9653-950d50c6d914\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-26v6d" Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.330645 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np2hw\" (UniqueName: \"kubernetes.io/projected/28462947-5e88-43df-a70b-1c4e5b99215c-kube-api-access-np2hw\") pod \"auto-csr-approver-29563950-4n242\" (UID: \"28462947-5e88-43df-a70b-1c4e5b99215c\") " pod="openshift-infra/auto-csr-approver-29563950-4n242" Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.343669 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v247\" (UniqueName: \"kubernetes.io/projected/565e8dc1-44b2-4dd9-9653-950d50c6d914-kube-api-access-8v247\") pod \"collect-profiles-29563950-26v6d\" (UID: \"565e8dc1-44b2-4dd9-9653-950d50c6d914\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-26v6d" Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.469613 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563950-4n242" Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.496467 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-26v6d" Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.602393 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerStarted","Data":"d846cb3e61bc67fa3212660cebeebcacd3a57cbf2e5bcba7bd344d98d42cef45"} Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.605632 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-68ccf9867-9mxng" event={"ID":"a4d01e56-0d21-4c32-9f31-a1adc02598db","Type":"ContainerStarted","Data":"fb29614ab558b84b6ef826c8fff0c84a8a1b4808c50e6278dde615b9827c4b28"} Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.606258 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-68ccf9867-9mxng" Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.661535 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-68ccf9867-9mxng" podStartSLOduration=1.744979504 podStartE2EDuration="7.661514793s" podCreationTimestamp="2026-03-18 12:29:53 +0000 UTC" firstStartedPulling="2026-03-18 12:29:53.846365851 +0000 UTC m=+1179.560766430" lastFinishedPulling="2026-03-18 12:29:59.76290114 +0000 UTC m=+1185.477301719" observedRunningTime="2026-03-18 12:30:00.657995487 +0000 UTC m=+1186.372396076" watchObservedRunningTime="2026-03-18 12:30:00.661514793 +0000 UTC m=+1186.375915372" Mar 18 12:30:00 crc kubenswrapper[4975]: I0318 12:30:00.965106 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563950-4n242"] Mar 18 12:30:01 crc kubenswrapper[4975]: I0318 12:30:01.070137 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29563950-26v6d"] Mar 18 12:30:01 crc kubenswrapper[4975]: W0318 12:30:01.075151 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod565e8dc1_44b2_4dd9_9653_950d50c6d914.slice/crio-50d1d652dd03922660c82aa45d5929466767f8604163104f48409d29fc495fec WatchSource:0}: Error finding container 50d1d652dd03922660c82aa45d5929466767f8604163104f48409d29fc495fec: Status 404 returned error can't find the container with id 50d1d652dd03922660c82aa45d5929466767f8604163104f48409d29fc495fec Mar 18 12:30:01 crc kubenswrapper[4975]: I0318 12:30:01.616308 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563950-4n242" event={"ID":"28462947-5e88-43df-a70b-1c4e5b99215c","Type":"ContainerStarted","Data":"44a84bd6e38dd96e87cc6d47b85d688ef75691d412af7437728ac42d2fba68ae"} Mar 18 12:30:01 crc kubenswrapper[4975]: I0318 12:30:01.618428 4975 generic.go:334] "Generic (PLEG): container finished" podID="565e8dc1-44b2-4dd9-9653-950d50c6d914" containerID="e6b17942904af52284aa23e5b388e902ee8ecc96432b3da5e06998c86a351b9d" exitCode=0 Mar 18 12:30:01 crc kubenswrapper[4975]: I0318 12:30:01.618785 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-26v6d" event={"ID":"565e8dc1-44b2-4dd9-9653-950d50c6d914","Type":"ContainerDied","Data":"e6b17942904af52284aa23e5b388e902ee8ecc96432b3da5e06998c86a351b9d"} Mar 18 12:30:01 crc kubenswrapper[4975]: I0318 12:30:01.618830 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-26v6d" event={"ID":"565e8dc1-44b2-4dd9-9653-950d50c6d914","Type":"ContainerStarted","Data":"50d1d652dd03922660c82aa45d5929466767f8604163104f48409d29fc495fec"} Mar 18 12:30:02 crc kubenswrapper[4975]: I0318 12:30:02.863278 4975 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-26v6d" Mar 18 12:30:03 crc kubenswrapper[4975]: I0318 12:30:03.038756 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/565e8dc1-44b2-4dd9-9653-950d50c6d914-config-volume\") pod \"565e8dc1-44b2-4dd9-9653-950d50c6d914\" (UID: \"565e8dc1-44b2-4dd9-9653-950d50c6d914\") " Mar 18 12:30:03 crc kubenswrapper[4975]: I0318 12:30:03.038834 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v247\" (UniqueName: \"kubernetes.io/projected/565e8dc1-44b2-4dd9-9653-950d50c6d914-kube-api-access-8v247\") pod \"565e8dc1-44b2-4dd9-9653-950d50c6d914\" (UID: \"565e8dc1-44b2-4dd9-9653-950d50c6d914\") " Mar 18 12:30:03 crc kubenswrapper[4975]: I0318 12:30:03.039024 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/565e8dc1-44b2-4dd9-9653-950d50c6d914-secret-volume\") pod \"565e8dc1-44b2-4dd9-9653-950d50c6d914\" (UID: \"565e8dc1-44b2-4dd9-9653-950d50c6d914\") " Mar 18 12:30:03 crc kubenswrapper[4975]: I0318 12:30:03.039624 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/565e8dc1-44b2-4dd9-9653-950d50c6d914-config-volume" (OuterVolumeSpecName: "config-volume") pod "565e8dc1-44b2-4dd9-9653-950d50c6d914" (UID: "565e8dc1-44b2-4dd9-9653-950d50c6d914"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:30:03 crc kubenswrapper[4975]: I0318 12:30:03.044016 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/565e8dc1-44b2-4dd9-9653-950d50c6d914-kube-api-access-8v247" (OuterVolumeSpecName: "kube-api-access-8v247") pod "565e8dc1-44b2-4dd9-9653-950d50c6d914" (UID: "565e8dc1-44b2-4dd9-9653-950d50c6d914"). 
InnerVolumeSpecName "kube-api-access-8v247". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:30:03 crc kubenswrapper[4975]: I0318 12:30:03.044029 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/565e8dc1-44b2-4dd9-9653-950d50c6d914-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "565e8dc1-44b2-4dd9-9653-950d50c6d914" (UID: "565e8dc1-44b2-4dd9-9653-950d50c6d914"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:30:03 crc kubenswrapper[4975]: I0318 12:30:03.140195 4975 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/565e8dc1-44b2-4dd9-9653-950d50c6d914-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:03 crc kubenswrapper[4975]: I0318 12:30:03.140223 4975 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/565e8dc1-44b2-4dd9-9653-950d50c6d914-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:03 crc kubenswrapper[4975]: I0318 12:30:03.140232 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v247\" (UniqueName: \"kubernetes.io/projected/565e8dc1-44b2-4dd9-9653-950d50c6d914-kube-api-access-8v247\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:03 crc kubenswrapper[4975]: I0318 12:30:03.630331 4975 generic.go:334] "Generic (PLEG): container finished" podID="28462947-5e88-43df-a70b-1c4e5b99215c" containerID="cdfad74cb6dd30d81beb5cfd74d9b07938416a0d9d2fcf09ea8dbfd21aa40be4" exitCode=0 Mar 18 12:30:03 crc kubenswrapper[4975]: I0318 12:30:03.630398 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563950-4n242" event={"ID":"28462947-5e88-43df-a70b-1c4e5b99215c","Type":"ContainerDied","Data":"cdfad74cb6dd30d81beb5cfd74d9b07938416a0d9d2fcf09ea8dbfd21aa40be4"} Mar 18 12:30:03 crc kubenswrapper[4975]: I0318 12:30:03.631623 4975 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-26v6d" event={"ID":"565e8dc1-44b2-4dd9-9653-950d50c6d914","Type":"ContainerDied","Data":"50d1d652dd03922660c82aa45d5929466767f8604163104f48409d29fc495fec"} Mar 18 12:30:03 crc kubenswrapper[4975]: I0318 12:30:03.631650 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50d1d652dd03922660c82aa45d5929466767f8604163104f48409d29fc495fec" Mar 18 12:30:03 crc kubenswrapper[4975]: I0318 12:30:03.631662 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563950-26v6d" Mar 18 12:30:04 crc kubenswrapper[4975]: I0318 12:30:04.853700 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563950-4n242" Mar 18 12:30:04 crc kubenswrapper[4975]: I0318 12:30:04.986924 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np2hw\" (UniqueName: \"kubernetes.io/projected/28462947-5e88-43df-a70b-1c4e5b99215c-kube-api-access-np2hw\") pod \"28462947-5e88-43df-a70b-1c4e5b99215c\" (UID: \"28462947-5e88-43df-a70b-1c4e5b99215c\") " Mar 18 12:30:04 crc kubenswrapper[4975]: I0318 12:30:04.991390 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28462947-5e88-43df-a70b-1c4e5b99215c-kube-api-access-np2hw" (OuterVolumeSpecName: "kube-api-access-np2hw") pod "28462947-5e88-43df-a70b-1c4e5b99215c" (UID: "28462947-5e88-43df-a70b-1c4e5b99215c"). InnerVolumeSpecName "kube-api-access-np2hw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:30:05 crc kubenswrapper[4975]: I0318 12:30:05.089504 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np2hw\" (UniqueName: \"kubernetes.io/projected/28462947-5e88-43df-a70b-1c4e5b99215c-kube-api-access-np2hw\") on node \"crc\" DevicePath \"\"" Mar 18 12:30:05 crc kubenswrapper[4975]: I0318 12:30:05.642656 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563950-4n242" event={"ID":"28462947-5e88-43df-a70b-1c4e5b99215c","Type":"ContainerDied","Data":"44a84bd6e38dd96e87cc6d47b85d688ef75691d412af7437728ac42d2fba68ae"} Mar 18 12:30:05 crc kubenswrapper[4975]: I0318 12:30:05.642699 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44a84bd6e38dd96e87cc6d47b85d688ef75691d412af7437728ac42d2fba68ae" Mar 18 12:30:05 crc kubenswrapper[4975]: I0318 12:30:05.642759 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563950-4n242" Mar 18 12:30:05 crc kubenswrapper[4975]: I0318 12:30:05.915254 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563944-xm5fl"] Mar 18 12:30:05 crc kubenswrapper[4975]: I0318 12:30:05.919697 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563944-xm5fl"] Mar 18 12:30:07 crc kubenswrapper[4975]: I0318 12:30:07.025629 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faf85139-fd24-430e-a781-054357d8c8dc" path="/var/lib/kubelet/pods/faf85139-fd24-430e-a781-054357d8c8dc/volumes" Mar 18 12:30:13 crc kubenswrapper[4975]: I0318 12:30:13.526679 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-68ccf9867-9mxng" Mar 18 12:30:33 crc kubenswrapper[4975]: I0318 12:30:33.930009 4975 scope.go:117] "RemoveContainer" 
containerID="69ed0d22369b45cd29e2c7d7596fb5dd276e530a132ce4e064441a91a32c3fc8" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.513888 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-9hd99"] Mar 18 12:30:42 crc kubenswrapper[4975]: E0318 12:30:42.514927 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="565e8dc1-44b2-4dd9-9653-950d50c6d914" containerName="collect-profiles" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.514940 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="565e8dc1-44b2-4dd9-9653-950d50c6d914" containerName="collect-profiles" Mar 18 12:30:42 crc kubenswrapper[4975]: E0318 12:30:42.514984 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28462947-5e88-43df-a70b-1c4e5b99215c" containerName="oc" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.514992 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="28462947-5e88-43df-a70b-1c4e5b99215c" containerName="oc" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.515254 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="28462947-5e88-43df-a70b-1c4e5b99215c" containerName="oc" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.515298 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="565e8dc1-44b2-4dd9-9653-950d50c6d914" containerName="collect-profiles" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.517615 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-9hd99" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.530642 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-t79fc" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.564228 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-7lzk9"] Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.571416 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-dp5dh"] Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.575487 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-7lzk9" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.576732 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-dp5dh" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.580280 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-lw8mn" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.580571 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-qwcnf" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.590026 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-9hd99"] Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.596022 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-7lzk9"] Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.618776 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fw2v\" (UniqueName: \"kubernetes.io/projected/84bd8990-a50a-4fb2-88d3-e3141ef24b7d-kube-api-access-7fw2v\") pod \"cinder-operator-controller-manager-8d58dc466-7lzk9\" (UID: \"84bd8990-a50a-4fb2-88d3-e3141ef24b7d\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-7lzk9" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.629795 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-dp5dh"] Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.634833 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-wjt2j"] Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.635628 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wjt2j" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.639239 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-v8749" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.643244 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-v4ngr"] Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.644284 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-v4ngr" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.651381 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-tx6cd" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.671476 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-wjt2j"] Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.691983 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-v4ngr"] Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.704053 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-jthrm"] Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.705034 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jthrm" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.712226 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-79rbf" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.717529 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-tfkfx"] Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.718554 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-tfkfx" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.719897 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdh5w\" (UniqueName: \"kubernetes.io/projected/675b1757-67bf-4d6d-9947-31a4da13a1be-kube-api-access-cdh5w\") pod \"designate-operator-controller-manager-588d4d986b-dp5dh\" (UID: \"675b1757-67bf-4d6d-9947-31a4da13a1be\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-dp5dh" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.719940 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fw2v\" (UniqueName: \"kubernetes.io/projected/84bd8990-a50a-4fb2-88d3-e3141ef24b7d-kube-api-access-7fw2v\") pod \"cinder-operator-controller-manager-8d58dc466-7lzk9\" (UID: \"84bd8990-a50a-4fb2-88d3-e3141ef24b7d\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-7lzk9" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.720012 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-469jt\" (UniqueName: \"kubernetes.io/projected/4b5bf68a-b7e7-4df7-ab69-2b8eac9c1091-kube-api-access-469jt\") pod 
\"barbican-operator-controller-manager-59bc569d95-9hd99\" (UID: \"4b5bf68a-b7e7-4df7-ab69-2b8eac9c1091\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-9hd99" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.721494 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.721581 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-ttk54" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.736234 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-jthrm"] Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.773968 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-tfkfx"] Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.791951 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-hqgd2"] Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.793203 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-hqgd2" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.795242 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-hqgd2"] Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.806245 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-25vhb" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.812923 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fw2v\" (UniqueName: \"kubernetes.io/projected/84bd8990-a50a-4fb2-88d3-e3141ef24b7d-kube-api-access-7fw2v\") pod \"cinder-operator-controller-manager-8d58dc466-7lzk9\" (UID: \"84bd8990-a50a-4fb2-88d3-e3141ef24b7d\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-7lzk9" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.820726 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkrkj\" (UniqueName: \"kubernetes.io/projected/14e0c597-a515-4b44-908e-3737f385d7c3-kube-api-access-jkrkj\") pod \"heat-operator-controller-manager-67dd5f86f5-v4ngr\" (UID: \"14e0c597-a515-4b44-908e-3737f385d7c3\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-v4ngr" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.820800 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdh5w\" (UniqueName: \"kubernetes.io/projected/675b1757-67bf-4d6d-9947-31a4da13a1be-kube-api-access-cdh5w\") pod \"designate-operator-controller-manager-588d4d986b-dp5dh\" (UID: \"675b1757-67bf-4d6d-9947-31a4da13a1be\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-dp5dh" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.820844 4975 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dkmp\" (UniqueName: \"kubernetes.io/projected/1b122375-f65a-4f05-a738-41eab6a8fcd3-kube-api-access-5dkmp\") pod \"glance-operator-controller-manager-79df6bcc97-wjt2j\" (UID: \"1b122375-f65a-4f05-a738-41eab6a8fcd3\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wjt2j" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.820891 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm2bt\" (UniqueName: \"kubernetes.io/projected/2c3688cf-2e2c-434c-88a7-10ac1a4949b2-kube-api-access-lm2bt\") pod \"horizon-operator-controller-manager-8464cc45fb-jthrm\" (UID: \"2c3688cf-2e2c-434c-88a7-10ac1a4949b2\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jthrm" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.820923 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d854c129-c4eb-4c08-a398-3549f4ff9047-cert\") pod \"infra-operator-controller-manager-7b9c774f96-tfkfx\" (UID: \"d854c129-c4eb-4c08-a398-3549f4ff9047\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-tfkfx" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.820967 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-469jt\" (UniqueName: \"kubernetes.io/projected/4b5bf68a-b7e7-4df7-ab69-2b8eac9c1091-kube-api-access-469jt\") pod \"barbican-operator-controller-manager-59bc569d95-9hd99\" (UID: \"4b5bf68a-b7e7-4df7-ab69-2b8eac9c1091\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-9hd99" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.821004 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6j9d\" (UniqueName: 
\"kubernetes.io/projected/d854c129-c4eb-4c08-a398-3549f4ff9047-kube-api-access-f6j9d\") pod \"infra-operator-controller-manager-7b9c774f96-tfkfx\" (UID: \"d854c129-c4eb-4c08-a398-3549f4ff9047\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-tfkfx" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.824925 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-lcsc9"] Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.826056 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-lcsc9" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.828995 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-79hrn" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.837364 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-qhnk6"] Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.838342 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-qhnk6" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.840147 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-6bgqs" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.845946 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-lcsc9"] Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.855270 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-469jt\" (UniqueName: \"kubernetes.io/projected/4b5bf68a-b7e7-4df7-ab69-2b8eac9c1091-kube-api-access-469jt\") pod \"barbican-operator-controller-manager-59bc569d95-9hd99\" (UID: \"4b5bf68a-b7e7-4df7-ab69-2b8eac9c1091\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-9hd99" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.858016 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdh5w\" (UniqueName: \"kubernetes.io/projected/675b1757-67bf-4d6d-9947-31a4da13a1be-kube-api-access-cdh5w\") pod \"designate-operator-controller-manager-588d4d986b-dp5dh\" (UID: \"675b1757-67bf-4d6d-9947-31a4da13a1be\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-dp5dh" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.869430 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-9bhc7"] Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.870469 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-9bhc7" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.881413 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-qkrvk" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.885235 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-9hd99" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.908124 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-qhnk6"] Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.918562 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-7lzk9" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.923890 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dkmp\" (UniqueName: \"kubernetes.io/projected/1b122375-f65a-4f05-a738-41eab6a8fcd3-kube-api-access-5dkmp\") pod \"glance-operator-controller-manager-79df6bcc97-wjt2j\" (UID: \"1b122375-f65a-4f05-a738-41eab6a8fcd3\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wjt2j" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.924570 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm2bt\" (UniqueName: \"kubernetes.io/projected/2c3688cf-2e2c-434c-88a7-10ac1a4949b2-kube-api-access-lm2bt\") pod \"horizon-operator-controller-manager-8464cc45fb-jthrm\" (UID: \"2c3688cf-2e2c-434c-88a7-10ac1a4949b2\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jthrm" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.924644 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/d854c129-c4eb-4c08-a398-3549f4ff9047-cert\") pod \"infra-operator-controller-manager-7b9c774f96-tfkfx\" (UID: \"d854c129-c4eb-4c08-a398-3549f4ff9047\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-tfkfx" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.924726 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr847\" (UniqueName: \"kubernetes.io/projected/4509daad-a22e-4801-891d-b0b8ea78ccb0-kube-api-access-sr847\") pod \"ironic-operator-controller-manager-6f787dddc9-hqgd2\" (UID: \"4509daad-a22e-4801-891d-b0b8ea78ccb0\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-hqgd2" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.924809 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6j9d\" (UniqueName: \"kubernetes.io/projected/d854c129-c4eb-4c08-a398-3549f4ff9047-kube-api-access-f6j9d\") pod \"infra-operator-controller-manager-7b9c774f96-tfkfx\" (UID: \"d854c129-c4eb-4c08-a398-3549f4ff9047\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-tfkfx" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.924896 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkrkj\" (UniqueName: \"kubernetes.io/projected/14e0c597-a515-4b44-908e-3737f385d7c3-kube-api-access-jkrkj\") pod \"heat-operator-controller-manager-67dd5f86f5-v4ngr\" (UID: \"14e0c597-a515-4b44-908e-3737f385d7c3\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-v4ngr" Mar 18 12:30:42 crc kubenswrapper[4975]: E0318 12:30:42.925282 4975 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 12:30:42 crc kubenswrapper[4975]: E0318 12:30:42.925478 4975 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/d854c129-c4eb-4c08-a398-3549f4ff9047-cert podName:d854c129-c4eb-4c08-a398-3549f4ff9047 nodeName:}" failed. No retries permitted until 2026-03-18 12:30:43.425429807 +0000 UTC m=+1229.139830386 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d854c129-c4eb-4c08-a398-3549f4ff9047-cert") pod "infra-operator-controller-manager-7b9c774f96-tfkfx" (UID: "d854c129-c4eb-4c08-a398-3549f4ff9047") : secret "infra-operator-webhook-server-cert" not found Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.926959 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-dp5dh" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.948283 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm2bt\" (UniqueName: \"kubernetes.io/projected/2c3688cf-2e2c-434c-88a7-10ac1a4949b2-kube-api-access-lm2bt\") pod \"horizon-operator-controller-manager-8464cc45fb-jthrm\" (UID: \"2c3688cf-2e2c-434c-88a7-10ac1a4949b2\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jthrm" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.950118 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dkmp\" (UniqueName: \"kubernetes.io/projected/1b122375-f65a-4f05-a738-41eab6a8fcd3-kube-api-access-5dkmp\") pod \"glance-operator-controller-manager-79df6bcc97-wjt2j\" (UID: \"1b122375-f65a-4f05-a738-41eab6a8fcd3\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wjt2j" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.952707 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkrkj\" (UniqueName: \"kubernetes.io/projected/14e0c597-a515-4b44-908e-3737f385d7c3-kube-api-access-jkrkj\") pod \"heat-operator-controller-manager-67dd5f86f5-v4ngr\" (UID: 
\"14e0c597-a515-4b44-908e-3737f385d7c3\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-v4ngr" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.954517 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-d7vwb"] Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.955458 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-d7vwb" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.961364 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-96rks" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.967692 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wjt2j" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.971866 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-tsmhh"] Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.976594 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tsmhh" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.980018 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-hpczw" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.984930 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-v4ngr" Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.986797 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-kmlsc"] Mar 18 12:30:42 crc kubenswrapper[4975]: I0318 12:30:42.987592 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6j9d\" (UniqueName: \"kubernetes.io/projected/d854c129-c4eb-4c08-a398-3549f4ff9047-kube-api-access-f6j9d\") pod \"infra-operator-controller-manager-7b9c774f96-tfkfx\" (UID: \"d854c129-c4eb-4c08-a398-3549f4ff9047\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-tfkfx" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:42.989026 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-kmlsc" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:42.996764 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-g5h8v" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:42.996949 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-9bhc7"] Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.000183 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-tsmhh"] Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.013731 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-d7vwb"] Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.037693 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr847\" (UniqueName: 
\"kubernetes.io/projected/4509daad-a22e-4801-891d-b0b8ea78ccb0-kube-api-access-sr847\") pod \"ironic-operator-controller-manager-6f787dddc9-hqgd2\" (UID: \"4509daad-a22e-4801-891d-b0b8ea78ccb0\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-hqgd2" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.037757 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5n2r\" (UniqueName: \"kubernetes.io/projected/dc261851-abff-4a1e-b20e-07d9c3bea942-kube-api-access-t5n2r\") pod \"manila-operator-controller-manager-55f864c847-qhnk6\" (UID: \"dc261851-abff-4a1e-b20e-07d9c3bea942\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-qhnk6" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.037803 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wzqt\" (UniqueName: \"kubernetes.io/projected/68cf7624-cb7b-45df-a18f-7dbfb9c20f6f-kube-api-access-9wzqt\") pod \"mariadb-operator-controller-manager-67ccfc9778-9bhc7\" (UID: \"68cf7624-cb7b-45df-a18f-7dbfb9c20f6f\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-9bhc7" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.037824 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf6db\" (UniqueName: \"kubernetes.io/projected/13b9a062-4728-4ce6-8d1b-0206bb73684e-kube-api-access-nf6db\") pod \"keystone-operator-controller-manager-768b96df4c-lcsc9\" (UID: \"13b9a062-4728-4ce6-8d1b-0206bb73684e\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-lcsc9" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.059342 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jthrm" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.060118 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-kmlsc"] Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.062632 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr847\" (UniqueName: \"kubernetes.io/projected/4509daad-a22e-4801-891d-b0b8ea78ccb0-kube-api-access-sr847\") pod \"ironic-operator-controller-manager-6f787dddc9-hqgd2\" (UID: \"4509daad-a22e-4801-891d-b0b8ea78ccb0\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-hqgd2" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.099419 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-mwbb7"] Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.107582 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-mwbb7" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.111325 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-99nf8" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.243288 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-m6rkz"] Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.244305 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-m6rkz" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.245978 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-jlnvb"] Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.247235 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-hqgd2" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.248817 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-fktkq" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.250552 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5n2r\" (UniqueName: \"kubernetes.io/projected/dc261851-abff-4a1e-b20e-07d9c3bea942-kube-api-access-t5n2r\") pod \"manila-operator-controller-manager-55f864c847-qhnk6\" (UID: \"dc261851-abff-4a1e-b20e-07d9c3bea942\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-qhnk6" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.250610 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wzqt\" (UniqueName: \"kubernetes.io/projected/68cf7624-cb7b-45df-a18f-7dbfb9c20f6f-kube-api-access-9wzqt\") pod \"mariadb-operator-controller-manager-67ccfc9778-9bhc7\" (UID: \"68cf7624-cb7b-45df-a18f-7dbfb9c20f6f\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-9bhc7" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.250637 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf6db\" (UniqueName: \"kubernetes.io/projected/13b9a062-4728-4ce6-8d1b-0206bb73684e-kube-api-access-nf6db\") pod \"keystone-operator-controller-manager-768b96df4c-lcsc9\" (UID: 
\"13b9a062-4728-4ce6-8d1b-0206bb73684e\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-lcsc9" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.250705 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhqwv\" (UniqueName: \"kubernetes.io/projected/7af89304-a07d-449c-9c16-97b829fa8290-kube-api-access-rhqwv\") pod \"octavia-operator-controller-manager-5b9f45d989-kmlsc\" (UID: \"7af89304-a07d-449c-9c16-97b829fa8290\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-kmlsc" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.250753 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl7tp\" (UniqueName: \"kubernetes.io/projected/3b4ea941-e8b1-47df-b33a-97dbf829cc24-kube-api-access-zl7tp\") pod \"nova-operator-controller-manager-5d488d59fb-tsmhh\" (UID: \"3b4ea941-e8b1-47df-b33a-97dbf829cc24\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tsmhh" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.250810 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fh2h\" (UniqueName: \"kubernetes.io/projected/aa02f9ee-8f7a-4880-b4c3-f2fcacd24967-kube-api-access-4fh2h\") pod \"neutron-operator-controller-manager-767865f676-d7vwb\" (UID: \"aa02f9ee-8f7a-4880-b4c3-f2fcacd24967\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-d7vwb" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.284947 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-lxkwk"] Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.285610 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-lxkwk" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.286110 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jlnvb" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.293461 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.294118 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5n2r\" (UniqueName: \"kubernetes.io/projected/dc261851-abff-4a1e-b20e-07d9c3bea942-kube-api-access-t5n2r\") pod \"manila-operator-controller-manager-55f864c847-qhnk6\" (UID: \"dc261851-abff-4a1e-b20e-07d9c3bea942\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-qhnk6" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.294296 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-mwbb7"] Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.294471 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf6db\" (UniqueName: \"kubernetes.io/projected/13b9a062-4728-4ce6-8d1b-0206bb73684e-kube-api-access-nf6db\") pod \"keystone-operator-controller-manager-768b96df4c-lcsc9\" (UID: \"13b9a062-4728-4ce6-8d1b-0206bb73684e\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-lcsc9" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.295836 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-4bbsl" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.299269 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wzqt\" 
(UniqueName: \"kubernetes.io/projected/68cf7624-cb7b-45df-a18f-7dbfb9c20f6f-kube-api-access-9wzqt\") pod \"mariadb-operator-controller-manager-67ccfc9778-9bhc7\" (UID: \"68cf7624-cb7b-45df-a18f-7dbfb9c20f6f\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-9bhc7" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.304649 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-h6759" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.304851 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-jlnvb"] Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.327830 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-lxkwk"] Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.351714 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qxq8\" (UniqueName: \"kubernetes.io/projected/e48eb7b2-9ce8-465c-9f05-91b4c55b4867-kube-api-access-4qxq8\") pod \"placement-operator-controller-manager-5784578c99-m6rkz\" (UID: \"e48eb7b2-9ce8-465c-9f05-91b4c55b4867\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-m6rkz" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.351826 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhqwv\" (UniqueName: \"kubernetes.io/projected/7af89304-a07d-449c-9c16-97b829fa8290-kube-api-access-rhqwv\") pod \"octavia-operator-controller-manager-5b9f45d989-kmlsc\" (UID: \"7af89304-a07d-449c-9c16-97b829fa8290\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-kmlsc" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.351856 4975 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmhd9\" (UniqueName: \"kubernetes.io/projected/37a66a0e-cc0b-4e2b-8b41-62c23ae539d9-kube-api-access-mmhd9\") pod \"ovn-operator-controller-manager-884679f54-mwbb7\" (UID: \"37a66a0e-cc0b-4e2b-8b41-62c23ae539d9\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-mwbb7" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.351917 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl7tp\" (UniqueName: \"kubernetes.io/projected/3b4ea941-e8b1-47df-b33a-97dbf829cc24-kube-api-access-zl7tp\") pod \"nova-operator-controller-manager-5d488d59fb-tsmhh\" (UID: \"3b4ea941-e8b1-47df-b33a-97dbf829cc24\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tsmhh" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.351974 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fh2h\" (UniqueName: \"kubernetes.io/projected/aa02f9ee-8f7a-4880-b4c3-f2fcacd24967-kube-api-access-4fh2h\") pod \"neutron-operator-controller-manager-767865f676-d7vwb\" (UID: \"aa02f9ee-8f7a-4880-b4c3-f2fcacd24967\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-d7vwb" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.374460 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-m6rkz"] Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.380430 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-xfr6h"] Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.381343 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-xfr6h" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.395714 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl7tp\" (UniqueName: \"kubernetes.io/projected/3b4ea941-e8b1-47df-b33a-97dbf829cc24-kube-api-access-zl7tp\") pod \"nova-operator-controller-manager-5d488d59fb-tsmhh\" (UID: \"3b4ea941-e8b1-47df-b33a-97dbf829cc24\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tsmhh" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.406999 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-xfr6h"] Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.422443 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-cm4wx" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.424060 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fh2h\" (UniqueName: \"kubernetes.io/projected/aa02f9ee-8f7a-4880-b4c3-f2fcacd24967-kube-api-access-4fh2h\") pod \"neutron-operator-controller-manager-767865f676-d7vwb\" (UID: \"aa02f9ee-8f7a-4880-b4c3-f2fcacd24967\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-d7vwb" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.424910 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-d2wsm"] Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.426163 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-d2wsm" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.432915 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-sjmgq" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.435158 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-d2wsm"] Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.462833 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2tvr\" (UniqueName: \"kubernetes.io/projected/b224d92b-1aed-47b5-8825-8f0b11da3092-kube-api-access-t2tvr\") pod \"swift-operator-controller-manager-c674c5965-jlnvb\" (UID: \"b224d92b-1aed-47b5-8825-8f0b11da3092\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-jlnvb" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.462949 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0542b387-d20c-41a1-81f3-1a11228e0a5c-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-lxkwk\" (UID: \"0542b387-d20c-41a1-81f3-1a11228e0a5c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-lxkwk" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.463001 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qxq8\" (UniqueName: \"kubernetes.io/projected/e48eb7b2-9ce8-465c-9f05-91b4c55b4867-kube-api-access-4qxq8\") pod \"placement-operator-controller-manager-5784578c99-m6rkz\" (UID: \"e48eb7b2-9ce8-465c-9f05-91b4c55b4867\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-m6rkz" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.463112 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mmhd9\" (UniqueName: \"kubernetes.io/projected/37a66a0e-cc0b-4e2b-8b41-62c23ae539d9-kube-api-access-mmhd9\") pod \"ovn-operator-controller-manager-884679f54-mwbb7\" (UID: \"37a66a0e-cc0b-4e2b-8b41-62c23ae539d9\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-mwbb7" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.463190 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d854c129-c4eb-4c08-a398-3549f4ff9047-cert\") pod \"infra-operator-controller-manager-7b9c774f96-tfkfx\" (UID: \"d854c129-c4eb-4c08-a398-3549f4ff9047\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-tfkfx" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.463243 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njztb\" (UniqueName: \"kubernetes.io/projected/0542b387-d20c-41a1-81f3-1a11228e0a5c-kube-api-access-njztb\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-lxkwk\" (UID: \"0542b387-d20c-41a1-81f3-1a11228e0a5c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-lxkwk" Mar 18 12:30:43 crc kubenswrapper[4975]: E0318 12:30:43.464064 4975 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 12:30:43 crc kubenswrapper[4975]: E0318 12:30:43.464135 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d854c129-c4eb-4c08-a398-3549f4ff9047-cert podName:d854c129-c4eb-4c08-a398-3549f4ff9047 nodeName:}" failed. No retries permitted until 2026-03-18 12:30:44.464114523 +0000 UTC m=+1230.178515112 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d854c129-c4eb-4c08-a398-3549f4ff9047-cert") pod "infra-operator-controller-manager-7b9c774f96-tfkfx" (UID: "d854c129-c4eb-4c08-a398-3549f4ff9047") : secret "infra-operator-webhook-server-cert" not found Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.484451 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-btfw4"] Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.485629 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-btfw4" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.492957 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-ngsmq" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.499507 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-btfw4"] Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.501470 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qxq8\" (UniqueName: \"kubernetes.io/projected/e48eb7b2-9ce8-465c-9f05-91b4c55b4867-kube-api-access-4qxq8\") pod \"placement-operator-controller-manager-5784578c99-m6rkz\" (UID: \"e48eb7b2-9ce8-465c-9f05-91b4c55b4867\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-m6rkz" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.503465 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-lcsc9" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.508759 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmhd9\" (UniqueName: \"kubernetes.io/projected/37a66a0e-cc0b-4e2b-8b41-62c23ae539d9-kube-api-access-mmhd9\") pod \"ovn-operator-controller-manager-884679f54-mwbb7\" (UID: \"37a66a0e-cc0b-4e2b-8b41-62c23ae539d9\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-mwbb7" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.527652 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw"] Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.528846 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.532943 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.533308 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.533736 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-9m7jc" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.544253 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-mwbb7" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.544960 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-qhnk6" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.570598 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcjk2\" (UniqueName: \"kubernetes.io/projected/475c84a8-c3d3-4bf8-91e0-244d5ffe1c9e-kube-api-access-rcjk2\") pod \"telemetry-operator-controller-manager-d6b694c5-xfr6h\" (UID: \"475c84a8-c3d3-4bf8-91e0-244d5ffe1c9e\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-xfr6h" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.571108 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njztb\" (UniqueName: \"kubernetes.io/projected/0542b387-d20c-41a1-81f3-1a11228e0a5c-kube-api-access-njztb\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-lxkwk\" (UID: \"0542b387-d20c-41a1-81f3-1a11228e0a5c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-lxkwk" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.571171 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2tvr\" (UniqueName: \"kubernetes.io/projected/b224d92b-1aed-47b5-8825-8f0b11da3092-kube-api-access-t2tvr\") pod \"swift-operator-controller-manager-c674c5965-jlnvb\" (UID: \"b224d92b-1aed-47b5-8825-8f0b11da3092\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-jlnvb" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.571206 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftfbf\" (UniqueName: \"kubernetes.io/projected/fc5647e0-697f-490e-9413-9fb2e63b22d8-kube-api-access-ftfbf\") pod \"test-operator-controller-manager-5c5cb9c4d7-d2wsm\" (UID: \"fc5647e0-697f-490e-9413-9fb2e63b22d8\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-d2wsm" Mar 
18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.571230 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0542b387-d20c-41a1-81f3-1a11228e0a5c-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-lxkwk\" (UID: \"0542b387-d20c-41a1-81f3-1a11228e0a5c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-lxkwk" Mar 18 12:30:43 crc kubenswrapper[4975]: E0318 12:30:43.571373 4975 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 12:30:43 crc kubenswrapper[4975]: E0318 12:30:43.571529 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0542b387-d20c-41a1-81f3-1a11228e0a5c-cert podName:0542b387-d20c-41a1-81f3-1a11228e0a5c nodeName:}" failed. No retries permitted until 2026-03-18 12:30:44.071512045 +0000 UTC m=+1229.785912624 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0542b387-d20c-41a1-81f3-1a11228e0a5c-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-lxkwk" (UID: "0542b387-d20c-41a1-81f3-1a11228e0a5c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.585439 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw"] Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.585699 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-9bhc7" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.586208 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhqwv\" (UniqueName: \"kubernetes.io/projected/7af89304-a07d-449c-9c16-97b829fa8290-kube-api-access-rhqwv\") pod \"octavia-operator-controller-manager-5b9f45d989-kmlsc\" (UID: \"7af89304-a07d-449c-9c16-97b829fa8290\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-kmlsc" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.621085 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njztb\" (UniqueName: \"kubernetes.io/projected/0542b387-d20c-41a1-81f3-1a11228e0a5c-kube-api-access-njztb\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-lxkwk\" (UID: \"0542b387-d20c-41a1-81f3-1a11228e0a5c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-lxkwk" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.621995 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2tvr\" (UniqueName: \"kubernetes.io/projected/b224d92b-1aed-47b5-8825-8f0b11da3092-kube-api-access-t2tvr\") pod \"swift-operator-controller-manager-c674c5965-jlnvb\" (UID: \"b224d92b-1aed-47b5-8825-8f0b11da3092\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-jlnvb" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.625103 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tsmhh" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.636447 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-d7vwb" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.649152 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-m6rkz" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.651187 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-kmlsc" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.674000 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8ss2\" (UniqueName: \"kubernetes.io/projected/fff94ef3-68c9-412e-adc7-c385a89a445f-kube-api-access-x8ss2\") pod \"watcher-operator-controller-manager-6c4d75f7f9-btfw4\" (UID: \"fff94ef3-68c9-412e-adc7-c385a89a445f\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-btfw4" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.674071 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-metrics-certs\") pod \"openstack-operator-controller-manager-76c5949666-qk6mw\" (UID: \"07e8604b-3e82-4a30-8f59-e240bd72d1a3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.674142 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-webhook-certs\") pod \"openstack-operator-controller-manager-76c5949666-qk6mw\" (UID: \"07e8604b-3e82-4a30-8f59-e240bd72d1a3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 
12:30:43.674183 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftfbf\" (UniqueName: \"kubernetes.io/projected/fc5647e0-697f-490e-9413-9fb2e63b22d8-kube-api-access-ftfbf\") pod \"test-operator-controller-manager-5c5cb9c4d7-d2wsm\" (UID: \"fc5647e0-697f-490e-9413-9fb2e63b22d8\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-d2wsm" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.674265 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfwpq\" (UniqueName: \"kubernetes.io/projected/07e8604b-3e82-4a30-8f59-e240bd72d1a3-kube-api-access-jfwpq\") pod \"openstack-operator-controller-manager-76c5949666-qk6mw\" (UID: \"07e8604b-3e82-4a30-8f59-e240bd72d1a3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.674351 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcjk2\" (UniqueName: \"kubernetes.io/projected/475c84a8-c3d3-4bf8-91e0-244d5ffe1c9e-kube-api-access-rcjk2\") pod \"telemetry-operator-controller-manager-d6b694c5-xfr6h\" (UID: \"475c84a8-c3d3-4bf8-91e0-244d5ffe1c9e\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-xfr6h" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.698418 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcjk2\" (UniqueName: \"kubernetes.io/projected/475c84a8-c3d3-4bf8-91e0-244d5ffe1c9e-kube-api-access-rcjk2\") pod \"telemetry-operator-controller-manager-d6b694c5-xfr6h\" (UID: \"475c84a8-c3d3-4bf8-91e0-244d5ffe1c9e\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-xfr6h" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.703741 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftfbf\" (UniqueName: 
\"kubernetes.io/projected/fc5647e0-697f-490e-9413-9fb2e63b22d8-kube-api-access-ftfbf\") pod \"test-operator-controller-manager-5c5cb9c4d7-d2wsm\" (UID: \"fc5647e0-697f-490e-9413-9fb2e63b22d8\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-d2wsm" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.703818 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c285h"] Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.704820 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c285h" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.706302 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c285h"] Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.707335 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-djvtf" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.775304 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8ss2\" (UniqueName: \"kubernetes.io/projected/fff94ef3-68c9-412e-adc7-c385a89a445f-kube-api-access-x8ss2\") pod \"watcher-operator-controller-manager-6c4d75f7f9-btfw4\" (UID: \"fff94ef3-68c9-412e-adc7-c385a89a445f\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-btfw4" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.775343 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-metrics-certs\") pod \"openstack-operator-controller-manager-76c5949666-qk6mw\" (UID: \"07e8604b-3e82-4a30-8f59-e240bd72d1a3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" 
Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.775381 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-webhook-certs\") pod \"openstack-operator-controller-manager-76c5949666-qk6mw\" (UID: \"07e8604b-3e82-4a30-8f59-e240bd72d1a3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.775420 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfwpq\" (UniqueName: \"kubernetes.io/projected/07e8604b-3e82-4a30-8f59-e240bd72d1a3-kube-api-access-jfwpq\") pod \"openstack-operator-controller-manager-76c5949666-qk6mw\" (UID: \"07e8604b-3e82-4a30-8f59-e240bd72d1a3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" Mar 18 12:30:43 crc kubenswrapper[4975]: E0318 12:30:43.775890 4975 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 12:30:43 crc kubenswrapper[4975]: E0318 12:30:43.775930 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-metrics-certs podName:07e8604b-3e82-4a30-8f59-e240bd72d1a3 nodeName:}" failed. No retries permitted until 2026-03-18 12:30:44.275916505 +0000 UTC m=+1229.990317084 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-metrics-certs") pod "openstack-operator-controller-manager-76c5949666-qk6mw" (UID: "07e8604b-3e82-4a30-8f59-e240bd72d1a3") : secret "metrics-server-cert" not found Mar 18 12:30:43 crc kubenswrapper[4975]: E0318 12:30:43.776096 4975 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 12:30:43 crc kubenswrapper[4975]: E0318 12:30:43.776123 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-webhook-certs podName:07e8604b-3e82-4a30-8f59-e240bd72d1a3 nodeName:}" failed. No retries permitted until 2026-03-18 12:30:44.276115701 +0000 UTC m=+1229.990516280 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-webhook-certs") pod "openstack-operator-controller-manager-76c5949666-qk6mw" (UID: "07e8604b-3e82-4a30-8f59-e240bd72d1a3") : secret "webhook-server-cert" not found Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.794286 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jlnvb" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.796559 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfwpq\" (UniqueName: \"kubernetes.io/projected/07e8604b-3e82-4a30-8f59-e240bd72d1a3-kube-api-access-jfwpq\") pod \"openstack-operator-controller-manager-76c5949666-qk6mw\" (UID: \"07e8604b-3e82-4a30-8f59-e240bd72d1a3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.826600 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-xfr6h" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.853076 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-d2wsm" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.877181 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxcbx\" (UniqueName: \"kubernetes.io/projected/fb3873ad-2d10-4df3-8198-2acd5c04b8c2-kube-api-access-nxcbx\") pod \"rabbitmq-cluster-operator-manager-668c99d594-c285h\" (UID: \"fb3873ad-2d10-4df3-8198-2acd5c04b8c2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c285h" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.983924 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxcbx\" (UniqueName: \"kubernetes.io/projected/fb3873ad-2d10-4df3-8198-2acd5c04b8c2-kube-api-access-nxcbx\") pod \"rabbitmq-cluster-operator-manager-668c99d594-c285h\" (UID: \"fb3873ad-2d10-4df3-8198-2acd5c04b8c2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c285h" Mar 18 12:30:43 crc kubenswrapper[4975]: I0318 12:30:43.988169 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8ss2\" (UniqueName: \"kubernetes.io/projected/fff94ef3-68c9-412e-adc7-c385a89a445f-kube-api-access-x8ss2\") pod \"watcher-operator-controller-manager-6c4d75f7f9-btfw4\" (UID: \"fff94ef3-68c9-412e-adc7-c385a89a445f\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-btfw4" Mar 18 12:30:44 crc kubenswrapper[4975]: I0318 12:30:44.012811 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxcbx\" (UniqueName: \"kubernetes.io/projected/fb3873ad-2d10-4df3-8198-2acd5c04b8c2-kube-api-access-nxcbx\") pod 
\"rabbitmq-cluster-operator-manager-668c99d594-c285h\" (UID: \"fb3873ad-2d10-4df3-8198-2acd5c04b8c2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c285h" Mar 18 12:30:44 crc kubenswrapper[4975]: I0318 12:30:44.087463 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0542b387-d20c-41a1-81f3-1a11228e0a5c-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-lxkwk\" (UID: \"0542b387-d20c-41a1-81f3-1a11228e0a5c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-lxkwk" Mar 18 12:30:44 crc kubenswrapper[4975]: E0318 12:30:44.090815 4975 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 12:30:44 crc kubenswrapper[4975]: E0318 12:30:44.091073 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0542b387-d20c-41a1-81f3-1a11228e0a5c-cert podName:0542b387-d20c-41a1-81f3-1a11228e0a5c nodeName:}" failed. No retries permitted until 2026-03-18 12:30:45.091025708 +0000 UTC m=+1230.805426287 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0542b387-d20c-41a1-81f3-1a11228e0a5c-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-lxkwk" (UID: "0542b387-d20c-41a1-81f3-1a11228e0a5c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 12:30:44 crc kubenswrapper[4975]: I0318 12:30:44.243093 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c285h" Mar 18 12:30:44 crc kubenswrapper[4975]: I0318 12:30:44.243392 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-btfw4" Mar 18 12:30:44 crc kubenswrapper[4975]: I0318 12:30:44.290972 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-metrics-certs\") pod \"openstack-operator-controller-manager-76c5949666-qk6mw\" (UID: \"07e8604b-3e82-4a30-8f59-e240bd72d1a3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" Mar 18 12:30:44 crc kubenswrapper[4975]: I0318 12:30:44.291038 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-webhook-certs\") pod \"openstack-operator-controller-manager-76c5949666-qk6mw\" (UID: \"07e8604b-3e82-4a30-8f59-e240bd72d1a3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" Mar 18 12:30:44 crc kubenswrapper[4975]: E0318 12:30:44.291189 4975 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 12:30:44 crc kubenswrapper[4975]: E0318 12:30:44.291236 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-webhook-certs podName:07e8604b-3e82-4a30-8f59-e240bd72d1a3 nodeName:}" failed. No retries permitted until 2026-03-18 12:30:45.291220713 +0000 UTC m=+1231.005621292 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-webhook-certs") pod "openstack-operator-controller-manager-76c5949666-qk6mw" (UID: "07e8604b-3e82-4a30-8f59-e240bd72d1a3") : secret "webhook-server-cert" not found Mar 18 12:30:44 crc kubenswrapper[4975]: E0318 12:30:44.291275 4975 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 12:30:44 crc kubenswrapper[4975]: E0318 12:30:44.291292 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-metrics-certs podName:07e8604b-3e82-4a30-8f59-e240bd72d1a3 nodeName:}" failed. No retries permitted until 2026-03-18 12:30:45.291286455 +0000 UTC m=+1231.005687034 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-metrics-certs") pod "openstack-operator-controller-manager-76c5949666-qk6mw" (UID: "07e8604b-3e82-4a30-8f59-e240bd72d1a3") : secret "metrics-server-cert" not found Mar 18 12:30:44 crc kubenswrapper[4975]: I0318 12:30:44.618930 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d854c129-c4eb-4c08-a398-3549f4ff9047-cert\") pod \"infra-operator-controller-manager-7b9c774f96-tfkfx\" (UID: \"d854c129-c4eb-4c08-a398-3549f4ff9047\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-tfkfx" Mar 18 12:30:44 crc kubenswrapper[4975]: E0318 12:30:44.619118 4975 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 12:30:44 crc kubenswrapper[4975]: E0318 12:30:44.619192 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d854c129-c4eb-4c08-a398-3549f4ff9047-cert 
podName:d854c129-c4eb-4c08-a398-3549f4ff9047 nodeName:}" failed. No retries permitted until 2026-03-18 12:30:46.619168135 +0000 UTC m=+1232.333568714 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d854c129-c4eb-4c08-a398-3549f4ff9047-cert") pod "infra-operator-controller-manager-7b9c774f96-tfkfx" (UID: "d854c129-c4eb-4c08-a398-3549f4ff9047") : secret "infra-operator-webhook-server-cert" not found Mar 18 12:30:45 crc kubenswrapper[4975]: I0318 12:30:45.141689 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0542b387-d20c-41a1-81f3-1a11228e0a5c-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-lxkwk\" (UID: \"0542b387-d20c-41a1-81f3-1a11228e0a5c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-lxkwk" Mar 18 12:30:45 crc kubenswrapper[4975]: E0318 12:30:45.142145 4975 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 12:30:45 crc kubenswrapper[4975]: E0318 12:30:45.142203 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0542b387-d20c-41a1-81f3-1a11228e0a5c-cert podName:0542b387-d20c-41a1-81f3-1a11228e0a5c nodeName:}" failed. No retries permitted until 2026-03-18 12:30:47.142184583 +0000 UTC m=+1232.856585162 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0542b387-d20c-41a1-81f3-1a11228e0a5c-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-lxkwk" (UID: "0542b387-d20c-41a1-81f3-1a11228e0a5c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 12:30:45 crc kubenswrapper[4975]: I0318 12:30:45.317983 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-9hd99"] Mar 18 12:30:45 crc kubenswrapper[4975]: I0318 12:30:45.339991 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-wjt2j"] Mar 18 12:30:45 crc kubenswrapper[4975]: I0318 12:30:45.345488 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-metrics-certs\") pod \"openstack-operator-controller-manager-76c5949666-qk6mw\" (UID: \"07e8604b-3e82-4a30-8f59-e240bd72d1a3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" Mar 18 12:30:45 crc kubenswrapper[4975]: I0318 12:30:45.345844 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-webhook-certs\") pod \"openstack-operator-controller-manager-76c5949666-qk6mw\" (UID: \"07e8604b-3e82-4a30-8f59-e240bd72d1a3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" Mar 18 12:30:45 crc kubenswrapper[4975]: E0318 12:30:45.346091 4975 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 12:30:45 crc kubenswrapper[4975]: E0318 12:30:45.346215 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-webhook-certs 
podName:07e8604b-3e82-4a30-8f59-e240bd72d1a3 nodeName:}" failed. No retries permitted until 2026-03-18 12:30:47.346125173 +0000 UTC m=+1233.060525752 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-webhook-certs") pod "openstack-operator-controller-manager-76c5949666-qk6mw" (UID: "07e8604b-3e82-4a30-8f59-e240bd72d1a3") : secret "webhook-server-cert" not found Mar 18 12:30:45 crc kubenswrapper[4975]: E0318 12:30:45.346587 4975 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 12:30:45 crc kubenswrapper[4975]: E0318 12:30:45.346677 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-metrics-certs podName:07e8604b-3e82-4a30-8f59-e240bd72d1a3 nodeName:}" failed. No retries permitted until 2026-03-18 12:30:47.346663977 +0000 UTC m=+1233.061064556 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-metrics-certs") pod "openstack-operator-controller-manager-76c5949666-qk6mw" (UID: "07e8604b-3e82-4a30-8f59-e240bd72d1a3") : secret "metrics-server-cert" not found Mar 18 12:30:45 crc kubenswrapper[4975]: W0318 12:30:45.347336 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b122375_f65a_4f05_a738_41eab6a8fcd3.slice/crio-a25b1d57ce96a67c57031685a944af21cf4b61d2fdfc0490b04a1d5bf718bd87 WatchSource:0}: Error finding container a25b1d57ce96a67c57031685a944af21cf4b61d2fdfc0490b04a1d5bf718bd87: Status 404 returned error can't find the container with id a25b1d57ce96a67c57031685a944af21cf4b61d2fdfc0490b04a1d5bf718bd87 Mar 18 12:30:45 crc kubenswrapper[4975]: I0318 12:30:45.353206 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-dp5dh"] Mar 18 12:30:45 crc kubenswrapper[4975]: W0318 12:30:45.355458 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod675b1757_67bf_4d6d_9947_31a4da13a1be.slice/crio-457c66a685e0a381725d7cc5eecc8284e0bc5d2de098155f99d66c752961f620 WatchSource:0}: Error finding container 457c66a685e0a381725d7cc5eecc8284e0bc5d2de098155f99d66c752961f620: Status 404 returned error can't find the container with id 457c66a685e0a381725d7cc5eecc8284e0bc5d2de098155f99d66c752961f620 Mar 18 12:30:45 crc kubenswrapper[4975]: I0318 12:30:45.397333 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-v4ngr"] Mar 18 12:30:45 crc kubenswrapper[4975]: I0318 12:30:45.457700 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-7lzk9"] Mar 18 12:30:45 crc 
kubenswrapper[4975]: I0318 12:30:45.462520 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-dp5dh" event={"ID":"675b1757-67bf-4d6d-9947-31a4da13a1be","Type":"ContainerStarted","Data":"457c66a685e0a381725d7cc5eecc8284e0bc5d2de098155f99d66c752961f620"} Mar 18 12:30:45 crc kubenswrapper[4975]: I0318 12:30:45.464149 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-9hd99" event={"ID":"4b5bf68a-b7e7-4df7-ab69-2b8eac9c1091","Type":"ContainerStarted","Data":"5d92685b850e728723878fb3c5ea1217e423538d5a8f20a43a685300879557a5"} Mar 18 12:30:45 crc kubenswrapper[4975]: I0318 12:30:45.465333 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-v4ngr" event={"ID":"14e0c597-a515-4b44-908e-3737f385d7c3","Type":"ContainerStarted","Data":"fdcd9f4c8d50708b7c9b451ae429d182ec11f9d7d66883942709b0c9c1695ffa"} Mar 18 12:30:45 crc kubenswrapper[4975]: I0318 12:30:45.466427 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wjt2j" event={"ID":"1b122375-f65a-4f05-a738-41eab6a8fcd3","Type":"ContainerStarted","Data":"a25b1d57ce96a67c57031685a944af21cf4b61d2fdfc0490b04a1d5bf718bd87"} Mar 18 12:30:45 crc kubenswrapper[4975]: W0318 12:30:45.471490 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84bd8990_a50a_4fb2_88d3_e3141ef24b7d.slice/crio-a47e9aedd2a7db1e1e4e57804725bcec8642c4fc7879ca3326cc9eff4b5ca5cd WatchSource:0}: Error finding container a47e9aedd2a7db1e1e4e57804725bcec8642c4fc7879ca3326cc9eff4b5ca5cd: Status 404 returned error can't find the container with id a47e9aedd2a7db1e1e4e57804725bcec8642c4fc7879ca3326cc9eff4b5ca5cd Mar 18 12:30:45 crc kubenswrapper[4975]: I0318 12:30:45.540781 4975 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-jthrm"] Mar 18 12:30:45 crc kubenswrapper[4975]: I0318 12:30:45.545973 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-lcsc9"] Mar 18 12:30:45 crc kubenswrapper[4975]: W0318 12:30:45.546375 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c3688cf_2e2c_434c_88a7_10ac1a4949b2.slice/crio-599ece486d61fc64f446969f9591b8f87368517c989396008f40ba5af43bc241 WatchSource:0}: Error finding container 599ece486d61fc64f446969f9591b8f87368517c989396008f40ba5af43bc241: Status 404 returned error can't find the container with id 599ece486d61fc64f446969f9591b8f87368517c989396008f40ba5af43bc241 Mar 18 12:30:45 crc kubenswrapper[4975]: W0318 12:30:45.550098 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode48eb7b2_9ce8_465c_9f05_91b4c55b4867.slice/crio-ea00cec2962cb8b2da927d5f7d89af25b98a0036f2d0ae7615340edcfb460343 WatchSource:0}: Error finding container ea00cec2962cb8b2da927d5f7d89af25b98a0036f2d0ae7615340edcfb460343: Status 404 returned error can't find the container with id ea00cec2962cb8b2da927d5f7d89af25b98a0036f2d0ae7615340edcfb460343 Mar 18 12:30:45 crc kubenswrapper[4975]: I0318 12:30:45.553025 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-m6rkz"] Mar 18 12:30:45 crc kubenswrapper[4975]: I0318 12:30:45.564923 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-jlnvb"] Mar 18 12:30:45 crc kubenswrapper[4975]: I0318 12:30:45.569530 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-qhnk6"] Mar 18 12:30:45 
crc kubenswrapper[4975]: I0318 12:30:45.583896 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-hqgd2"] Mar 18 12:30:45 crc kubenswrapper[4975]: W0318 12:30:45.591273 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4509daad_a22e_4801_891d_b0b8ea78ccb0.slice/crio-029d93091ae0834e7bac4d3e4f496b0e75c8b3a78e4ac7c07f773a999c976322 WatchSource:0}: Error finding container 029d93091ae0834e7bac4d3e4f496b0e75c8b3a78e4ac7c07f773a999c976322: Status 404 returned error can't find the container with id 029d93091ae0834e7bac4d3e4f496b0e75c8b3a78e4ac7c07f773a999c976322 Mar 18 12:30:45 crc kubenswrapper[4975]: I0318 12:30:45.706720 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-tsmhh"] Mar 18 12:30:45 crc kubenswrapper[4975]: I0318 12:30:45.715452 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-kmlsc"] Mar 18 12:30:45 crc kubenswrapper[4975]: I0318 12:30:45.736912 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-d2wsm"] Mar 18 12:30:45 crc kubenswrapper[4975]: I0318 12:30:45.745230 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-d7vwb"] Mar 18 12:30:45 crc kubenswrapper[4975]: I0318 12:30:45.752009 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-9bhc7"] Mar 18 12:30:45 crc kubenswrapper[4975]: E0318 12:30:45.763302 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9wzqt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67ccfc9778-9bhc7_openstack-operators(68cf7624-cb7b-45df-a18f-7dbfb9c20f6f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 12:30:45 crc kubenswrapper[4975]: E0318 12:30:45.764064 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rhqwv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5b9f45d989-kmlsc_openstack-operators(7af89304-a07d-449c-9c16-97b829fa8290): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 12:30:45 crc kubenswrapper[4975]: E0318 12:30:45.765149 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-9bhc7" podUID="68cf7624-cb7b-45df-a18f-7dbfb9c20f6f" Mar 18 12:30:45 crc 
kubenswrapper[4975]: E0318 12:30:45.765203 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-kmlsc" podUID="7af89304-a07d-449c-9c16-97b829fa8290" Mar 18 12:30:45 crc kubenswrapper[4975]: E0318 12:30:45.765550 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4fh2h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-767865f676-d7vwb_openstack-operators(aa02f9ee-8f7a-4880-b4c3-f2fcacd24967): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 12:30:45 crc kubenswrapper[4975]: E0318 12:30:45.766340 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ftfbf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-d2wsm_openstack-operators(fc5647e0-697f-490e-9413-9fb2e63b22d8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 12:30:45 crc kubenswrapper[4975]: E0318 12:30:45.766690 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-d7vwb" podUID="aa02f9ee-8f7a-4880-b4c3-f2fcacd24967" Mar 18 12:30:45 crc 
kubenswrapper[4975]: E0318 12:30:45.767792 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-d2wsm" podUID="fc5647e0-697f-490e-9413-9fb2e63b22d8" Mar 18 12:30:45 crc kubenswrapper[4975]: I0318 12:30:45.842226 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c285h"] Mar 18 12:30:45 crc kubenswrapper[4975]: I0318 12:30:45.860314 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-btfw4"] Mar 18 12:30:45 crc kubenswrapper[4975]: E0318 12:30:45.868488 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nxcbx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-c285h_openstack-operators(fb3873ad-2d10-4df3-8198-2acd5c04b8c2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 12:30:45 crc kubenswrapper[4975]: I0318 12:30:45.869938 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-mwbb7"] Mar 18 12:30:45 crc kubenswrapper[4975]: E0318 12:30:45.870004 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c285h" podUID="fb3873ad-2d10-4df3-8198-2acd5c04b8c2" Mar 18 12:30:45 crc kubenswrapper[4975]: I0318 12:30:45.878030 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-xfr6h"] Mar 18 12:30:45 crc kubenswrapper[4975]: E0318 12:30:45.880762 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x8ss2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-btfw4_openstack-operators(fff94ef3-68c9-412e-adc7-c385a89a445f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 12:30:45 crc kubenswrapper[4975]: W0318 12:30:45.881717 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod475c84a8_c3d3_4bf8_91e0_244d5ffe1c9e.slice/crio-3ddca410789246c812f60cf6c58cab20b56e0d9c1c563ffe763d92d3e2befc30 WatchSource:0}: Error finding container 3ddca410789246c812f60cf6c58cab20b56e0d9c1c563ffe763d92d3e2befc30: Status 404 returned error can't find the container with id 3ddca410789246c812f60cf6c58cab20b56e0d9c1c563ffe763d92d3e2befc30 Mar 18 12:30:45 crc kubenswrapper[4975]: E0318 12:30:45.882410 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-btfw4" podUID="fff94ef3-68c9-412e-adc7-c385a89a445f" Mar 18 12:30:45 crc kubenswrapper[4975]: E0318 12:30:45.883992 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rcjk2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-xfr6h_openstack-operators(475c84a8-c3d3-4bf8-91e0-244d5ffe1c9e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 12:30:45 crc kubenswrapper[4975]: E0318 12:30:45.885210 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-xfr6h" podUID="475c84a8-c3d3-4bf8-91e0-244d5ffe1c9e" Mar 18 12:30:45 crc kubenswrapper[4975]: W0318 12:30:45.886568 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a66a0e_cc0b_4e2b_8b41_62c23ae539d9.slice/crio-15d738241cdf23ae7d917efb22a09ce7e0da1a8e3868c57e467392e2e493cde8 WatchSource:0}: Error finding container 15d738241cdf23ae7d917efb22a09ce7e0da1a8e3868c57e467392e2e493cde8: Status 404 returned error can't find the container with id 15d738241cdf23ae7d917efb22a09ce7e0da1a8e3868c57e467392e2e493cde8 Mar 18 12:30:45 crc kubenswrapper[4975]: E0318 12:30:45.888682 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mmhd9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-884679f54-mwbb7_openstack-operators(37a66a0e-cc0b-4e2b-8b41-62c23ae539d9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 12:30:45 crc kubenswrapper[4975]: E0318 12:30:45.889940 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-mwbb7" podUID="37a66a0e-cc0b-4e2b-8b41-62c23ae539d9" Mar 18 12:30:46 crc kubenswrapper[4975]: I0318 12:30:46.479735 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-kmlsc" event={"ID":"7af89304-a07d-449c-9c16-97b829fa8290","Type":"ContainerStarted","Data":"b82f0d97faab0b93c1b824d3f67221308950ab4ec5ae19de92e76b4177f2dff2"} Mar 18 12:30:46 crc kubenswrapper[4975]: I0318 12:30:46.482602 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-d2wsm" event={"ID":"fc5647e0-697f-490e-9413-9fb2e63b22d8","Type":"ContainerStarted","Data":"3d15f24bb4639c25646e455099cb1f37f1671116c2e4791ba9086ff631602608"} 
Mar 18 12:30:46 crc kubenswrapper[4975]: E0318 12:30:46.484264 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-d2wsm" podUID="fc5647e0-697f-490e-9413-9fb2e63b22d8" Mar 18 12:30:46 crc kubenswrapper[4975]: E0318 12:30:46.484542 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-kmlsc" podUID="7af89304-a07d-449c-9c16-97b829fa8290" Mar 18 12:30:46 crc kubenswrapper[4975]: I0318 12:30:46.487446 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tsmhh" event={"ID":"3b4ea941-e8b1-47df-b33a-97dbf829cc24","Type":"ContainerStarted","Data":"71bf2e322166259e33a1c63719ece56dcb83adb57310f64f2723ccc620e333e9"} Mar 18 12:30:46 crc kubenswrapper[4975]: I0318 12:30:46.488962 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-7lzk9" event={"ID":"84bd8990-a50a-4fb2-88d3-e3141ef24b7d","Type":"ContainerStarted","Data":"a47e9aedd2a7db1e1e4e57804725bcec8642c4fc7879ca3326cc9eff4b5ca5cd"} Mar 18 12:30:46 crc kubenswrapper[4975]: I0318 12:30:46.523348 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-btfw4" event={"ID":"fff94ef3-68c9-412e-adc7-c385a89a445f","Type":"ContainerStarted","Data":"67b3988e935177e96c45bbe51fcf485c5295599258941808870807cb4f1732a0"} Mar 18 12:30:46 
crc kubenswrapper[4975]: I0318 12:30:46.524765 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jthrm" event={"ID":"2c3688cf-2e2c-434c-88a7-10ac1a4949b2","Type":"ContainerStarted","Data":"599ece486d61fc64f446969f9591b8f87368517c989396008f40ba5af43bc241"} Mar 18 12:30:46 crc kubenswrapper[4975]: E0318 12:30:46.525888 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-btfw4" podUID="fff94ef3-68c9-412e-adc7-c385a89a445f" Mar 18 12:30:46 crc kubenswrapper[4975]: I0318 12:30:46.526419 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-mwbb7" event={"ID":"37a66a0e-cc0b-4e2b-8b41-62c23ae539d9","Type":"ContainerStarted","Data":"15d738241cdf23ae7d917efb22a09ce7e0da1a8e3868c57e467392e2e493cde8"} Mar 18 12:30:46 crc kubenswrapper[4975]: E0318 12:30:46.529505 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-mwbb7" podUID="37a66a0e-cc0b-4e2b-8b41-62c23ae539d9" Mar 18 12:30:46 crc kubenswrapper[4975]: I0318 12:30:46.533401 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-qhnk6" event={"ID":"dc261851-abff-4a1e-b20e-07d9c3bea942","Type":"ContainerStarted","Data":"d74848a83423dc325ab750e865c3019be905c133e2275fcbeae097a595caea28"} Mar 18 12:30:46 crc 
kubenswrapper[4975]: I0318 12:30:46.542542 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-hqgd2" event={"ID":"4509daad-a22e-4801-891d-b0b8ea78ccb0","Type":"ContainerStarted","Data":"029d93091ae0834e7bac4d3e4f496b0e75c8b3a78e4ac7c07f773a999c976322"} Mar 18 12:30:46 crc kubenswrapper[4975]: I0318 12:30:46.544932 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-d7vwb" event={"ID":"aa02f9ee-8f7a-4880-b4c3-f2fcacd24967","Type":"ContainerStarted","Data":"e1907bd512dd59368a27090c21aae7b8e039e4ec0548783edf2959b76f1e59a5"} Mar 18 12:30:46 crc kubenswrapper[4975]: I0318 12:30:46.546849 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-m6rkz" event={"ID":"e48eb7b2-9ce8-465c-9f05-91b4c55b4867","Type":"ContainerStarted","Data":"ea00cec2962cb8b2da927d5f7d89af25b98a0036f2d0ae7615340edcfb460343"} Mar 18 12:30:46 crc kubenswrapper[4975]: E0318 12:30:46.547009 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-d7vwb" podUID="aa02f9ee-8f7a-4880-b4c3-f2fcacd24967" Mar 18 12:30:46 crc kubenswrapper[4975]: I0318 12:30:46.550173 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jlnvb" event={"ID":"b224d92b-1aed-47b5-8825-8f0b11da3092","Type":"ContainerStarted","Data":"23b06ce5286484743239d929aaa602f36df7a54d2b4ed96aca8589bcaf8970d9"} Mar 18 12:30:46 crc kubenswrapper[4975]: I0318 12:30:46.552433 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-lcsc9" event={"ID":"13b9a062-4728-4ce6-8d1b-0206bb73684e","Type":"ContainerStarted","Data":"1f13d05b569df9e60aba68d29caca255a3105989fe718359353f67c1d010cb02"} Mar 18 12:30:46 crc kubenswrapper[4975]: I0318 12:30:46.555232 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-xfr6h" event={"ID":"475c84a8-c3d3-4bf8-91e0-244d5ffe1c9e","Type":"ContainerStarted","Data":"3ddca410789246c812f60cf6c58cab20b56e0d9c1c563ffe763d92d3e2befc30"} Mar 18 12:30:46 crc kubenswrapper[4975]: E0318 12:30:46.558964 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-xfr6h" podUID="475c84a8-c3d3-4bf8-91e0-244d5ffe1c9e" Mar 18 12:30:46 crc kubenswrapper[4975]: I0318 12:30:46.559968 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c285h" event={"ID":"fb3873ad-2d10-4df3-8198-2acd5c04b8c2","Type":"ContainerStarted","Data":"e52f77f67f0c05096d7cfdd41a4f8437300a8ad0f21b2f4ffdf2715cb9812977"} Mar 18 12:30:46 crc kubenswrapper[4975]: E0318 12:30:46.561435 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c285h" podUID="fb3873ad-2d10-4df3-8198-2acd5c04b8c2" Mar 18 12:30:46 crc kubenswrapper[4975]: I0318 12:30:46.562796 4975 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-9bhc7" event={"ID":"68cf7624-cb7b-45df-a18f-7dbfb9c20f6f","Type":"ContainerStarted","Data":"285d748f11c95d79a7ec2178c467802e9497895b006ba408f44bf62a388f3421"} Mar 18 12:30:46 crc kubenswrapper[4975]: E0318 12:30:46.567083 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-9bhc7" podUID="68cf7624-cb7b-45df-a18f-7dbfb9c20f6f" Mar 18 12:30:46 crc kubenswrapper[4975]: I0318 12:30:46.669369 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d854c129-c4eb-4c08-a398-3549f4ff9047-cert\") pod \"infra-operator-controller-manager-7b9c774f96-tfkfx\" (UID: \"d854c129-c4eb-4c08-a398-3549f4ff9047\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-tfkfx" Mar 18 12:30:46 crc kubenswrapper[4975]: E0318 12:30:46.672237 4975 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 12:30:46 crc kubenswrapper[4975]: E0318 12:30:46.672329 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d854c129-c4eb-4c08-a398-3549f4ff9047-cert podName:d854c129-c4eb-4c08-a398-3549f4ff9047 nodeName:}" failed. No retries permitted until 2026-03-18 12:30:50.672268469 +0000 UTC m=+1236.386669048 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d854c129-c4eb-4c08-a398-3549f4ff9047-cert") pod "infra-operator-controller-manager-7b9c774f96-tfkfx" (UID: "d854c129-c4eb-4c08-a398-3549f4ff9047") : secret "infra-operator-webhook-server-cert" not found Mar 18 12:30:47 crc kubenswrapper[4975]: I0318 12:30:47.189063 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0542b387-d20c-41a1-81f3-1a11228e0a5c-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-lxkwk\" (UID: \"0542b387-d20c-41a1-81f3-1a11228e0a5c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-lxkwk" Mar 18 12:30:47 crc kubenswrapper[4975]: E0318 12:30:47.189473 4975 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 12:30:47 crc kubenswrapper[4975]: E0318 12:30:47.189549 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0542b387-d20c-41a1-81f3-1a11228e0a5c-cert podName:0542b387-d20c-41a1-81f3-1a11228e0a5c nodeName:}" failed. No retries permitted until 2026-03-18 12:30:51.189523296 +0000 UTC m=+1236.903923875 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0542b387-d20c-41a1-81f3-1a11228e0a5c-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-lxkwk" (UID: "0542b387-d20c-41a1-81f3-1a11228e0a5c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 12:30:47 crc kubenswrapper[4975]: I0318 12:30:47.391713 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-metrics-certs\") pod \"openstack-operator-controller-manager-76c5949666-qk6mw\" (UID: \"07e8604b-3e82-4a30-8f59-e240bd72d1a3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" Mar 18 12:30:47 crc kubenswrapper[4975]: I0318 12:30:47.391821 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-webhook-certs\") pod \"openstack-operator-controller-manager-76c5949666-qk6mw\" (UID: \"07e8604b-3e82-4a30-8f59-e240bd72d1a3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" Mar 18 12:30:47 crc kubenswrapper[4975]: E0318 12:30:47.392089 4975 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 12:30:47 crc kubenswrapper[4975]: E0318 12:30:47.392174 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-webhook-certs podName:07e8604b-3e82-4a30-8f59-e240bd72d1a3 nodeName:}" failed. No retries permitted until 2026-03-18 12:30:51.392146441 +0000 UTC m=+1237.106547020 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-webhook-certs") pod "openstack-operator-controller-manager-76c5949666-qk6mw" (UID: "07e8604b-3e82-4a30-8f59-e240bd72d1a3") : secret "webhook-server-cert" not found Mar 18 12:30:47 crc kubenswrapper[4975]: E0318 12:30:47.392251 4975 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 12:30:47 crc kubenswrapper[4975]: E0318 12:30:47.392295 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-metrics-certs podName:07e8604b-3e82-4a30-8f59-e240bd72d1a3 nodeName:}" failed. No retries permitted until 2026-03-18 12:30:51.392283125 +0000 UTC m=+1237.106683704 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-metrics-certs") pod "openstack-operator-controller-manager-76c5949666-qk6mw" (UID: "07e8604b-3e82-4a30-8f59-e240bd72d1a3") : secret "metrics-server-cert" not found Mar 18 12:30:47 crc kubenswrapper[4975]: E0318 12:30:47.608412 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-kmlsc" podUID="7af89304-a07d-449c-9c16-97b829fa8290" Mar 18 12:30:47 crc kubenswrapper[4975]: E0318 12:30:47.608611 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" 
pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-xfr6h" podUID="475c84a8-c3d3-4bf8-91e0-244d5ffe1c9e" Mar 18 12:30:47 crc kubenswrapper[4975]: E0318 12:30:47.608740 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-btfw4" podUID="fff94ef3-68c9-412e-adc7-c385a89a445f" Mar 18 12:30:47 crc kubenswrapper[4975]: E0318 12:30:47.608817 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-d2wsm" podUID="fc5647e0-697f-490e-9413-9fb2e63b22d8" Mar 18 12:30:47 crc kubenswrapper[4975]: E0318 12:30:47.608932 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c285h" podUID="fb3873ad-2d10-4df3-8198-2acd5c04b8c2" Mar 18 12:30:47 crc kubenswrapper[4975]: E0318 12:30:47.609001 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-9bhc7" 
podUID="68cf7624-cb7b-45df-a18f-7dbfb9c20f6f" Mar 18 12:30:47 crc kubenswrapper[4975]: E0318 12:30:47.609063 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-mwbb7" podUID="37a66a0e-cc0b-4e2b-8b41-62c23ae539d9" Mar 18 12:30:47 crc kubenswrapper[4975]: E0318 12:30:47.609759 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-d7vwb" podUID="aa02f9ee-8f7a-4880-b4c3-f2fcacd24967" Mar 18 12:30:50 crc kubenswrapper[4975]: I0318 12:30:50.743753 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d854c129-c4eb-4c08-a398-3549f4ff9047-cert\") pod \"infra-operator-controller-manager-7b9c774f96-tfkfx\" (UID: \"d854c129-c4eb-4c08-a398-3549f4ff9047\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-tfkfx" Mar 18 12:30:50 crc kubenswrapper[4975]: E0318 12:30:50.743894 4975 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 12:30:50 crc kubenswrapper[4975]: E0318 12:30:50.744238 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d854c129-c4eb-4c08-a398-3549f4ff9047-cert podName:d854c129-c4eb-4c08-a398-3549f4ff9047 nodeName:}" failed. No retries permitted until 2026-03-18 12:30:58.744217274 +0000 UTC m=+1244.458617853 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d854c129-c4eb-4c08-a398-3549f4ff9047-cert") pod "infra-operator-controller-manager-7b9c774f96-tfkfx" (UID: "d854c129-c4eb-4c08-a398-3549f4ff9047") : secret "infra-operator-webhook-server-cert" not found Mar 18 12:30:51 crc kubenswrapper[4975]: I0318 12:30:51.249715 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0542b387-d20c-41a1-81f3-1a11228e0a5c-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-lxkwk\" (UID: \"0542b387-d20c-41a1-81f3-1a11228e0a5c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-lxkwk" Mar 18 12:30:51 crc kubenswrapper[4975]: E0318 12:30:51.249915 4975 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 12:30:51 crc kubenswrapper[4975]: E0318 12:30:51.249968 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0542b387-d20c-41a1-81f3-1a11228e0a5c-cert podName:0542b387-d20c-41a1-81f3-1a11228e0a5c nodeName:}" failed. No retries permitted until 2026-03-18 12:30:59.249952436 +0000 UTC m=+1244.964353015 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0542b387-d20c-41a1-81f3-1a11228e0a5c-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-lxkwk" (UID: "0542b387-d20c-41a1-81f3-1a11228e0a5c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 12:30:51 crc kubenswrapper[4975]: I0318 12:30:51.451855 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-webhook-certs\") pod \"openstack-operator-controller-manager-76c5949666-qk6mw\" (UID: \"07e8604b-3e82-4a30-8f59-e240bd72d1a3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" Mar 18 12:30:51 crc kubenswrapper[4975]: E0318 12:30:51.451978 4975 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 12:30:51 crc kubenswrapper[4975]: I0318 12:30:51.452028 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-metrics-certs\") pod \"openstack-operator-controller-manager-76c5949666-qk6mw\" (UID: \"07e8604b-3e82-4a30-8f59-e240bd72d1a3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" Mar 18 12:30:51 crc kubenswrapper[4975]: E0318 12:30:51.452040 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-webhook-certs podName:07e8604b-3e82-4a30-8f59-e240bd72d1a3 nodeName:}" failed. No retries permitted until 2026-03-18 12:30:59.452024266 +0000 UTC m=+1245.166424845 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-webhook-certs") pod "openstack-operator-controller-manager-76c5949666-qk6mw" (UID: "07e8604b-3e82-4a30-8f59-e240bd72d1a3") : secret "webhook-server-cert" not found Mar 18 12:30:51 crc kubenswrapper[4975]: E0318 12:30:51.452140 4975 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 12:30:51 crc kubenswrapper[4975]: E0318 12:30:51.452175 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-metrics-certs podName:07e8604b-3e82-4a30-8f59-e240bd72d1a3 nodeName:}" failed. No retries permitted until 2026-03-18 12:30:59.45216375 +0000 UTC m=+1245.166564329 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-metrics-certs") pod "openstack-operator-controller-manager-76c5949666-qk6mw" (UID: "07e8604b-3e82-4a30-8f59-e240bd72d1a3") : secret "metrics-server-cert" not found Mar 18 12:30:58 crc kubenswrapper[4975]: E0318 12:30:58.168780 4975 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da" Mar 18 12:30:58 crc kubenswrapper[4975]: E0318 12:30:58.169632 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t5n2r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-55f864c847-qhnk6_openstack-operators(dc261851-abff-4a1e-b20e-07d9c3bea942): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:30:58 crc kubenswrapper[4975]: E0318 12:30:58.172127 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-qhnk6" podUID="dc261851-abff-4a1e-b20e-07d9c3bea942" Mar 18 12:30:58 crc kubenswrapper[4975]: E0318 12:30:58.721991 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da\\\"\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-qhnk6" podUID="dc261851-abff-4a1e-b20e-07d9c3bea942" Mar 18 12:30:58 crc kubenswrapper[4975]: I0318 12:30:58.818340 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d854c129-c4eb-4c08-a398-3549f4ff9047-cert\") pod \"infra-operator-controller-manager-7b9c774f96-tfkfx\" (UID: \"d854c129-c4eb-4c08-a398-3549f4ff9047\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-tfkfx" Mar 18 12:30:58 crc kubenswrapper[4975]: E0318 12:30:58.818499 4975 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 12:30:58 crc kubenswrapper[4975]: E0318 12:30:58.818554 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d854c129-c4eb-4c08-a398-3549f4ff9047-cert podName:d854c129-c4eb-4c08-a398-3549f4ff9047 nodeName:}" failed. No retries permitted until 2026-03-18 12:31:14.818537611 +0000 UTC m=+1260.532938190 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d854c129-c4eb-4c08-a398-3549f4ff9047-cert") pod "infra-operator-controller-manager-7b9c774f96-tfkfx" (UID: "d854c129-c4eb-4c08-a398-3549f4ff9047") : secret "infra-operator-webhook-server-cert" not found Mar 18 12:30:59 crc kubenswrapper[4975]: E0318 12:30:59.190157 4975 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113" Mar 18 12:30:59 crc kubenswrapper[4975]: E0318 12:30:59.190357 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lm2bt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-8464cc45fb-jthrm_openstack-operators(2c3688cf-2e2c-434c-88a7-10ac1a4949b2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:30:59 crc kubenswrapper[4975]: E0318 12:30:59.191527 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jthrm" podUID="2c3688cf-2e2c-434c-88a7-10ac1a4949b2" Mar 18 12:30:59 crc kubenswrapper[4975]: I0318 12:30:59.327252 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0542b387-d20c-41a1-81f3-1a11228e0a5c-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-lxkwk\" (UID: \"0542b387-d20c-41a1-81f3-1a11228e0a5c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-lxkwk" Mar 18 12:30:59 crc kubenswrapper[4975]: I0318 12:30:59.339203 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/0542b387-d20c-41a1-81f3-1a11228e0a5c-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-lxkwk\" (UID: \"0542b387-d20c-41a1-81f3-1a11228e0a5c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-lxkwk" Mar 18 12:30:59 crc kubenswrapper[4975]: I0318 12:30:59.360500 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-lxkwk" Mar 18 12:30:59 crc kubenswrapper[4975]: I0318 12:30:59.528859 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-metrics-certs\") pod \"openstack-operator-controller-manager-76c5949666-qk6mw\" (UID: \"07e8604b-3e82-4a30-8f59-e240bd72d1a3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" Mar 18 12:30:59 crc kubenswrapper[4975]: I0318 12:30:59.528965 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-webhook-certs\") pod \"openstack-operator-controller-manager-76c5949666-qk6mw\" (UID: \"07e8604b-3e82-4a30-8f59-e240bd72d1a3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" Mar 18 12:30:59 crc kubenswrapper[4975]: E0318 12:30:59.528984 4975 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 12:30:59 crc kubenswrapper[4975]: E0318 12:30:59.529046 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-metrics-certs podName:07e8604b-3e82-4a30-8f59-e240bd72d1a3 nodeName:}" failed. No retries permitted until 2026-03-18 12:31:15.529026747 +0000 UTC m=+1261.243427326 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-metrics-certs") pod "openstack-operator-controller-manager-76c5949666-qk6mw" (UID: "07e8604b-3e82-4a30-8f59-e240bd72d1a3") : secret "metrics-server-cert" not found Mar 18 12:30:59 crc kubenswrapper[4975]: E0318 12:30:59.529073 4975 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 12:30:59 crc kubenswrapper[4975]: E0318 12:30:59.529107 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-webhook-certs podName:07e8604b-3e82-4a30-8f59-e240bd72d1a3 nodeName:}" failed. No retries permitted until 2026-03-18 12:31:15.529097429 +0000 UTC m=+1261.243498008 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-webhook-certs") pod "openstack-operator-controller-manager-76c5949666-qk6mw" (UID: "07e8604b-3e82-4a30-8f59-e240bd72d1a3") : secret "webhook-server-cert" not found Mar 18 12:30:59 crc kubenswrapper[4975]: E0318 12:30:59.676734 4975 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e" Mar 18 12:30:59 crc kubenswrapper[4975]: E0318 12:30:59.677253 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t2tvr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-jlnvb_openstack-operators(b224d92b-1aed-47b5-8825-8f0b11da3092): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:30:59 crc kubenswrapper[4975]: E0318 12:30:59.678494 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jlnvb" podUID="b224d92b-1aed-47b5-8825-8f0b11da3092" Mar 18 12:30:59 crc kubenswrapper[4975]: E0318 12:30:59.727745 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jlnvb" podUID="b224d92b-1aed-47b5-8825-8f0b11da3092" Mar 18 12:30:59 crc kubenswrapper[4975]: E0318 12:30:59.729216 4975 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jthrm" podUID="2c3688cf-2e2c-434c-88a7-10ac1a4949b2" Mar 18 12:31:00 crc kubenswrapper[4975]: E0318 12:31:00.144743 4975 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d" Mar 18 12:31:00 crc kubenswrapper[4975]: E0318 12:31:00.144988 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5dkmp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-79df6bcc97-wjt2j_openstack-operators(1b122375-f65a-4f05-a738-41eab6a8fcd3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:31:00 crc kubenswrapper[4975]: E0318 12:31:00.146426 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wjt2j" podUID="1b122375-f65a-4f05-a738-41eab6a8fcd3" Mar 18 12:31:00 crc kubenswrapper[4975]: E0318 12:31:00.718421 4975 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a" Mar 18 12:31:00 crc kubenswrapper[4975]: E0318 12:31:00.718632 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zl7tp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-tsmhh_openstack-operators(3b4ea941-e8b1-47df-b33a-97dbf829cc24): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:31:00 crc kubenswrapper[4975]: E0318 12:31:00.720078 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tsmhh" podUID="3b4ea941-e8b1-47df-b33a-97dbf829cc24" Mar 18 12:31:00 crc kubenswrapper[4975]: E0318 12:31:00.756587 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tsmhh" podUID="3b4ea941-e8b1-47df-b33a-97dbf829cc24" Mar 18 12:31:00 crc kubenswrapper[4975]: E0318 12:31:00.756656 4975 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d\\\"\"" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wjt2j" podUID="1b122375-f65a-4f05-a738-41eab6a8fcd3" Mar 18 12:31:00 crc kubenswrapper[4975]: I0318 12:31:00.981125 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-lxkwk"] Mar 18 12:31:01 crc kubenswrapper[4975]: I0318 12:31:01.740180 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-lcsc9" event={"ID":"13b9a062-4728-4ce6-8d1b-0206bb73684e","Type":"ContainerStarted","Data":"3e514262701994a4e24e97232865896ec66872172fb9abbc797e36c6f1b4dd84"} Mar 18 12:31:01 crc kubenswrapper[4975]: I0318 12:31:01.740646 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-lcsc9" Mar 18 12:31:01 crc kubenswrapper[4975]: I0318 12:31:01.742228 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-hqgd2" event={"ID":"4509daad-a22e-4801-891d-b0b8ea78ccb0","Type":"ContainerStarted","Data":"659867a8bd00f7133eb320ed56b71e63602135e371d0fff7a92fc7ed4867ba4a"} Mar 18 12:31:01 crc kubenswrapper[4975]: I0318 12:31:01.742374 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-hqgd2" Mar 18 12:31:01 crc kubenswrapper[4975]: I0318 12:31:01.743624 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-7lzk9" 
event={"ID":"84bd8990-a50a-4fb2-88d3-e3141ef24b7d","Type":"ContainerStarted","Data":"6ba49dc6df9f42ad889ff88e62f8a141239a13e8f2e13f3f3783799b150870a5"} Mar 18 12:31:01 crc kubenswrapper[4975]: I0318 12:31:01.744485 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-7lzk9" Mar 18 12:31:01 crc kubenswrapper[4975]: I0318 12:31:01.746046 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-m6rkz" event={"ID":"e48eb7b2-9ce8-465c-9f05-91b4c55b4867","Type":"ContainerStarted","Data":"cfe2be5ddf25412607b33a5d532e40f4eb2af3300f6505491e70a774fe23d39f"} Mar 18 12:31:01 crc kubenswrapper[4975]: I0318 12:31:01.746183 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-m6rkz" Mar 18 12:31:01 crc kubenswrapper[4975]: I0318 12:31:01.747375 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-dp5dh" event={"ID":"675b1757-67bf-4d6d-9947-31a4da13a1be","Type":"ContainerStarted","Data":"d64e34b6b27d8da3b70607507c7754451d040dc2e592a915214c41ab8e1b06fb"} Mar 18 12:31:01 crc kubenswrapper[4975]: I0318 12:31:01.747491 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-dp5dh" Mar 18 12:31:01 crc kubenswrapper[4975]: I0318 12:31:01.749049 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-9hd99" event={"ID":"4b5bf68a-b7e7-4df7-ab69-2b8eac9c1091","Type":"ContainerStarted","Data":"b90a2a25fda123d80e5d044c9d79c0d8eafd7a0c6e64f826e52d70f2497ca3db"} Mar 18 12:31:01 crc kubenswrapper[4975]: I0318 12:31:01.749170 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-9hd99" Mar 18 12:31:01 crc kubenswrapper[4975]: I0318 12:31:01.764719 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-lcsc9" podStartSLOduration=4.611456734 podStartE2EDuration="19.764697736s" podCreationTimestamp="2026-03-18 12:30:42 +0000 UTC" firstStartedPulling="2026-03-18 12:30:45.557472527 +0000 UTC m=+1231.271873106" lastFinishedPulling="2026-03-18 12:31:00.710713529 +0000 UTC m=+1246.425114108" observedRunningTime="2026-03-18 12:31:01.758424114 +0000 UTC m=+1247.472824693" watchObservedRunningTime="2026-03-18 12:31:01.764697736 +0000 UTC m=+1247.479098315" Mar 18 12:31:01 crc kubenswrapper[4975]: I0318 12:31:01.772910 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-v4ngr" event={"ID":"14e0c597-a515-4b44-908e-3737f385d7c3","Type":"ContainerStarted","Data":"3618d80389533f9bb3d24a07917707bffce58e8f0ce52e1de31fe77d22fd94b1"} Mar 18 12:31:01 crc kubenswrapper[4975]: I0318 12:31:01.773499 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-v4ngr" Mar 18 12:31:01 crc kubenswrapper[4975]: I0318 12:31:01.791286 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-9hd99" podStartSLOduration=4.415007977 podStartE2EDuration="19.791269823s" podCreationTimestamp="2026-03-18 12:30:42 +0000 UTC" firstStartedPulling="2026-03-18 12:30:45.339016258 +0000 UTC m=+1231.053416837" lastFinishedPulling="2026-03-18 12:31:00.715278104 +0000 UTC m=+1246.429678683" observedRunningTime="2026-03-18 12:31:01.789347181 +0000 UTC m=+1247.503747770" watchObservedRunningTime="2026-03-18 12:31:01.791269823 +0000 UTC m=+1247.505670392" Mar 18 12:31:01 crc kubenswrapper[4975]: 
I0318 12:31:01.838277 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-hqgd2" podStartSLOduration=4.724395355 podStartE2EDuration="19.838255519s" podCreationTimestamp="2026-03-18 12:30:42 +0000 UTC" firstStartedPulling="2026-03-18 12:30:45.598317715 +0000 UTC m=+1231.312718294" lastFinishedPulling="2026-03-18 12:31:00.712177879 +0000 UTC m=+1246.426578458" observedRunningTime="2026-03-18 12:31:01.822240431 +0000 UTC m=+1247.536641020" watchObservedRunningTime="2026-03-18 12:31:01.838255519 +0000 UTC m=+1247.552656098" Mar 18 12:31:01 crc kubenswrapper[4975]: I0318 12:31:01.841729 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-dp5dh" podStartSLOduration=4.491932273 podStartE2EDuration="19.841713814s" podCreationTimestamp="2026-03-18 12:30:42 +0000 UTC" firstStartedPulling="2026-03-18 12:30:45.361635327 +0000 UTC m=+1231.076035906" lastFinishedPulling="2026-03-18 12:31:00.711416868 +0000 UTC m=+1246.425817447" observedRunningTime="2026-03-18 12:31:01.833672184 +0000 UTC m=+1247.548072763" watchObservedRunningTime="2026-03-18 12:31:01.841713814 +0000 UTC m=+1247.556114393" Mar 18 12:31:01 crc kubenswrapper[4975]: I0318 12:31:01.854842 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-m6rkz" podStartSLOduration=4.694388014 podStartE2EDuration="19.854799492s" podCreationTimestamp="2026-03-18 12:30:42 +0000 UTC" firstStartedPulling="2026-03-18 12:30:45.551793342 +0000 UTC m=+1231.266193921" lastFinishedPulling="2026-03-18 12:31:00.71220483 +0000 UTC m=+1246.426605399" observedRunningTime="2026-03-18 12:31:01.849779875 +0000 UTC m=+1247.564180454" watchObservedRunningTime="2026-03-18 12:31:01.854799492 +0000 UTC m=+1247.569200071" Mar 18 12:31:01 crc kubenswrapper[4975]: I0318 
12:31:01.905724 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-7lzk9" podStartSLOduration=4.668869605 podStartE2EDuration="19.905700385s" podCreationTimestamp="2026-03-18 12:30:42 +0000 UTC" firstStartedPulling="2026-03-18 12:30:45.473881449 +0000 UTC m=+1231.188282028" lastFinishedPulling="2026-03-18 12:31:00.710712229 +0000 UTC m=+1246.425112808" observedRunningTime="2026-03-18 12:31:01.875621732 +0000 UTC m=+1247.590022311" watchObservedRunningTime="2026-03-18 12:31:01.905700385 +0000 UTC m=+1247.620100964" Mar 18 12:31:01 crc kubenswrapper[4975]: W0318 12:31:01.955033 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0542b387_d20c_41a1_81f3_1a11228e0a5c.slice/crio-ff6684a5337924ec1a31e8372d089363f9ff0dde598c98904e53cb0c6da7ddee WatchSource:0}: Error finding container ff6684a5337924ec1a31e8372d089363f9ff0dde598c98904e53cb0c6da7ddee: Status 404 returned error can't find the container with id ff6684a5337924ec1a31e8372d089363f9ff0dde598c98904e53cb0c6da7ddee Mar 18 12:31:01 crc kubenswrapper[4975]: I0318 12:31:01.956884 4975 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 12:31:02 crc kubenswrapper[4975]: I0318 12:31:02.793056 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-lxkwk" event={"ID":"0542b387-d20c-41a1-81f3-1a11228e0a5c","Type":"ContainerStarted","Data":"ff6684a5337924ec1a31e8372d089363f9ff0dde598c98904e53cb0c6da7ddee"} Mar 18 12:31:03 crc kubenswrapper[4975]: I0318 12:31:03.804112 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-9bhc7" 
event={"ID":"68cf7624-cb7b-45df-a18f-7dbfb9c20f6f","Type":"ContainerStarted","Data":"532da0421381e26f0e5860b7053b67b5c70249bd2da2e556ca771939dd5ecc44"} Mar 18 12:31:03 crc kubenswrapper[4975]: I0318 12:31:03.804358 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-9bhc7" Mar 18 12:31:03 crc kubenswrapper[4975]: I0318 12:31:03.807652 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-xfr6h" event={"ID":"475c84a8-c3d3-4bf8-91e0-244d5ffe1c9e","Type":"ContainerStarted","Data":"f57378fb7cf305421fa52f507ea9709e184816991328db3cdee58b886879ac33"} Mar 18 12:31:03 crc kubenswrapper[4975]: I0318 12:31:03.808322 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-xfr6h" Mar 18 12:31:03 crc kubenswrapper[4975]: I0318 12:31:03.822476 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-v4ngr" podStartSLOduration=6.519491936 podStartE2EDuration="21.822454285s" podCreationTimestamp="2026-03-18 12:30:42 +0000 UTC" firstStartedPulling="2026-03-18 12:30:45.413031924 +0000 UTC m=+1231.127432513" lastFinishedPulling="2026-03-18 12:31:00.715994273 +0000 UTC m=+1246.430394862" observedRunningTime="2026-03-18 12:31:01.905644474 +0000 UTC m=+1247.620045053" watchObservedRunningTime="2026-03-18 12:31:03.822454285 +0000 UTC m=+1249.536854864" Mar 18 12:31:03 crc kubenswrapper[4975]: I0318 12:31:03.823356 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-9bhc7" podStartSLOduration=4.795404858 podStartE2EDuration="21.823348419s" podCreationTimestamp="2026-03-18 12:30:42 +0000 UTC" firstStartedPulling="2026-03-18 12:30:45.763138236 +0000 UTC m=+1231.477538825" 
lastFinishedPulling="2026-03-18 12:31:02.791081807 +0000 UTC m=+1248.505482386" observedRunningTime="2026-03-18 12:31:03.81897758 +0000 UTC m=+1249.533378169" watchObservedRunningTime="2026-03-18 12:31:03.823348419 +0000 UTC m=+1249.537748988" Mar 18 12:31:03 crc kubenswrapper[4975]: I0318 12:31:03.837550 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-xfr6h" podStartSLOduration=4.906254582 podStartE2EDuration="21.837530927s" podCreationTimestamp="2026-03-18 12:30:42 +0000 UTC" firstStartedPulling="2026-03-18 12:30:45.8838569 +0000 UTC m=+1231.598257479" lastFinishedPulling="2026-03-18 12:31:02.815133245 +0000 UTC m=+1248.529533824" observedRunningTime="2026-03-18 12:31:03.833156258 +0000 UTC m=+1249.547556827" watchObservedRunningTime="2026-03-18 12:31:03.837530927 +0000 UTC m=+1249.551931506" Mar 18 12:31:08 crc kubenswrapper[4975]: I0318 12:31:08.840113 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-mwbb7" event={"ID":"37a66a0e-cc0b-4e2b-8b41-62c23ae539d9","Type":"ContainerStarted","Data":"14293093462a661d3616015c8dce7e39f4280ecac2bcde3d4da26a03d4eecf9d"} Mar 18 12:31:08 crc kubenswrapper[4975]: I0318 12:31:08.840643 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-mwbb7" Mar 18 12:31:08 crc kubenswrapper[4975]: I0318 12:31:08.845851 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-d7vwb" event={"ID":"aa02f9ee-8f7a-4880-b4c3-f2fcacd24967","Type":"ContainerStarted","Data":"1b0eee694d9499acb96b9ec58c045cdeb1c038e10a37f27c209680662c79c02e"} Mar 18 12:31:08 crc kubenswrapper[4975]: I0318 12:31:08.846691 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/neutron-operator-controller-manager-767865f676-d7vwb" Mar 18 12:31:08 crc kubenswrapper[4975]: I0318 12:31:08.848820 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-btfw4" event={"ID":"fff94ef3-68c9-412e-adc7-c385a89a445f","Type":"ContainerStarted","Data":"a4483fed8042c9278e7fecaaaba883912f0fdf466628b61bfc31395c902e18db"} Mar 18 12:31:08 crc kubenswrapper[4975]: I0318 12:31:08.849089 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-btfw4" Mar 18 12:31:08 crc kubenswrapper[4975]: I0318 12:31:08.850422 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c285h" event={"ID":"fb3873ad-2d10-4df3-8198-2acd5c04b8c2","Type":"ContainerStarted","Data":"ffc138877c1c93474b78b92dc0ae53d41d42a8d7832d86754e49e49bec2196dd"} Mar 18 12:31:08 crc kubenswrapper[4975]: I0318 12:31:08.852199 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-lxkwk" event={"ID":"0542b387-d20c-41a1-81f3-1a11228e0a5c","Type":"ContainerStarted","Data":"fc03ce0bdbfe3017383cdbb0e952caecaebe160decba8b96158094c082f471bc"} Mar 18 12:31:08 crc kubenswrapper[4975]: I0318 12:31:08.852350 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-lxkwk" Mar 18 12:31:08 crc kubenswrapper[4975]: I0318 12:31:08.853258 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-kmlsc" event={"ID":"7af89304-a07d-449c-9c16-97b829fa8290","Type":"ContainerStarted","Data":"f6f1be991744c37ede58df89270447237f5e9a3019743c830957b94c7a33bce2"} Mar 18 12:31:08 crc kubenswrapper[4975]: I0318 12:31:08.853711 4975 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-kmlsc" Mar 18 12:31:08 crc kubenswrapper[4975]: I0318 12:31:08.856396 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-d2wsm" event={"ID":"fc5647e0-697f-490e-9413-9fb2e63b22d8","Type":"ContainerStarted","Data":"f3a7ea53046a92923142065ea2da28a4a6a5aa1bb78fa3ddff529251940dc41b"} Mar 18 12:31:08 crc kubenswrapper[4975]: I0318 12:31:08.856804 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-d2wsm" Mar 18 12:31:08 crc kubenswrapper[4975]: I0318 12:31:08.887682 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-mwbb7" podStartSLOduration=4.304007139 podStartE2EDuration="26.887659216s" podCreationTimestamp="2026-03-18 12:30:42 +0000 UTC" firstStartedPulling="2026-03-18 12:30:45.888567129 +0000 UTC m=+1231.602967708" lastFinishedPulling="2026-03-18 12:31:08.472219206 +0000 UTC m=+1254.186619785" observedRunningTime="2026-03-18 12:31:08.863394362 +0000 UTC m=+1254.577794941" watchObservedRunningTime="2026-03-18 12:31:08.887659216 +0000 UTC m=+1254.602059795" Mar 18 12:31:08 crc kubenswrapper[4975]: I0318 12:31:08.889923 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-d2wsm" podStartSLOduration=4.183169711 podStartE2EDuration="26.889910317s" podCreationTimestamp="2026-03-18 12:30:42 +0000 UTC" firstStartedPulling="2026-03-18 12:30:45.765587793 +0000 UTC m=+1231.479988372" lastFinishedPulling="2026-03-18 12:31:08.472328399 +0000 UTC m=+1254.186728978" observedRunningTime="2026-03-18 12:31:08.882483404 +0000 UTC m=+1254.596883983" watchObservedRunningTime="2026-03-18 12:31:08.889910317 +0000 UTC 
m=+1254.604310906" Mar 18 12:31:08 crc kubenswrapper[4975]: I0318 12:31:08.908735 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-c285h" podStartSLOduration=3.223228708 podStartE2EDuration="25.908714682s" podCreationTimestamp="2026-03-18 12:30:43 +0000 UTC" firstStartedPulling="2026-03-18 12:30:45.868358296 +0000 UTC m=+1231.582758875" lastFinishedPulling="2026-03-18 12:31:08.55384427 +0000 UTC m=+1254.268244849" observedRunningTime="2026-03-18 12:31:08.902992996 +0000 UTC m=+1254.617393595" watchObservedRunningTime="2026-03-18 12:31:08.908714682 +0000 UTC m=+1254.623115271" Mar 18 12:31:08 crc kubenswrapper[4975]: I0318 12:31:08.942444 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-kmlsc" podStartSLOduration=4.233975082 podStartE2EDuration="26.942425425s" podCreationTimestamp="2026-03-18 12:30:42 +0000 UTC" firstStartedPulling="2026-03-18 12:30:45.763914807 +0000 UTC m=+1231.478315386" lastFinishedPulling="2026-03-18 12:31:08.47236515 +0000 UTC m=+1254.186765729" observedRunningTime="2026-03-18 12:31:08.938382854 +0000 UTC m=+1254.652783433" watchObservedRunningTime="2026-03-18 12:31:08.942425425 +0000 UTC m=+1254.656826014" Mar 18 12:31:08 crc kubenswrapper[4975]: I0318 12:31:08.944884 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-btfw4" podStartSLOduration=3.31131884 podStartE2EDuration="25.944876012s" podCreationTimestamp="2026-03-18 12:30:43 +0000 UTC" firstStartedPulling="2026-03-18 12:30:45.880630862 +0000 UTC m=+1231.595031441" lastFinishedPulling="2026-03-18 12:31:08.514188034 +0000 UTC m=+1254.228588613" observedRunningTime="2026-03-18 12:31:08.922133059 +0000 UTC m=+1254.636533638" watchObservedRunningTime="2026-03-18 12:31:08.944876012 +0000 UTC m=+1254.659276591" 
Mar 18 12:31:08 crc kubenswrapper[4975]: I0318 12:31:08.954155 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-d7vwb" podStartSLOduration=4.247211295 podStartE2EDuration="26.954137935s" podCreationTimestamp="2026-03-18 12:30:42 +0000 UTC" firstStartedPulling="2026-03-18 12:30:45.765398738 +0000 UTC m=+1231.479799317" lastFinishedPulling="2026-03-18 12:31:08.472325378 +0000 UTC m=+1254.186725957" observedRunningTime="2026-03-18 12:31:08.95064831 +0000 UTC m=+1254.665048889" watchObservedRunningTime="2026-03-18 12:31:08.954137935 +0000 UTC m=+1254.668538514" Mar 18 12:31:12 crc kubenswrapper[4975]: I0318 12:31:12.031259 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-lxkwk" podStartSLOduration=23.515576943 podStartE2EDuration="30.031239051s" podCreationTimestamp="2026-03-18 12:30:42 +0000 UTC" firstStartedPulling="2026-03-18 12:31:01.956611639 +0000 UTC m=+1247.671012218" lastFinishedPulling="2026-03-18 12:31:08.472273747 +0000 UTC m=+1254.186674326" observedRunningTime="2026-03-18 12:31:08.979346015 +0000 UTC m=+1254.693746614" watchObservedRunningTime="2026-03-18 12:31:12.031239051 +0000 UTC m=+1257.745639640" Mar 18 12:31:12 crc kubenswrapper[4975]: I0318 12:31:12.885466 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jthrm" event={"ID":"2c3688cf-2e2c-434c-88a7-10ac1a4949b2","Type":"ContainerStarted","Data":"e318734cd5a0729b2b4602954ffcc6fae94974ad6f25710bfc81093ee7007a69"} Mar 18 12:31:12 crc kubenswrapper[4975]: I0318 12:31:12.885774 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jthrm" Mar 18 12:31:12 crc kubenswrapper[4975]: I0318 12:31:12.888063 4975 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-9hd99" Mar 18 12:31:12 crc kubenswrapper[4975]: I0318 12:31:12.923194 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-7lzk9" Mar 18 12:31:12 crc kubenswrapper[4975]: I0318 12:31:12.926981 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jthrm" podStartSLOduration=3.8144752410000002 podStartE2EDuration="30.926959866s" podCreationTimestamp="2026-03-18 12:30:42 +0000 UTC" firstStartedPulling="2026-03-18 12:30:45.551297698 +0000 UTC m=+1231.265698277" lastFinishedPulling="2026-03-18 12:31:12.663782323 +0000 UTC m=+1258.378182902" observedRunningTime="2026-03-18 12:31:12.903201156 +0000 UTC m=+1258.617601735" watchObservedRunningTime="2026-03-18 12:31:12.926959866 +0000 UTC m=+1258.641360465" Mar 18 12:31:12 crc kubenswrapper[4975]: I0318 12:31:12.933080 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-dp5dh" Mar 18 12:31:12 crc kubenswrapper[4975]: I0318 12:31:12.993132 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-v4ngr" Mar 18 12:31:13 crc kubenswrapper[4975]: I0318 12:31:13.251681 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-hqgd2" Mar 18 12:31:13 crc kubenswrapper[4975]: I0318 12:31:13.507154 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-lcsc9" Mar 18 12:31:13 crc kubenswrapper[4975]: I0318 12:31:13.551153 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ovn-operator-controller-manager-884679f54-mwbb7" Mar 18 12:31:13 crc kubenswrapper[4975]: I0318 12:31:13.591118 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-9bhc7" Mar 18 12:31:13 crc kubenswrapper[4975]: I0318 12:31:13.642991 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-d7vwb" Mar 18 12:31:13 crc kubenswrapper[4975]: I0318 12:31:13.651712 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-m6rkz" Mar 18 12:31:13 crc kubenswrapper[4975]: I0318 12:31:13.653033 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-kmlsc" Mar 18 12:31:13 crc kubenswrapper[4975]: I0318 12:31:13.829762 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-xfr6h" Mar 18 12:31:13 crc kubenswrapper[4975]: I0318 12:31:13.856534 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-d2wsm" Mar 18 12:31:13 crc kubenswrapper[4975]: I0318 12:31:13.895143 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jlnvb" event={"ID":"b224d92b-1aed-47b5-8825-8f0b11da3092","Type":"ContainerStarted","Data":"0b9284380d85c8e62f259aff493d233a0147aa309057d68bc7a6e10e4b4eeabf"} Mar 18 12:31:13 crc kubenswrapper[4975]: I0318 12:31:13.895673 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jlnvb" Mar 18 12:31:13 crc kubenswrapper[4975]: I0318 12:31:13.916262 4975 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jlnvb" podStartSLOduration=4.012401449 podStartE2EDuration="31.916244481s" podCreationTimestamp="2026-03-18 12:30:42 +0000 UTC" firstStartedPulling="2026-03-18 12:30:45.575582663 +0000 UTC m=+1231.289983242" lastFinishedPulling="2026-03-18 12:31:13.479425685 +0000 UTC m=+1259.193826274" observedRunningTime="2026-03-18 12:31:13.909537257 +0000 UTC m=+1259.623937846" watchObservedRunningTime="2026-03-18 12:31:13.916244481 +0000 UTC m=+1259.630645060" Mar 18 12:31:14 crc kubenswrapper[4975]: I0318 12:31:14.248488 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-btfw4" Mar 18 12:31:14 crc kubenswrapper[4975]: I0318 12:31:14.874149 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d854c129-c4eb-4c08-a398-3549f4ff9047-cert\") pod \"infra-operator-controller-manager-7b9c774f96-tfkfx\" (UID: \"d854c129-c4eb-4c08-a398-3549f4ff9047\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-tfkfx" Mar 18 12:31:14 crc kubenswrapper[4975]: I0318 12:31:14.882058 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d854c129-c4eb-4c08-a398-3549f4ff9047-cert\") pod \"infra-operator-controller-manager-7b9c774f96-tfkfx\" (UID: \"d854c129-c4eb-4c08-a398-3549f4ff9047\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-tfkfx" Mar 18 12:31:14 crc kubenswrapper[4975]: I0318 12:31:14.905502 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wjt2j" event={"ID":"1b122375-f65a-4f05-a738-41eab6a8fcd3","Type":"ContainerStarted","Data":"b77107f8ade8e80c60f621572225234784e183b3f7131f40d8ae293f4ee6d4f7"} Mar 18 
12:31:14 crc kubenswrapper[4975]: I0318 12:31:14.905695 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wjt2j" Mar 18 12:31:14 crc kubenswrapper[4975]: I0318 12:31:14.923666 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wjt2j" podStartSLOduration=3.765998405 podStartE2EDuration="32.923642963s" podCreationTimestamp="2026-03-18 12:30:42 +0000 UTC" firstStartedPulling="2026-03-18 12:30:45.352344873 +0000 UTC m=+1231.066745452" lastFinishedPulling="2026-03-18 12:31:14.509989431 +0000 UTC m=+1260.224390010" observedRunningTime="2026-03-18 12:31:14.920019684 +0000 UTC m=+1260.634420273" watchObservedRunningTime="2026-03-18 12:31:14.923642963 +0000 UTC m=+1260.638043542" Mar 18 12:31:15 crc kubenswrapper[4975]: I0318 12:31:15.153064 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-ttk54" Mar 18 12:31:15 crc kubenswrapper[4975]: I0318 12:31:15.161789 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-tfkfx" Mar 18 12:31:15 crc kubenswrapper[4975]: I0318 12:31:15.562927 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-tfkfx"] Mar 18 12:31:15 crc kubenswrapper[4975]: I0318 12:31:15.584566 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-metrics-certs\") pod \"openstack-operator-controller-manager-76c5949666-qk6mw\" (UID: \"07e8604b-3e82-4a30-8f59-e240bd72d1a3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" Mar 18 12:31:15 crc kubenswrapper[4975]: I0318 12:31:15.584642 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-webhook-certs\") pod \"openstack-operator-controller-manager-76c5949666-qk6mw\" (UID: \"07e8604b-3e82-4a30-8f59-e240bd72d1a3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" Mar 18 12:31:15 crc kubenswrapper[4975]: I0318 12:31:15.589676 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-metrics-certs\") pod \"openstack-operator-controller-manager-76c5949666-qk6mw\" (UID: \"07e8604b-3e82-4a30-8f59-e240bd72d1a3\") " pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" Mar 18 12:31:15 crc kubenswrapper[4975]: I0318 12:31:15.589732 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/07e8604b-3e82-4a30-8f59-e240bd72d1a3-webhook-certs\") pod \"openstack-operator-controller-manager-76c5949666-qk6mw\" (UID: \"07e8604b-3e82-4a30-8f59-e240bd72d1a3\") " 
pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" Mar 18 12:31:15 crc kubenswrapper[4975]: I0318 12:31:15.725197 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-9m7jc" Mar 18 12:31:15 crc kubenswrapper[4975]: I0318 12:31:15.733525 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" Mar 18 12:31:15 crc kubenswrapper[4975]: I0318 12:31:15.924474 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-tfkfx" event={"ID":"d854c129-c4eb-4c08-a398-3549f4ff9047","Type":"ContainerStarted","Data":"4f203f725f8b7c78e17c59e9ef2f3acd93ba307a3d83f5050625fd1b7bed312e"} Mar 18 12:31:16 crc kubenswrapper[4975]: I0318 12:31:16.205697 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw"] Mar 18 12:31:16 crc kubenswrapper[4975]: I0318 12:31:16.930657 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" event={"ID":"07e8604b-3e82-4a30-8f59-e240bd72d1a3","Type":"ContainerStarted","Data":"44a3b617feb31a2cc9eab90066eb9114e79ac7252d9d2040c525936048fe0ff3"} Mar 18 12:31:19 crc kubenswrapper[4975]: I0318 12:31:19.373331 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-lxkwk" Mar 18 12:31:22 crc kubenswrapper[4975]: I0318 12:31:22.971643 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-wjt2j" Mar 18 12:31:23 crc kubenswrapper[4975]: I0318 12:31:23.062720 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jthrm" Mar 18 12:31:23 crc kubenswrapper[4975]: I0318 12:31:23.797327 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-jlnvb" Mar 18 12:31:24 crc kubenswrapper[4975]: I0318 12:31:24.389580 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" event={"ID":"07e8604b-3e82-4a30-8f59-e240bd72d1a3","Type":"ContainerStarted","Data":"a5462d2a7088543309025102f8b4de6bf4a6f5f8489c0669ea7c58911219c5ac"} Mar 18 12:31:24 crc kubenswrapper[4975]: I0318 12:31:24.389922 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" Mar 18 12:31:24 crc kubenswrapper[4975]: I0318 12:31:24.413502 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" podStartSLOduration=41.413481321 podStartE2EDuration="41.413481321s" podCreationTimestamp="2026-03-18 12:30:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:31:24.410762336 +0000 UTC m=+1270.125162925" watchObservedRunningTime="2026-03-18 12:31:24.413481321 +0000 UTC m=+1270.127881900" Mar 18 12:31:26 crc kubenswrapper[4975]: I0318 12:31:26.409694 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-qhnk6" event={"ID":"dc261851-abff-4a1e-b20e-07d9c3bea942","Type":"ContainerStarted","Data":"8a9e8d850f25fa12c27d5558b8eb26c5a1af5a01d092a4f3e4ba4c565ac08460"} Mar 18 12:31:26 crc kubenswrapper[4975]: I0318 12:31:26.410257 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/manila-operator-controller-manager-55f864c847-qhnk6" Mar 18 12:31:26 crc kubenswrapper[4975]: I0318 12:31:26.411621 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tsmhh" event={"ID":"3b4ea941-e8b1-47df-b33a-97dbf829cc24","Type":"ContainerStarted","Data":"35899842219b4581b55e6aebe842b84075ac1643b2dae8ef7e95de061d3d296c"} Mar 18 12:31:26 crc kubenswrapper[4975]: I0318 12:31:26.411843 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tsmhh" Mar 18 12:31:26 crc kubenswrapper[4975]: I0318 12:31:26.413398 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-tfkfx" event={"ID":"d854c129-c4eb-4c08-a398-3549f4ff9047","Type":"ContainerStarted","Data":"e58de584fef72ca348fb1c08070449734cc963c150f2ed2e849c76911b840b6a"} Mar 18 12:31:26 crc kubenswrapper[4975]: I0318 12:31:26.413539 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-tfkfx" Mar 18 12:31:26 crc kubenswrapper[4975]: I0318 12:31:26.427848 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-qhnk6" podStartSLOduration=4.736050064 podStartE2EDuration="44.427826922s" podCreationTimestamp="2026-03-18 12:30:42 +0000 UTC" firstStartedPulling="2026-03-18 12:30:45.576422396 +0000 UTC m=+1231.290822975" lastFinishedPulling="2026-03-18 12:31:25.268199254 +0000 UTC m=+1270.982599833" observedRunningTime="2026-03-18 12:31:26.427197295 +0000 UTC m=+1272.141597894" watchObservedRunningTime="2026-03-18 12:31:26.427826922 +0000 UTC m=+1272.142227511" Mar 18 12:31:26 crc kubenswrapper[4975]: I0318 12:31:26.466382 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tsmhh" podStartSLOduration=4.9376316209999995 podStartE2EDuration="44.466355787s" podCreationTimestamp="2026-03-18 12:30:42 +0000 UTC" firstStartedPulling="2026-03-18 12:30:45.739066327 +0000 UTC m=+1231.453466906" lastFinishedPulling="2026-03-18 12:31:25.267790483 +0000 UTC m=+1270.982191072" observedRunningTime="2026-03-18 12:31:26.448933 +0000 UTC m=+1272.163333579" watchObservedRunningTime="2026-03-18 12:31:26.466355787 +0000 UTC m=+1272.180756386" Mar 18 12:31:26 crc kubenswrapper[4975]: I0318 12:31:26.466951 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-tfkfx" podStartSLOduration=34.754085782 podStartE2EDuration="44.466941263s" podCreationTimestamp="2026-03-18 12:30:42 +0000 UTC" firstStartedPulling="2026-03-18 12:31:15.57753678 +0000 UTC m=+1261.291937359" lastFinishedPulling="2026-03-18 12:31:25.290392261 +0000 UTC m=+1271.004792840" observedRunningTime="2026-03-18 12:31:26.464231609 +0000 UTC m=+1272.178632198" watchObservedRunningTime="2026-03-18 12:31:26.466941263 +0000 UTC m=+1272.181341852" Mar 18 12:31:33 crc kubenswrapper[4975]: I0318 12:31:33.560014 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-qhnk6" Mar 18 12:31:33 crc kubenswrapper[4975]: I0318 12:31:33.627690 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tsmhh" Mar 18 12:31:35 crc kubenswrapper[4975]: I0318 12:31:35.168703 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-tfkfx" Mar 18 12:31:35 crc kubenswrapper[4975]: I0318 12:31:35.739649 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-controller-manager-76c5949666-qk6mw" Mar 18 12:31:55 crc kubenswrapper[4975]: I0318 12:31:55.808453 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-km58k"] Mar 18 12:31:55 crc kubenswrapper[4975]: I0318 12:31:55.825340 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-km58k" Mar 18 12:31:55 crc kubenswrapper[4975]: I0318 12:31:55.836507 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 18 12:31:55 crc kubenswrapper[4975]: I0318 12:31:55.836720 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 18 12:31:55 crc kubenswrapper[4975]: I0318 12:31:55.836961 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-w6nrv" Mar 18 12:31:55 crc kubenswrapper[4975]: I0318 12:31:55.837106 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 18 12:31:55 crc kubenswrapper[4975]: I0318 12:31:55.846057 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-km58k"] Mar 18 12:31:55 crc kubenswrapper[4975]: I0318 12:31:55.947376 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-dm8xw"] Mar 18 12:31:55 crc kubenswrapper[4975]: I0318 12:31:55.949559 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cea02ea0-10d7-4b65-8fe5-64efd8c625c9-config\") pod \"dnsmasq-dns-675f4bcbfc-km58k\" (UID: \"cea02ea0-10d7-4b65-8fe5-64efd8c625c9\") " pod="openstack/dnsmasq-dns-675f4bcbfc-km58k" Mar 18 12:31:55 crc kubenswrapper[4975]: I0318 12:31:55.949615 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vgmp\" 
(UniqueName: \"kubernetes.io/projected/cea02ea0-10d7-4b65-8fe5-64efd8c625c9-kube-api-access-6vgmp\") pod \"dnsmasq-dns-675f4bcbfc-km58k\" (UID: \"cea02ea0-10d7-4b65-8fe5-64efd8c625c9\") " pod="openstack/dnsmasq-dns-675f4bcbfc-km58k" Mar 18 12:31:55 crc kubenswrapper[4975]: I0318 12:31:55.957719 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-dm8xw" Mar 18 12:31:55 crc kubenswrapper[4975]: I0318 12:31:55.960686 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 18 12:31:55 crc kubenswrapper[4975]: I0318 12:31:55.980952 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-dm8xw"] Mar 18 12:31:56 crc kubenswrapper[4975]: I0318 12:31:56.050934 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0c86030-ed6d-4757-90ab-0716d93d4513-config\") pod \"dnsmasq-dns-78dd6ddcc-dm8xw\" (UID: \"e0c86030-ed6d-4757-90ab-0716d93d4513\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dm8xw" Mar 18 12:31:56 crc kubenswrapper[4975]: I0318 12:31:56.051015 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cea02ea0-10d7-4b65-8fe5-64efd8c625c9-config\") pod \"dnsmasq-dns-675f4bcbfc-km58k\" (UID: \"cea02ea0-10d7-4b65-8fe5-64efd8c625c9\") " pod="openstack/dnsmasq-dns-675f4bcbfc-km58k" Mar 18 12:31:56 crc kubenswrapper[4975]: I0318 12:31:56.051140 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0c86030-ed6d-4757-90ab-0716d93d4513-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-dm8xw\" (UID: \"e0c86030-ed6d-4757-90ab-0716d93d4513\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dm8xw" Mar 18 12:31:56 crc kubenswrapper[4975]: I0318 12:31:56.051209 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6vgmp\" (UniqueName: \"kubernetes.io/projected/cea02ea0-10d7-4b65-8fe5-64efd8c625c9-kube-api-access-6vgmp\") pod \"dnsmasq-dns-675f4bcbfc-km58k\" (UID: \"cea02ea0-10d7-4b65-8fe5-64efd8c625c9\") " pod="openstack/dnsmasq-dns-675f4bcbfc-km58k" Mar 18 12:31:56 crc kubenswrapper[4975]: I0318 12:31:56.051341 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrdzs\" (UniqueName: \"kubernetes.io/projected/e0c86030-ed6d-4757-90ab-0716d93d4513-kube-api-access-hrdzs\") pod \"dnsmasq-dns-78dd6ddcc-dm8xw\" (UID: \"e0c86030-ed6d-4757-90ab-0716d93d4513\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dm8xw" Mar 18 12:31:56 crc kubenswrapper[4975]: I0318 12:31:56.052300 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cea02ea0-10d7-4b65-8fe5-64efd8c625c9-config\") pod \"dnsmasq-dns-675f4bcbfc-km58k\" (UID: \"cea02ea0-10d7-4b65-8fe5-64efd8c625c9\") " pod="openstack/dnsmasq-dns-675f4bcbfc-km58k" Mar 18 12:31:56 crc kubenswrapper[4975]: I0318 12:31:56.086598 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vgmp\" (UniqueName: \"kubernetes.io/projected/cea02ea0-10d7-4b65-8fe5-64efd8c625c9-kube-api-access-6vgmp\") pod \"dnsmasq-dns-675f4bcbfc-km58k\" (UID: \"cea02ea0-10d7-4b65-8fe5-64efd8c625c9\") " pod="openstack/dnsmasq-dns-675f4bcbfc-km58k" Mar 18 12:31:56 crc kubenswrapper[4975]: I0318 12:31:56.152932 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrdzs\" (UniqueName: \"kubernetes.io/projected/e0c86030-ed6d-4757-90ab-0716d93d4513-kube-api-access-hrdzs\") pod \"dnsmasq-dns-78dd6ddcc-dm8xw\" (UID: \"e0c86030-ed6d-4757-90ab-0716d93d4513\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dm8xw" Mar 18 12:31:56 crc kubenswrapper[4975]: I0318 12:31:56.153087 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0c86030-ed6d-4757-90ab-0716d93d4513-config\") pod \"dnsmasq-dns-78dd6ddcc-dm8xw\" (UID: \"e0c86030-ed6d-4757-90ab-0716d93d4513\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dm8xw" Mar 18 12:31:56 crc kubenswrapper[4975]: I0318 12:31:56.153191 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0c86030-ed6d-4757-90ab-0716d93d4513-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-dm8xw\" (UID: \"e0c86030-ed6d-4757-90ab-0716d93d4513\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dm8xw" Mar 18 12:31:56 crc kubenswrapper[4975]: I0318 12:31:56.154258 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0c86030-ed6d-4757-90ab-0716d93d4513-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-dm8xw\" (UID: \"e0c86030-ed6d-4757-90ab-0716d93d4513\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dm8xw" Mar 18 12:31:56 crc kubenswrapper[4975]: I0318 12:31:56.155024 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0c86030-ed6d-4757-90ab-0716d93d4513-config\") pod \"dnsmasq-dns-78dd6ddcc-dm8xw\" (UID: \"e0c86030-ed6d-4757-90ab-0716d93d4513\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dm8xw" Mar 18 12:31:56 crc kubenswrapper[4975]: I0318 12:31:56.168406 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-km58k" Mar 18 12:31:56 crc kubenswrapper[4975]: I0318 12:31:56.179145 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrdzs\" (UniqueName: \"kubernetes.io/projected/e0c86030-ed6d-4757-90ab-0716d93d4513-kube-api-access-hrdzs\") pod \"dnsmasq-dns-78dd6ddcc-dm8xw\" (UID: \"e0c86030-ed6d-4757-90ab-0716d93d4513\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dm8xw" Mar 18 12:31:56 crc kubenswrapper[4975]: I0318 12:31:56.286476 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-dm8xw" Mar 18 12:31:56 crc kubenswrapper[4975]: I0318 12:31:56.655578 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-km58k"] Mar 18 12:31:56 crc kubenswrapper[4975]: I0318 12:31:56.784807 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-dm8xw"] Mar 18 12:31:56 crc kubenswrapper[4975]: W0318 12:31:56.785742 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0c86030_ed6d_4757_90ab_0716d93d4513.slice/crio-c8966b5c0726f5f199f2656508cc67fcf7b9478f295688c74981891e4b17570a WatchSource:0}: Error finding container c8966b5c0726f5f199f2656508cc67fcf7b9478f295688c74981891e4b17570a: Status 404 returned error can't find the container with id c8966b5c0726f5f199f2656508cc67fcf7b9478f295688c74981891e4b17570a Mar 18 12:31:57 crc kubenswrapper[4975]: I0318 12:31:57.660713 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-dm8xw" event={"ID":"e0c86030-ed6d-4757-90ab-0716d93d4513","Type":"ContainerStarted","Data":"c8966b5c0726f5f199f2656508cc67fcf7b9478f295688c74981891e4b17570a"} Mar 18 12:31:57 crc kubenswrapper[4975]: I0318 12:31:57.663525 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-km58k" 
event={"ID":"cea02ea0-10d7-4b65-8fe5-64efd8c625c9","Type":"ContainerStarted","Data":"712966b1b6f9235ac5504851d9c0683c15a0adbd4a8ed3382d6b888c25fe7b5d"} Mar 18 12:31:58 crc kubenswrapper[4975]: I0318 12:31:58.825519 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-km58k"] Mar 18 12:31:58 crc kubenswrapper[4975]: I0318 12:31:58.850465 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-xmlfk"] Mar 18 12:31:58 crc kubenswrapper[4975]: I0318 12:31:58.851766 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-xmlfk" Mar 18 12:31:58 crc kubenswrapper[4975]: I0318 12:31:58.867118 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-xmlfk"] Mar 18 12:31:58 crc kubenswrapper[4975]: I0318 12:31:58.934911 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/490b6ddf-67eb-4b14-bce3-c07798f6b03a-config\") pod \"dnsmasq-dns-5ccc8479f9-xmlfk\" (UID: \"490b6ddf-67eb-4b14-bce3-c07798f6b03a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-xmlfk" Mar 18 12:31:58 crc kubenswrapper[4975]: I0318 12:31:58.935010 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/490b6ddf-67eb-4b14-bce3-c07798f6b03a-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-xmlfk\" (UID: \"490b6ddf-67eb-4b14-bce3-c07798f6b03a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-xmlfk" Mar 18 12:31:58 crc kubenswrapper[4975]: I0318 12:31:58.936326 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws99j\" (UniqueName: \"kubernetes.io/projected/490b6ddf-67eb-4b14-bce3-c07798f6b03a-kube-api-access-ws99j\") pod \"dnsmasq-dns-5ccc8479f9-xmlfk\" (UID: \"490b6ddf-67eb-4b14-bce3-c07798f6b03a\") " 
pod="openstack/dnsmasq-dns-5ccc8479f9-xmlfk" Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.037940 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/490b6ddf-67eb-4b14-bce3-c07798f6b03a-config\") pod \"dnsmasq-dns-5ccc8479f9-xmlfk\" (UID: \"490b6ddf-67eb-4b14-bce3-c07798f6b03a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-xmlfk" Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.038006 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/490b6ddf-67eb-4b14-bce3-c07798f6b03a-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-xmlfk\" (UID: \"490b6ddf-67eb-4b14-bce3-c07798f6b03a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-xmlfk" Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.038117 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws99j\" (UniqueName: \"kubernetes.io/projected/490b6ddf-67eb-4b14-bce3-c07798f6b03a-kube-api-access-ws99j\") pod \"dnsmasq-dns-5ccc8479f9-xmlfk\" (UID: \"490b6ddf-67eb-4b14-bce3-c07798f6b03a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-xmlfk" Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.039142 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/490b6ddf-67eb-4b14-bce3-c07798f6b03a-config\") pod \"dnsmasq-dns-5ccc8479f9-xmlfk\" (UID: \"490b6ddf-67eb-4b14-bce3-c07798f6b03a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-xmlfk" Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.039446 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/490b6ddf-67eb-4b14-bce3-c07798f6b03a-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-xmlfk\" (UID: \"490b6ddf-67eb-4b14-bce3-c07798f6b03a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-xmlfk" Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.067088 4975 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws99j\" (UniqueName: \"kubernetes.io/projected/490b6ddf-67eb-4b14-bce3-c07798f6b03a-kube-api-access-ws99j\") pod \"dnsmasq-dns-5ccc8479f9-xmlfk\" (UID: \"490b6ddf-67eb-4b14-bce3-c07798f6b03a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-xmlfk" Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.186915 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-xmlfk" Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.212684 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-dm8xw"] Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.254531 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jp47c"] Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.263715 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jp47c" Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.279901 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jp47c"] Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.345592 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e52b66d-395e-4753-9c2b-b271c2dedf36-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-jp47c\" (UID: \"2e52b66d-395e-4753-9c2b-b271c2dedf36\") " pod="openstack/dnsmasq-dns-57d769cc4f-jp47c" Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.345993 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e52b66d-395e-4753-9c2b-b271c2dedf36-config\") pod \"dnsmasq-dns-57d769cc4f-jp47c\" (UID: \"2e52b66d-395e-4753-9c2b-b271c2dedf36\") " pod="openstack/dnsmasq-dns-57d769cc4f-jp47c" Mar 18 12:31:59 crc 
kubenswrapper[4975]: I0318 12:31:59.346037 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xzft\" (UniqueName: \"kubernetes.io/projected/2e52b66d-395e-4753-9c2b-b271c2dedf36-kube-api-access-5xzft\") pod \"dnsmasq-dns-57d769cc4f-jp47c\" (UID: \"2e52b66d-395e-4753-9c2b-b271c2dedf36\") " pod="openstack/dnsmasq-dns-57d769cc4f-jp47c" Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.451186 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e52b66d-395e-4753-9c2b-b271c2dedf36-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-jp47c\" (UID: \"2e52b66d-395e-4753-9c2b-b271c2dedf36\") " pod="openstack/dnsmasq-dns-57d769cc4f-jp47c" Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.451232 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e52b66d-395e-4753-9c2b-b271c2dedf36-config\") pod \"dnsmasq-dns-57d769cc4f-jp47c\" (UID: \"2e52b66d-395e-4753-9c2b-b271c2dedf36\") " pod="openstack/dnsmasq-dns-57d769cc4f-jp47c" Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.451265 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xzft\" (UniqueName: \"kubernetes.io/projected/2e52b66d-395e-4753-9c2b-b271c2dedf36-kube-api-access-5xzft\") pod \"dnsmasq-dns-57d769cc4f-jp47c\" (UID: \"2e52b66d-395e-4753-9c2b-b271c2dedf36\") " pod="openstack/dnsmasq-dns-57d769cc4f-jp47c" Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.452353 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e52b66d-395e-4753-9c2b-b271c2dedf36-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-jp47c\" (UID: \"2e52b66d-395e-4753-9c2b-b271c2dedf36\") " pod="openstack/dnsmasq-dns-57d769cc4f-jp47c" Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.453281 4975 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e52b66d-395e-4753-9c2b-b271c2dedf36-config\") pod \"dnsmasq-dns-57d769cc4f-jp47c\" (UID: \"2e52b66d-395e-4753-9c2b-b271c2dedf36\") " pod="openstack/dnsmasq-dns-57d769cc4f-jp47c" Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.485479 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xzft\" (UniqueName: \"kubernetes.io/projected/2e52b66d-395e-4753-9c2b-b271c2dedf36-kube-api-access-5xzft\") pod \"dnsmasq-dns-57d769cc4f-jp47c\" (UID: \"2e52b66d-395e-4753-9c2b-b271c2dedf36\") " pod="openstack/dnsmasq-dns-57d769cc4f-jp47c" Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.634542 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jp47c" Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.638722 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-xmlfk"] Mar 18 12:31:59 crc kubenswrapper[4975]: W0318 12:31:59.668505 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod490b6ddf_67eb_4b14_bce3_c07798f6b03a.slice/crio-f062420ec67dbf9958a699aea1597b7116418333d969725d004cc7332e6322c8 WatchSource:0}: Error finding container f062420ec67dbf9958a699aea1597b7116418333d969725d004cc7332e6322c8: Status 404 returned error can't find the container with id f062420ec67dbf9958a699aea1597b7116418333d969725d004cc7332e6322c8 Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.688200 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-xmlfk" event={"ID":"490b6ddf-67eb-4b14-bce3-c07798f6b03a","Type":"ContainerStarted","Data":"f062420ec67dbf9958a699aea1597b7116418333d969725d004cc7332e6322c8"} Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.966012 4975 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.967436 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.977622 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.977905 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.978021 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.978150 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.978566 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.978767 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-hmhxg" Mar 18 12:31:59 crc kubenswrapper[4975]: I0318 12:31:59.978948 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.002070 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.069346 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a68f98b5-0226-4b20-a767-ead5e0af066e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.069455 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a68f98b5-0226-4b20-a767-ead5e0af066e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.069501 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a68f98b5-0226-4b20-a767-ead5e0af066e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.069530 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjsnm\" (UniqueName: \"kubernetes.io/projected/a68f98b5-0226-4b20-a767-ead5e0af066e-kube-api-access-jjsnm\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.069593 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a68f98b5-0226-4b20-a767-ead5e0af066e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.069629 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a68f98b5-0226-4b20-a767-ead5e0af066e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.069679 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a68f98b5-0226-4b20-a767-ead5e0af066e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.069731 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a68f98b5-0226-4b20-a767-ead5e0af066e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.069770 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a68f98b5-0226-4b20-a767-ead5e0af066e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.069855 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a68f98b5-0226-4b20-a767-ead5e0af066e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.070002 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.087889 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jp47c"] Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.139925 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563952-vwzs4"] Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.144201 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563952-vwzs4"] Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.144306 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563952-vwzs4" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.147354 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.147914 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.147922 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.171181 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xl66\" (UniqueName: \"kubernetes.io/projected/23a3a6b1-d7fd-4d05-a075-19aa9585b87d-kube-api-access-7xl66\") pod \"auto-csr-approver-29563952-vwzs4\" (UID: \"23a3a6b1-d7fd-4d05-a075-19aa9585b87d\") " pod="openshift-infra/auto-csr-approver-29563952-vwzs4" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.171249 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a68f98b5-0226-4b20-a767-ead5e0af066e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.171285 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.171330 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a68f98b5-0226-4b20-a767-ead5e0af066e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.171362 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a68f98b5-0226-4b20-a767-ead5e0af066e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.171394 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a68f98b5-0226-4b20-a767-ead5e0af066e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.171417 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjsnm\" (UniqueName: \"kubernetes.io/projected/a68f98b5-0226-4b20-a767-ead5e0af066e-kube-api-access-jjsnm\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 
12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.171454 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a68f98b5-0226-4b20-a767-ead5e0af066e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.171499 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a68f98b5-0226-4b20-a767-ead5e0af066e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.171537 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a68f98b5-0226-4b20-a767-ead5e0af066e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.171581 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a68f98b5-0226-4b20-a767-ead5e0af066e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.171613 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a68f98b5-0226-4b20-a767-ead5e0af066e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.172965 4975 operation_generator.go:580] "MountVolume.MountDevice 
succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.174065 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a68f98b5-0226-4b20-a767-ead5e0af066e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.174216 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a68f98b5-0226-4b20-a767-ead5e0af066e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.175043 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a68f98b5-0226-4b20-a767-ead5e0af066e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.176558 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a68f98b5-0226-4b20-a767-ead5e0af066e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.177529 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/a68f98b5-0226-4b20-a767-ead5e0af066e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.179977 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a68f98b5-0226-4b20-a767-ead5e0af066e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.180350 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a68f98b5-0226-4b20-a767-ead5e0af066e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.180980 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a68f98b5-0226-4b20-a767-ead5e0af066e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.188932 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a68f98b5-0226-4b20-a767-ead5e0af066e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.192477 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjsnm\" (UniqueName: \"kubernetes.io/projected/a68f98b5-0226-4b20-a767-ead5e0af066e-kube-api-access-jjsnm\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.202395 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.273441 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xl66\" (UniqueName: \"kubernetes.io/projected/23a3a6b1-d7fd-4d05-a075-19aa9585b87d-kube-api-access-7xl66\") pod \"auto-csr-approver-29563952-vwzs4\" (UID: \"23a3a6b1-d7fd-4d05-a075-19aa9585b87d\") " pod="openshift-infra/auto-csr-approver-29563952-vwzs4" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.305447 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xl66\" (UniqueName: \"kubernetes.io/projected/23a3a6b1-d7fd-4d05-a075-19aa9585b87d-kube-api-access-7xl66\") pod \"auto-csr-approver-29563952-vwzs4\" (UID: \"23a3a6b1-d7fd-4d05-a075-19aa9585b87d\") " pod="openshift-infra/auto-csr-approver-29563952-vwzs4" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.318251 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.446400 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.447731 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.452478 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.452844 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.453025 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.453190 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.453432 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.453644 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.454456 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-lmpq8" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.461301 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.473032 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563952-vwzs4" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.477174 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.477364 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.477398 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.477429 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-config-data\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.477445 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc 
kubenswrapper[4975]: I0318 12:32:00.477461 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.477483 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w8tw\" (UniqueName: \"kubernetes.io/projected/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-kube-api-access-7w8tw\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.477506 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.477530 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.477559 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.477633 
4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.578671 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.578738 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.578769 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.578819 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-config-data\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.578837 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.578854 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.578901 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w8tw\" (UniqueName: \"kubernetes.io/projected/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-kube-api-access-7w8tw\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.578920 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.578938 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.579000 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: 
\"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.579055 4975 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.579086 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.579788 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.580193 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.580278 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.580790 4975 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-config-data\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.581364 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.585698 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.585879 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.598390 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w8tw\" (UniqueName: \"kubernetes.io/projected/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-kube-api-access-7w8tw\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.600165 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-rabbitmq-confd\") pod 
\"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.605722 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.611137 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " pod="openstack/rabbitmq-server-0" Mar 18 12:32:00 crc kubenswrapper[4975]: I0318 12:32:00.779359 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.508276 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.514560 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.522374 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.583194 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-xfqgk" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.585710 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.586013 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.586221 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.587744 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.707849 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcac25a5-0552-488f-b468-ffea7a442115-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"bcac25a5-0552-488f-b468-ffea7a442115\") " pod="openstack/openstack-galera-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.708065 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bcac25a5-0552-488f-b468-ffea7a442115-kolla-config\") pod \"openstack-galera-0\" (UID: \"bcac25a5-0552-488f-b468-ffea7a442115\") " pod="openstack/openstack-galera-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.708241 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bcac25a5-0552-488f-b468-ffea7a442115-config-data-generated\") pod \"openstack-galera-0\" (UID: \"bcac25a5-0552-488f-b468-ffea7a442115\") " pod="openstack/openstack-galera-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.708307 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"bcac25a5-0552-488f-b468-ffea7a442115\") " pod="openstack/openstack-galera-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.708398 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcac25a5-0552-488f-b468-ffea7a442115-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"bcac25a5-0552-488f-b468-ffea7a442115\") " pod="openstack/openstack-galera-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.708441 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcac25a5-0552-488f-b468-ffea7a442115-operator-scripts\") pod \"openstack-galera-0\" (UID: \"bcac25a5-0552-488f-b468-ffea7a442115\") " pod="openstack/openstack-galera-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.708475 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bcac25a5-0552-488f-b468-ffea7a442115-config-data-default\") pod \"openstack-galera-0\" (UID: \"bcac25a5-0552-488f-b468-ffea7a442115\") " pod="openstack/openstack-galera-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.708507 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7rzv\" (UniqueName: 
\"kubernetes.io/projected/bcac25a5-0552-488f-b468-ffea7a442115-kube-api-access-n7rzv\") pod \"openstack-galera-0\" (UID: \"bcac25a5-0552-488f-b468-ffea7a442115\") " pod="openstack/openstack-galera-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.810154 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bcac25a5-0552-488f-b468-ffea7a442115-config-data-generated\") pod \"openstack-galera-0\" (UID: \"bcac25a5-0552-488f-b468-ffea7a442115\") " pod="openstack/openstack-galera-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.810222 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"bcac25a5-0552-488f-b468-ffea7a442115\") " pod="openstack/openstack-galera-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.810261 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcac25a5-0552-488f-b468-ffea7a442115-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"bcac25a5-0552-488f-b468-ffea7a442115\") " pod="openstack/openstack-galera-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.810284 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcac25a5-0552-488f-b468-ffea7a442115-operator-scripts\") pod \"openstack-galera-0\" (UID: \"bcac25a5-0552-488f-b468-ffea7a442115\") " pod="openstack/openstack-galera-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.810302 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bcac25a5-0552-488f-b468-ffea7a442115-config-data-default\") pod \"openstack-galera-0\" (UID: 
\"bcac25a5-0552-488f-b468-ffea7a442115\") " pod="openstack/openstack-galera-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.810320 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7rzv\" (UniqueName: \"kubernetes.io/projected/bcac25a5-0552-488f-b468-ffea7a442115-kube-api-access-n7rzv\") pod \"openstack-galera-0\" (UID: \"bcac25a5-0552-488f-b468-ffea7a442115\") " pod="openstack/openstack-galera-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.810340 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcac25a5-0552-488f-b468-ffea7a442115-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"bcac25a5-0552-488f-b468-ffea7a442115\") " pod="openstack/openstack-galera-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.810369 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bcac25a5-0552-488f-b468-ffea7a442115-kolla-config\") pod \"openstack-galera-0\" (UID: \"bcac25a5-0552-488f-b468-ffea7a442115\") " pod="openstack/openstack-galera-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.811104 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bcac25a5-0552-488f-b468-ffea7a442115-kolla-config\") pod \"openstack-galera-0\" (UID: \"bcac25a5-0552-488f-b468-ffea7a442115\") " pod="openstack/openstack-galera-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.811180 4975 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"bcac25a5-0552-488f-b468-ffea7a442115\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-galera-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 
12:32:01.811315 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bcac25a5-0552-488f-b468-ffea7a442115-config-data-generated\") pod \"openstack-galera-0\" (UID: \"bcac25a5-0552-488f-b468-ffea7a442115\") " pod="openstack/openstack-galera-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.812017 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bcac25a5-0552-488f-b468-ffea7a442115-config-data-default\") pod \"openstack-galera-0\" (UID: \"bcac25a5-0552-488f-b468-ffea7a442115\") " pod="openstack/openstack-galera-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.812547 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcac25a5-0552-488f-b468-ffea7a442115-operator-scripts\") pod \"openstack-galera-0\" (UID: \"bcac25a5-0552-488f-b468-ffea7a442115\") " pod="openstack/openstack-galera-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.823535 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcac25a5-0552-488f-b468-ffea7a442115-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"bcac25a5-0552-488f-b468-ffea7a442115\") " pod="openstack/openstack-galera-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.829132 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcac25a5-0552-488f-b468-ffea7a442115-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"bcac25a5-0552-488f-b468-ffea7a442115\") " pod="openstack/openstack-galera-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.838113 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"bcac25a5-0552-488f-b468-ffea7a442115\") " pod="openstack/openstack-galera-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.855802 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7rzv\" (UniqueName: \"kubernetes.io/projected/bcac25a5-0552-488f-b468-ffea7a442115-kube-api-access-n7rzv\") pod \"openstack-galera-0\" (UID: \"bcac25a5-0552-488f-b468-ffea7a442115\") " pod="openstack/openstack-galera-0" Mar 18 12:32:01 crc kubenswrapper[4975]: I0318 12:32:01.902473 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 12:32:02 crc kubenswrapper[4975]: I0318 12:32:02.805553 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 12:32:02 crc kubenswrapper[4975]: I0318 12:32:02.808222 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:02 crc kubenswrapper[4975]: I0318 12:32:02.810406 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-sr8lg" Mar 18 12:32:02 crc kubenswrapper[4975]: I0318 12:32:02.811182 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 18 12:32:02 crc kubenswrapper[4975]: I0318 12:32:02.812426 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 18 12:32:02 crc kubenswrapper[4975]: I0318 12:32:02.812562 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 18 12:32:02 crc kubenswrapper[4975]: I0318 12:32:02.818766 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 12:32:02 crc kubenswrapper[4975]: I0318 12:32:02.926299 4975 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52cabd28-7357-4e96-b812-637660ce5cd1-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"52cabd28-7357-4e96-b812-637660ce5cd1\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:02 crc kubenswrapper[4975]: I0318 12:32:02.926369 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/52cabd28-7357-4e96-b812-637660ce5cd1-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"52cabd28-7357-4e96-b812-637660ce5cd1\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:02 crc kubenswrapper[4975]: I0318 12:32:02.926397 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/52cabd28-7357-4e96-b812-637660ce5cd1-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"52cabd28-7357-4e96-b812-637660ce5cd1\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:02 crc kubenswrapper[4975]: I0318 12:32:02.926438 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52cabd28-7357-4e96-b812-637660ce5cd1-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"52cabd28-7357-4e96-b812-637660ce5cd1\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:02 crc kubenswrapper[4975]: I0318 12:32:02.926586 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/52cabd28-7357-4e96-b812-637660ce5cd1-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"52cabd28-7357-4e96-b812-637660ce5cd1\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:02 crc 
kubenswrapper[4975]: I0318 12:32:02.926632 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"52cabd28-7357-4e96-b812-637660ce5cd1\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:02 crc kubenswrapper[4975]: I0318 12:32:02.926690 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/52cabd28-7357-4e96-b812-637660ce5cd1-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"52cabd28-7357-4e96-b812-637660ce5cd1\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:02 crc kubenswrapper[4975]: I0318 12:32:02.926774 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mtm2\" (UniqueName: \"kubernetes.io/projected/52cabd28-7357-4e96-b812-637660ce5cd1-kube-api-access-7mtm2\") pod \"openstack-cell1-galera-0\" (UID: \"52cabd28-7357-4e96-b812-637660ce5cd1\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.027590 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/52cabd28-7357-4e96-b812-637660ce5cd1-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"52cabd28-7357-4e96-b812-637660ce5cd1\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.027627 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/52cabd28-7357-4e96-b812-637660ce5cd1-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"52cabd28-7357-4e96-b812-637660ce5cd1\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:03 crc kubenswrapper[4975]: 
I0318 12:32:03.027660 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52cabd28-7357-4e96-b812-637660ce5cd1-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"52cabd28-7357-4e96-b812-637660ce5cd1\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.027700 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/52cabd28-7357-4e96-b812-637660ce5cd1-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"52cabd28-7357-4e96-b812-637660ce5cd1\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.027734 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"52cabd28-7357-4e96-b812-637660ce5cd1\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.027776 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/52cabd28-7357-4e96-b812-637660ce5cd1-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"52cabd28-7357-4e96-b812-637660ce5cd1\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.027825 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mtm2\" (UniqueName: \"kubernetes.io/projected/52cabd28-7357-4e96-b812-637660ce5cd1-kube-api-access-7mtm2\") pod \"openstack-cell1-galera-0\" (UID: \"52cabd28-7357-4e96-b812-637660ce5cd1\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.027851 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52cabd28-7357-4e96-b812-637660ce5cd1-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"52cabd28-7357-4e96-b812-637660ce5cd1\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.028097 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/52cabd28-7357-4e96-b812-637660ce5cd1-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"52cabd28-7357-4e96-b812-637660ce5cd1\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.028178 4975 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"52cabd28-7357-4e96-b812-637660ce5cd1\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.029190 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/52cabd28-7357-4e96-b812-637660ce5cd1-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"52cabd28-7357-4e96-b812-637660ce5cd1\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.029400 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/52cabd28-7357-4e96-b812-637660ce5cd1-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"52cabd28-7357-4e96-b812-637660ce5cd1\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.029923 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/52cabd28-7357-4e96-b812-637660ce5cd1-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"52cabd28-7357-4e96-b812-637660ce5cd1\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.032765 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/52cabd28-7357-4e96-b812-637660ce5cd1-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"52cabd28-7357-4e96-b812-637660ce5cd1\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.041993 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.043370 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.046106 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-2bzsz" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.046238 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52cabd28-7357-4e96-b812-637660ce5cd1-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"52cabd28-7357-4e96-b812-637660ce5cd1\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.046327 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.052087 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.059701 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"52cabd28-7357-4e96-b812-637660ce5cd1\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.066600 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.070702 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mtm2\" (UniqueName: \"kubernetes.io/projected/52cabd28-7357-4e96-b812-637660ce5cd1-kube-api-access-7mtm2\") pod \"openstack-cell1-galera-0\" (UID: \"52cabd28-7357-4e96-b812-637660ce5cd1\") " pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.129171 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56b76c27-2e96-46ad-a648-705b85b28bd6-config-data\") pod \"memcached-0\" (UID: \"56b76c27-2e96-46ad-a648-705b85b28bd6\") " pod="openstack/memcached-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.129234 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6blb4\" (UniqueName: \"kubernetes.io/projected/56b76c27-2e96-46ad-a648-705b85b28bd6-kube-api-access-6blb4\") pod \"memcached-0\" (UID: \"56b76c27-2e96-46ad-a648-705b85b28bd6\") " pod="openstack/memcached-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.129256 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b76c27-2e96-46ad-a648-705b85b28bd6-combined-ca-bundle\") pod \"memcached-0\" (UID: \"56b76c27-2e96-46ad-a648-705b85b28bd6\") " pod="openstack/memcached-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.129318 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/56b76c27-2e96-46ad-a648-705b85b28bd6-kolla-config\") pod \"memcached-0\" (UID: \"56b76c27-2e96-46ad-a648-705b85b28bd6\") " pod="openstack/memcached-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.129342 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b76c27-2e96-46ad-a648-705b85b28bd6-memcached-tls-certs\") pod \"memcached-0\" (UID: \"56b76c27-2e96-46ad-a648-705b85b28bd6\") " pod="openstack/memcached-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.176070 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.231221 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56b76c27-2e96-46ad-a648-705b85b28bd6-config-data\") pod \"memcached-0\" (UID: \"56b76c27-2e96-46ad-a648-705b85b28bd6\") " pod="openstack/memcached-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.231290 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6blb4\" (UniqueName: \"kubernetes.io/projected/56b76c27-2e96-46ad-a648-705b85b28bd6-kube-api-access-6blb4\") pod \"memcached-0\" (UID: \"56b76c27-2e96-46ad-a648-705b85b28bd6\") " pod="openstack/memcached-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.231320 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b76c27-2e96-46ad-a648-705b85b28bd6-combined-ca-bundle\") pod \"memcached-0\" (UID: \"56b76c27-2e96-46ad-a648-705b85b28bd6\") " pod="openstack/memcached-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.231379 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/56b76c27-2e96-46ad-a648-705b85b28bd6-kolla-config\") pod \"memcached-0\" (UID: \"56b76c27-2e96-46ad-a648-705b85b28bd6\") " pod="openstack/memcached-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.231410 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b76c27-2e96-46ad-a648-705b85b28bd6-memcached-tls-certs\") pod \"memcached-0\" (UID: \"56b76c27-2e96-46ad-a648-705b85b28bd6\") " pod="openstack/memcached-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.232829 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56b76c27-2e96-46ad-a648-705b85b28bd6-config-data\") pod \"memcached-0\" (UID: \"56b76c27-2e96-46ad-a648-705b85b28bd6\") " pod="openstack/memcached-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.234385 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/56b76c27-2e96-46ad-a648-705b85b28bd6-kolla-config\") pod \"memcached-0\" (UID: \"56b76c27-2e96-46ad-a648-705b85b28bd6\") " pod="openstack/memcached-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.235375 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b76c27-2e96-46ad-a648-705b85b28bd6-combined-ca-bundle\") pod \"memcached-0\" (UID: \"56b76c27-2e96-46ad-a648-705b85b28bd6\") " pod="openstack/memcached-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.239249 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/56b76c27-2e96-46ad-a648-705b85b28bd6-memcached-tls-certs\") pod \"memcached-0\" (UID: \"56b76c27-2e96-46ad-a648-705b85b28bd6\") " pod="openstack/memcached-0" Mar 18 12:32:03 crc 
kubenswrapper[4975]: I0318 12:32:03.252156 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6blb4\" (UniqueName: \"kubernetes.io/projected/56b76c27-2e96-46ad-a648-705b85b28bd6-kube-api-access-6blb4\") pod \"memcached-0\" (UID: \"56b76c27-2e96-46ad-a648-705b85b28bd6\") " pod="openstack/memcached-0" Mar 18 12:32:03 crc kubenswrapper[4975]: I0318 12:32:03.416052 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 12:32:03 crc kubenswrapper[4975]: W0318 12:32:03.991021 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e52b66d_395e_4753_9c2b_b271c2dedf36.slice/crio-98943a6bf34a1e9e517258d7ab9b3136e549a4f973ad1c2442c28fec96876439 WatchSource:0}: Error finding container 98943a6bf34a1e9e517258d7ab9b3136e549a4f973ad1c2442c28fec96876439: Status 404 returned error can't find the container with id 98943a6bf34a1e9e517258d7ab9b3136e549a4f973ad1c2442c28fec96876439 Mar 18 12:32:04 crc kubenswrapper[4975]: I0318 12:32:04.749436 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jp47c" event={"ID":"2e52b66d-395e-4753-9c2b-b271c2dedf36","Type":"ContainerStarted","Data":"98943a6bf34a1e9e517258d7ab9b3136e549a4f973ad1c2442c28fec96876439"} Mar 18 12:32:05 crc kubenswrapper[4975]: I0318 12:32:05.461044 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 12:32:05 crc kubenswrapper[4975]: I0318 12:32:05.461952 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 12:32:05 crc kubenswrapper[4975]: I0318 12:32:05.463949 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-ht7ch" Mar 18 12:32:05 crc kubenswrapper[4975]: I0318 12:32:05.482612 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 12:32:05 crc kubenswrapper[4975]: I0318 12:32:05.568283 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlrkn\" (UniqueName: \"kubernetes.io/projected/1e74832f-4c39-440a-b1b7-d87c9916fc94-kube-api-access-hlrkn\") pod \"kube-state-metrics-0\" (UID: \"1e74832f-4c39-440a-b1b7-d87c9916fc94\") " pod="openstack/kube-state-metrics-0" Mar 18 12:32:05 crc kubenswrapper[4975]: I0318 12:32:05.670418 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlrkn\" (UniqueName: \"kubernetes.io/projected/1e74832f-4c39-440a-b1b7-d87c9916fc94-kube-api-access-hlrkn\") pod \"kube-state-metrics-0\" (UID: \"1e74832f-4c39-440a-b1b7-d87c9916fc94\") " pod="openstack/kube-state-metrics-0" Mar 18 12:32:05 crc kubenswrapper[4975]: I0318 12:32:05.689517 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlrkn\" (UniqueName: \"kubernetes.io/projected/1e74832f-4c39-440a-b1b7-d87c9916fc94-kube-api-access-hlrkn\") pod \"kube-state-metrics-0\" (UID: \"1e74832f-4c39-440a-b1b7-d87c9916fc94\") " pod="openstack/kube-state-metrics-0" Mar 18 12:32:05 crc kubenswrapper[4975]: I0318 12:32:05.785904 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 12:32:07 crc kubenswrapper[4975]: I0318 12:32:07.577580 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.107214 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-zgkvs"] Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.108585 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zgkvs" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.112359 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zgkvs"] Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.112780 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.114362 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.121592 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-72rqg" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.132221 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-qpqb7"] Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.134447 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-qpqb7" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.155127 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-qpqb7"] Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.212399 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f3553d46-cacf-43e1-886a-44c17ed9a6c5-var-run-ovn\") pod \"ovn-controller-zgkvs\" (UID: \"f3553d46-cacf-43e1-886a-44c17ed9a6c5\") " pod="openstack/ovn-controller-zgkvs" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.212459 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f3553d46-cacf-43e1-886a-44c17ed9a6c5-var-log-ovn\") pod \"ovn-controller-zgkvs\" (UID: \"f3553d46-cacf-43e1-886a-44c17ed9a6c5\") " pod="openstack/ovn-controller-zgkvs" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.212489 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3553d46-cacf-43e1-886a-44c17ed9a6c5-scripts\") pod \"ovn-controller-zgkvs\" (UID: \"f3553d46-cacf-43e1-886a-44c17ed9a6c5\") " pod="openstack/ovn-controller-zgkvs" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.212561 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp7qs\" (UniqueName: \"kubernetes.io/projected/f3553d46-cacf-43e1-886a-44c17ed9a6c5-kube-api-access-sp7qs\") pod \"ovn-controller-zgkvs\" (UID: \"f3553d46-cacf-43e1-886a-44c17ed9a6c5\") " pod="openstack/ovn-controller-zgkvs" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.212701 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f3553d46-cacf-43e1-886a-44c17ed9a6c5-ovn-controller-tls-certs\") pod \"ovn-controller-zgkvs\" (UID: \"f3553d46-cacf-43e1-886a-44c17ed9a6c5\") " pod="openstack/ovn-controller-zgkvs" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.212801 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f3553d46-cacf-43e1-886a-44c17ed9a6c5-var-run\") pod \"ovn-controller-zgkvs\" (UID: \"f3553d46-cacf-43e1-886a-44c17ed9a6c5\") " pod="openstack/ovn-controller-zgkvs" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.212904 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3553d46-cacf-43e1-886a-44c17ed9a6c5-combined-ca-bundle\") pod \"ovn-controller-zgkvs\" (UID: \"f3553d46-cacf-43e1-886a-44c17ed9a6c5\") " pod="openstack/ovn-controller-zgkvs" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.315254 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3553d46-cacf-43e1-886a-44c17ed9a6c5-combined-ca-bundle\") pod \"ovn-controller-zgkvs\" (UID: \"f3553d46-cacf-43e1-886a-44c17ed9a6c5\") " pod="openstack/ovn-controller-zgkvs" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.315985 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df6xn\" (UniqueName: \"kubernetes.io/projected/bc8e272b-99c9-460e-9b72-a031478baf07-kube-api-access-df6xn\") pod \"ovn-controller-ovs-qpqb7\" (UID: \"bc8e272b-99c9-460e-9b72-a031478baf07\") " pod="openstack/ovn-controller-ovs-qpqb7" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.316020 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/bc8e272b-99c9-460e-9b72-a031478baf07-scripts\") pod \"ovn-controller-ovs-qpqb7\" (UID: \"bc8e272b-99c9-460e-9b72-a031478baf07\") " pod="openstack/ovn-controller-ovs-qpqb7" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.316100 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f3553d46-cacf-43e1-886a-44c17ed9a6c5-var-run-ovn\") pod \"ovn-controller-zgkvs\" (UID: \"f3553d46-cacf-43e1-886a-44c17ed9a6c5\") " pod="openstack/ovn-controller-zgkvs" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.316120 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/bc8e272b-99c9-460e-9b72-a031478baf07-etc-ovs\") pod \"ovn-controller-ovs-qpqb7\" (UID: \"bc8e272b-99c9-460e-9b72-a031478baf07\") " pod="openstack/ovn-controller-ovs-qpqb7" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.316149 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f3553d46-cacf-43e1-886a-44c17ed9a6c5-var-log-ovn\") pod \"ovn-controller-zgkvs\" (UID: \"f3553d46-cacf-43e1-886a-44c17ed9a6c5\") " pod="openstack/ovn-controller-zgkvs" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.316169 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3553d46-cacf-43e1-886a-44c17ed9a6c5-scripts\") pod \"ovn-controller-zgkvs\" (UID: \"f3553d46-cacf-43e1-886a-44c17ed9a6c5\") " pod="openstack/ovn-controller-zgkvs" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.316206 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/bc8e272b-99c9-460e-9b72-a031478baf07-var-lib\") pod \"ovn-controller-ovs-qpqb7\" (UID: 
\"bc8e272b-99c9-460e-9b72-a031478baf07\") " pod="openstack/ovn-controller-ovs-qpqb7" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.316248 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bc8e272b-99c9-460e-9b72-a031478baf07-var-run\") pod \"ovn-controller-ovs-qpqb7\" (UID: \"bc8e272b-99c9-460e-9b72-a031478baf07\") " pod="openstack/ovn-controller-ovs-qpqb7" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.316279 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp7qs\" (UniqueName: \"kubernetes.io/projected/f3553d46-cacf-43e1-886a-44c17ed9a6c5-kube-api-access-sp7qs\") pod \"ovn-controller-zgkvs\" (UID: \"f3553d46-cacf-43e1-886a-44c17ed9a6c5\") " pod="openstack/ovn-controller-zgkvs" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.316306 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/bc8e272b-99c9-460e-9b72-a031478baf07-var-log\") pod \"ovn-controller-ovs-qpqb7\" (UID: \"bc8e272b-99c9-460e-9b72-a031478baf07\") " pod="openstack/ovn-controller-ovs-qpqb7" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.316326 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3553d46-cacf-43e1-886a-44c17ed9a6c5-ovn-controller-tls-certs\") pod \"ovn-controller-zgkvs\" (UID: \"f3553d46-cacf-43e1-886a-44c17ed9a6c5\") " pod="openstack/ovn-controller-zgkvs" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.316371 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f3553d46-cacf-43e1-886a-44c17ed9a6c5-var-run\") pod \"ovn-controller-zgkvs\" (UID: \"f3553d46-cacf-43e1-886a-44c17ed9a6c5\") " pod="openstack/ovn-controller-zgkvs" Mar 18 
12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.319095 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f3553d46-cacf-43e1-886a-44c17ed9a6c5-var-run\") pod \"ovn-controller-zgkvs\" (UID: \"f3553d46-cacf-43e1-886a-44c17ed9a6c5\") " pod="openstack/ovn-controller-zgkvs" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.319248 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f3553d46-cacf-43e1-886a-44c17ed9a6c5-var-log-ovn\") pod \"ovn-controller-zgkvs\" (UID: \"f3553d46-cacf-43e1-886a-44c17ed9a6c5\") " pod="openstack/ovn-controller-zgkvs" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.319412 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f3553d46-cacf-43e1-886a-44c17ed9a6c5-var-run-ovn\") pod \"ovn-controller-zgkvs\" (UID: \"f3553d46-cacf-43e1-886a-44c17ed9a6c5\") " pod="openstack/ovn-controller-zgkvs" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.323100 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3553d46-cacf-43e1-886a-44c17ed9a6c5-scripts\") pod \"ovn-controller-zgkvs\" (UID: \"f3553d46-cacf-43e1-886a-44c17ed9a6c5\") " pod="openstack/ovn-controller-zgkvs" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.324808 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3553d46-cacf-43e1-886a-44c17ed9a6c5-ovn-controller-tls-certs\") pod \"ovn-controller-zgkvs\" (UID: \"f3553d46-cacf-43e1-886a-44c17ed9a6c5\") " pod="openstack/ovn-controller-zgkvs" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.339425 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f3553d46-cacf-43e1-886a-44c17ed9a6c5-combined-ca-bundle\") pod \"ovn-controller-zgkvs\" (UID: \"f3553d46-cacf-43e1-886a-44c17ed9a6c5\") " pod="openstack/ovn-controller-zgkvs" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.346819 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp7qs\" (UniqueName: \"kubernetes.io/projected/f3553d46-cacf-43e1-886a-44c17ed9a6c5-kube-api-access-sp7qs\") pod \"ovn-controller-zgkvs\" (UID: \"f3553d46-cacf-43e1-886a-44c17ed9a6c5\") " pod="openstack/ovn-controller-zgkvs" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.417532 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df6xn\" (UniqueName: \"kubernetes.io/projected/bc8e272b-99c9-460e-9b72-a031478baf07-kube-api-access-df6xn\") pod \"ovn-controller-ovs-qpqb7\" (UID: \"bc8e272b-99c9-460e-9b72-a031478baf07\") " pod="openstack/ovn-controller-ovs-qpqb7" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.417597 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bc8e272b-99c9-460e-9b72-a031478baf07-scripts\") pod \"ovn-controller-ovs-qpqb7\" (UID: \"bc8e272b-99c9-460e-9b72-a031478baf07\") " pod="openstack/ovn-controller-ovs-qpqb7" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.417640 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/bc8e272b-99c9-460e-9b72-a031478baf07-etc-ovs\") pod \"ovn-controller-ovs-qpqb7\" (UID: \"bc8e272b-99c9-460e-9b72-a031478baf07\") " pod="openstack/ovn-controller-ovs-qpqb7" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.417684 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/bc8e272b-99c9-460e-9b72-a031478baf07-var-lib\") pod \"ovn-controller-ovs-qpqb7\" (UID: 
\"bc8e272b-99c9-460e-9b72-a031478baf07\") " pod="openstack/ovn-controller-ovs-qpqb7" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.417721 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bc8e272b-99c9-460e-9b72-a031478baf07-var-run\") pod \"ovn-controller-ovs-qpqb7\" (UID: \"bc8e272b-99c9-460e-9b72-a031478baf07\") " pod="openstack/ovn-controller-ovs-qpqb7" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.417757 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/bc8e272b-99c9-460e-9b72-a031478baf07-var-log\") pod \"ovn-controller-ovs-qpqb7\" (UID: \"bc8e272b-99c9-460e-9b72-a031478baf07\") " pod="openstack/ovn-controller-ovs-qpqb7" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.418463 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bc8e272b-99c9-460e-9b72-a031478baf07-var-run\") pod \"ovn-controller-ovs-qpqb7\" (UID: \"bc8e272b-99c9-460e-9b72-a031478baf07\") " pod="openstack/ovn-controller-ovs-qpqb7" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.418535 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/bc8e272b-99c9-460e-9b72-a031478baf07-var-log\") pod \"ovn-controller-ovs-qpqb7\" (UID: \"bc8e272b-99c9-460e-9b72-a031478baf07\") " pod="openstack/ovn-controller-ovs-qpqb7" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.418611 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/bc8e272b-99c9-460e-9b72-a031478baf07-var-lib\") pod \"ovn-controller-ovs-qpqb7\" (UID: \"bc8e272b-99c9-460e-9b72-a031478baf07\") " pod="openstack/ovn-controller-ovs-qpqb7" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.418840 4975 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/bc8e272b-99c9-460e-9b72-a031478baf07-etc-ovs\") pod \"ovn-controller-ovs-qpqb7\" (UID: \"bc8e272b-99c9-460e-9b72-a031478baf07\") " pod="openstack/ovn-controller-ovs-qpqb7" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.422499 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bc8e272b-99c9-460e-9b72-a031478baf07-scripts\") pod \"ovn-controller-ovs-qpqb7\" (UID: \"bc8e272b-99c9-460e-9b72-a031478baf07\") " pod="openstack/ovn-controller-ovs-qpqb7" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.438270 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df6xn\" (UniqueName: \"kubernetes.io/projected/bc8e272b-99c9-460e-9b72-a031478baf07-kube-api-access-df6xn\") pod \"ovn-controller-ovs-qpqb7\" (UID: \"bc8e272b-99c9-460e-9b72-a031478baf07\") " pod="openstack/ovn-controller-ovs-qpqb7" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.456744 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zgkvs" Mar 18 12:32:08 crc kubenswrapper[4975]: I0318 12:32:08.473295 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-qpqb7" Mar 18 12:32:09 crc kubenswrapper[4975]: I0318 12:32:09.952994 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 12:32:09 crc kubenswrapper[4975]: I0318 12:32:09.955299 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:09 crc kubenswrapper[4975]: I0318 12:32:09.960071 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 18 12:32:09 crc kubenswrapper[4975]: I0318 12:32:09.960446 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 18 12:32:09 crc kubenswrapper[4975]: I0318 12:32:09.960586 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-jv2gb" Mar 18 12:32:09 crc kubenswrapper[4975]: I0318 12:32:09.960857 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 18 12:32:09 crc kubenswrapper[4975]: I0318 12:32:09.960985 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.011219 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.059771 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f6a63f7-645b-44df-aea0-787e5596aecc-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9f6a63f7-645b-44df-aea0-787e5596aecc\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.059840 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f6a63f7-645b-44df-aea0-787e5596aecc-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9f6a63f7-645b-44df-aea0-787e5596aecc\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.059988 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-ndvf7\" (UniqueName: \"kubernetes.io/projected/9f6a63f7-645b-44df-aea0-787e5596aecc-kube-api-access-ndvf7\") pod \"ovsdbserver-nb-0\" (UID: \"9f6a63f7-645b-44df-aea0-787e5596aecc\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.060044 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9f6a63f7-645b-44df-aea0-787e5596aecc\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.060130 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f6a63f7-645b-44df-aea0-787e5596aecc-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9f6a63f7-645b-44df-aea0-787e5596aecc\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.060180 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f6a63f7-645b-44df-aea0-787e5596aecc-config\") pod \"ovsdbserver-nb-0\" (UID: \"9f6a63f7-645b-44df-aea0-787e5596aecc\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.060208 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f6a63f7-645b-44df-aea0-787e5596aecc-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9f6a63f7-645b-44df-aea0-787e5596aecc\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.060238 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/9f6a63f7-645b-44df-aea0-787e5596aecc-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9f6a63f7-645b-44df-aea0-787e5596aecc\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.161535 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndvf7\" (UniqueName: \"kubernetes.io/projected/9f6a63f7-645b-44df-aea0-787e5596aecc-kube-api-access-ndvf7\") pod \"ovsdbserver-nb-0\" (UID: \"9f6a63f7-645b-44df-aea0-787e5596aecc\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.161584 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9f6a63f7-645b-44df-aea0-787e5596aecc\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.161611 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f6a63f7-645b-44df-aea0-787e5596aecc-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9f6a63f7-645b-44df-aea0-787e5596aecc\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.161640 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f6a63f7-645b-44df-aea0-787e5596aecc-config\") pod \"ovsdbserver-nb-0\" (UID: \"9f6a63f7-645b-44df-aea0-787e5596aecc\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.161658 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f6a63f7-645b-44df-aea0-787e5596aecc-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9f6a63f7-645b-44df-aea0-787e5596aecc\") " 
pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.161677 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9f6a63f7-645b-44df-aea0-787e5596aecc-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9f6a63f7-645b-44df-aea0-787e5596aecc\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.161721 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f6a63f7-645b-44df-aea0-787e5596aecc-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9f6a63f7-645b-44df-aea0-787e5596aecc\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.161754 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f6a63f7-645b-44df-aea0-787e5596aecc-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9f6a63f7-645b-44df-aea0-787e5596aecc\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.161912 4975 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9f6a63f7-645b-44df-aea0-787e5596aecc\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.164679 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9f6a63f7-645b-44df-aea0-787e5596aecc-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9f6a63f7-645b-44df-aea0-787e5596aecc\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.165546 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/9f6a63f7-645b-44df-aea0-787e5596aecc-config\") pod \"ovsdbserver-nb-0\" (UID: \"9f6a63f7-645b-44df-aea0-787e5596aecc\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.165587 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f6a63f7-645b-44df-aea0-787e5596aecc-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9f6a63f7-645b-44df-aea0-787e5596aecc\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.167877 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f6a63f7-645b-44df-aea0-787e5596aecc-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9f6a63f7-645b-44df-aea0-787e5596aecc\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.171702 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f6a63f7-645b-44df-aea0-787e5596aecc-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9f6a63f7-645b-44df-aea0-787e5596aecc\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.172596 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f6a63f7-645b-44df-aea0-787e5596aecc-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9f6a63f7-645b-44df-aea0-787e5596aecc\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.177472 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndvf7\" (UniqueName: \"kubernetes.io/projected/9f6a63f7-645b-44df-aea0-787e5596aecc-kube-api-access-ndvf7\") pod \"ovsdbserver-nb-0\" (UID: \"9f6a63f7-645b-44df-aea0-787e5596aecc\") " 
pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.186573 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9f6a63f7-645b-44df-aea0-787e5596aecc\") " pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:10 crc kubenswrapper[4975]: I0318 12:32:10.347685 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.335580 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.337701 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.344103 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.345178 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-k7m7b" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.345292 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.345412 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.345568 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.408295 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.408361 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvqsl\" (UniqueName: \"kubernetes.io/projected/dd63df8b-b3ee-49dc-b36b-157bb71ac6d5-kube-api-access-tvqsl\") pod \"ovsdbserver-sb-0\" (UID: \"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.408385 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd63df8b-b3ee-49dc-b36b-157bb71ac6d5-config\") pod \"ovsdbserver-sb-0\" (UID: \"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.408408 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd63df8b-b3ee-49dc-b36b-157bb71ac6d5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.408429 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd63df8b-b3ee-49dc-b36b-157bb71ac6d5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.408444 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd63df8b-b3ee-49dc-b36b-157bb71ac6d5-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5\") " 
pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.408521 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd63df8b-b3ee-49dc-b36b-157bb71ac6d5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.408541 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dd63df8b-b3ee-49dc-b36b-157bb71ac6d5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.510279 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvqsl\" (UniqueName: \"kubernetes.io/projected/dd63df8b-b3ee-49dc-b36b-157bb71ac6d5-kube-api-access-tvqsl\") pod \"ovsdbserver-sb-0\" (UID: \"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.510334 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd63df8b-b3ee-49dc-b36b-157bb71ac6d5-config\") pod \"ovsdbserver-sb-0\" (UID: \"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.510365 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd63df8b-b3ee-49dc-b36b-157bb71ac6d5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.510393 4975 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd63df8b-b3ee-49dc-b36b-157bb71ac6d5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.510997 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd63df8b-b3ee-49dc-b36b-157bb71ac6d5-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.511097 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd63df8b-b3ee-49dc-b36b-157bb71ac6d5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.511120 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dd63df8b-b3ee-49dc-b36b-157bb71ac6d5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.511183 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.513861 4975 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod 
\"ovsdbserver-sb-0\" (UID: \"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.515670 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dd63df8b-b3ee-49dc-b36b-157bb71ac6d5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.515955 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd63df8b-b3ee-49dc-b36b-157bb71ac6d5-config\") pod \"ovsdbserver-sb-0\" (UID: \"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.516114 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd63df8b-b3ee-49dc-b36b-157bb71ac6d5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.522190 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd63df8b-b3ee-49dc-b36b-157bb71ac6d5-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.532299 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd63df8b-b3ee-49dc-b36b-157bb71ac6d5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.532728 4975 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd63df8b-b3ee-49dc-b36b-157bb71ac6d5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.536189 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvqsl\" (UniqueName: \"kubernetes.io/projected/dd63df8b-b3ee-49dc-b36b-157bb71ac6d5-kube-api-access-tvqsl\") pod \"ovsdbserver-sb-0\" (UID: \"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.539209 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5\") " pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:12 crc kubenswrapper[4975]: I0318 12:32:12.657252 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:14 crc kubenswrapper[4975]: W0318 12:32:14.133022 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda68f98b5_0226_4b20_a767_ead5e0af066e.slice/crio-8ee5cab4f90f50e8148cb16309270981684ed5a8a7d49698c77cf7a6b2fc321a WatchSource:0}: Error finding container 8ee5cab4f90f50e8148cb16309270981684ed5a8a7d49698c77cf7a6b2fc321a: Status 404 returned error can't find the container with id 8ee5cab4f90f50e8148cb16309270981684ed5a8a7d49698c77cf7a6b2fc321a Mar 18 12:32:14 crc kubenswrapper[4975]: I0318 12:32:14.538294 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563952-vwzs4"] Mar 18 12:32:14 crc kubenswrapper[4975]: I0318 12:32:14.835986 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a68f98b5-0226-4b20-a767-ead5e0af066e","Type":"ContainerStarted","Data":"8ee5cab4f90f50e8148cb16309270981684ed5a8a7d49698c77cf7a6b2fc321a"} Mar 18 12:32:15 crc kubenswrapper[4975]: E0318 12:32:15.068805 4975 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 18 12:32:15 crc kubenswrapper[4975]: E0318 12:32:15.069050 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hrdzs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-dm8xw_openstack(e0c86030-ed6d-4757-90ab-0716d93d4513): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:32:15 crc kubenswrapper[4975]: E0318 12:32:15.070278 4975 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-dm8xw" podUID="e0c86030-ed6d-4757-90ab-0716d93d4513" Mar 18 12:32:15 crc kubenswrapper[4975]: E0318 12:32:15.202156 4975 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 18 12:32:15 crc kubenswrapper[4975]: E0318 12:32:15.202768 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6vgmp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-km58k_openstack(cea02ea0-10d7-4b65-8fe5-64efd8c625c9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:32:15 crc kubenswrapper[4975]: E0318 12:32:15.203934 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-km58k" podUID="cea02ea0-10d7-4b65-8fe5-64efd8c625c9" Mar 18 12:32:15 crc kubenswrapper[4975]: I0318 12:32:15.599193 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 12:32:15 crc kubenswrapper[4975]: I0318 12:32:15.637562 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 12:32:15 crc kubenswrapper[4975]: I0318 12:32:15.649824 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 12:32:15 crc kubenswrapper[4975]: I0318 12:32:15.666313 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 12:32:15 crc kubenswrapper[4975]: W0318 12:32:15.667032 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcac25a5_0552_488f_b468_ffea7a442115.slice/crio-367828d6f753e052d5c92082c3c1f4fa42d4db65c39acb33b5f16e95cfbb3a09 WatchSource:0}: Error finding container 367828d6f753e052d5c92082c3c1f4fa42d4db65c39acb33b5f16e95cfbb3a09: Status 404 returned error can't find the container with id 367828d6f753e052d5c92082c3c1f4fa42d4db65c39acb33b5f16e95cfbb3a09 Mar 18 12:32:15 crc kubenswrapper[4975]: I0318 12:32:15.865565 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 12:32:15 crc kubenswrapper[4975]: I0318 12:32:15.876397 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zgkvs"] Mar 18 12:32:15 crc kubenswrapper[4975]: I0318 12:32:15.880419 4975 generic.go:334] "Generic (PLEG): container finished" podID="490b6ddf-67eb-4b14-bce3-c07798f6b03a" containerID="0b32e2615590f4f17db637d217e7b6fcce20aacab2abe286a54b55bcacf10aff" exitCode=0 Mar 18 12:32:15 crc kubenswrapper[4975]: I0318 12:32:15.880504 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-xmlfk" 
event={"ID":"490b6ddf-67eb-4b14-bce3-c07798f6b03a","Type":"ContainerDied","Data":"0b32e2615590f4f17db637d217e7b6fcce20aacab2abe286a54b55bcacf10aff"} Mar 18 12:32:15 crc kubenswrapper[4975]: I0318 12:32:15.882749 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563952-vwzs4" event={"ID":"23a3a6b1-d7fd-4d05-a075-19aa9585b87d","Type":"ContainerStarted","Data":"eb58713a97cf0f5486ea88e448b595a2a1a3e0970224a18135c5774685f4b9ea"} Mar 18 12:32:15 crc kubenswrapper[4975]: I0318 12:32:15.884515 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"52cabd28-7357-4e96-b812-637660ce5cd1","Type":"ContainerStarted","Data":"b775a92982ad37055d1155641d195055b4c4a5c0d5384e10a50c7cf7c623c89b"} Mar 18 12:32:15 crc kubenswrapper[4975]: W0318 12:32:15.884962 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3553d46_cacf_43e1_886a_44c17ed9a6c5.slice/crio-2b12efe9041fd03df645c1cc4bd689f631b645304f9dab92e2f0de7bf74004e2 WatchSource:0}: Error finding container 2b12efe9041fd03df645c1cc4bd689f631b645304f9dab92e2f0de7bf74004e2: Status 404 returned error can't find the container with id 2b12efe9041fd03df645c1cc4bd689f631b645304f9dab92e2f0de7bf74004e2 Mar 18 12:32:15 crc kubenswrapper[4975]: I0318 12:32:15.885748 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"56b76c27-2e96-46ad-a648-705b85b28bd6","Type":"ContainerStarted","Data":"1a31d472f8fe42720b9d65f922c346b53f80f7a10c8e5cbbdaf8f20d458f0bec"} Mar 18 12:32:15 crc kubenswrapper[4975]: I0318 12:32:15.889733 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"bcac25a5-0552-488f-b468-ffea7a442115","Type":"ContainerStarted","Data":"367828d6f753e052d5c92082c3c1f4fa42d4db65c39acb33b5f16e95cfbb3a09"} Mar 18 12:32:15 crc kubenswrapper[4975]: I0318 12:32:15.891204 4975 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1e74832f-4c39-440a-b1b7-d87c9916fc94","Type":"ContainerStarted","Data":"3d7975d9fbcc401f22a0d0e8599d4c1696617cca354b768070ff49ec880caaf1"} Mar 18 12:32:15 crc kubenswrapper[4975]: I0318 12:32:15.894351 4975 generic.go:334] "Generic (PLEG): container finished" podID="2e52b66d-395e-4753-9c2b-b271c2dedf36" containerID="bcf917c68bbda5cfa57843e462a3059dc2e5571b4c1df2edad7eea6f95e6fb34" exitCode=0 Mar 18 12:32:15 crc kubenswrapper[4975]: I0318 12:32:15.895291 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jp47c" event={"ID":"2e52b66d-395e-4753-9c2b-b271c2dedf36","Type":"ContainerDied","Data":"bcf917c68bbda5cfa57843e462a3059dc2e5571b4c1df2edad7eea6f95e6fb34"} Mar 18 12:32:15 crc kubenswrapper[4975]: I0318 12:32:15.992683 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-qpqb7"] Mar 18 12:32:16 crc kubenswrapper[4975]: W0318 12:32:16.012977 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc8e272b_99c9_460e_9b72_a031478baf07.slice/crio-f322dc9f9b8678c09dab993fa53352fcf0d13759a47d51ffec9bccd735692587 WatchSource:0}: Error finding container f322dc9f9b8678c09dab993fa53352fcf0d13759a47d51ffec9bccd735692587: Status 404 returned error can't find the container with id f322dc9f9b8678c09dab993fa53352fcf0d13759a47d51ffec9bccd735692587 Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.114977 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 12:32:16 crc kubenswrapper[4975]: E0318 12:32:16.137597 4975 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 18 12:32:16 crc kubenswrapper[4975]: rpc error: code = Unknown desc = container create failed: mount 
`/var/lib/kubelet/pods/490b6ddf-67eb-4b14-bce3-c07798f6b03a/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 18 12:32:16 crc kubenswrapper[4975]: > podSandboxID="f062420ec67dbf9958a699aea1597b7116418333d969725d004cc7332e6322c8" Mar 18 12:32:16 crc kubenswrapper[4975]: E0318 12:32:16.146251 4975 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 12:32:16 crc kubenswrapper[4975]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ws99j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-xmlfk_openstack(490b6ddf-67eb-4b14-bce3-c07798f6b03a): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/490b6ddf-67eb-4b14-bce3-c07798f6b03a/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 18 12:32:16 crc kubenswrapper[4975]: > logger="UnhandledError" Mar 18 12:32:16 crc kubenswrapper[4975]: E0318 12:32:16.147561 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/490b6ddf-67eb-4b14-bce3-c07798f6b03a/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5ccc8479f9-xmlfk" podUID="490b6ddf-67eb-4b14-bce3-c07798f6b03a" Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.412438 4975 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-dm8xw" Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.425807 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-km58k" Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.509708 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrdzs\" (UniqueName: \"kubernetes.io/projected/e0c86030-ed6d-4757-90ab-0716d93d4513-kube-api-access-hrdzs\") pod \"e0c86030-ed6d-4757-90ab-0716d93d4513\" (UID: \"e0c86030-ed6d-4757-90ab-0716d93d4513\") " Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.509807 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0c86030-ed6d-4757-90ab-0716d93d4513-dns-svc\") pod \"e0c86030-ed6d-4757-90ab-0716d93d4513\" (UID: \"e0c86030-ed6d-4757-90ab-0716d93d4513\") " Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.509928 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0c86030-ed6d-4757-90ab-0716d93d4513-config\") pod \"e0c86030-ed6d-4757-90ab-0716d93d4513\" (UID: \"e0c86030-ed6d-4757-90ab-0716d93d4513\") " Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.510427 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0c86030-ed6d-4757-90ab-0716d93d4513-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e0c86030-ed6d-4757-90ab-0716d93d4513" (UID: "e0c86030-ed6d-4757-90ab-0716d93d4513"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.510603 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0c86030-ed6d-4757-90ab-0716d93d4513-config" (OuterVolumeSpecName: "config") pod "e0c86030-ed6d-4757-90ab-0716d93d4513" (UID: "e0c86030-ed6d-4757-90ab-0716d93d4513"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.510695 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vgmp\" (UniqueName: \"kubernetes.io/projected/cea02ea0-10d7-4b65-8fe5-64efd8c625c9-kube-api-access-6vgmp\") pod \"cea02ea0-10d7-4b65-8fe5-64efd8c625c9\" (UID: \"cea02ea0-10d7-4b65-8fe5-64efd8c625c9\") " Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.510801 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cea02ea0-10d7-4b65-8fe5-64efd8c625c9-config\") pod \"cea02ea0-10d7-4b65-8fe5-64efd8c625c9\" (UID: \"cea02ea0-10d7-4b65-8fe5-64efd8c625c9\") " Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.511514 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cea02ea0-10d7-4b65-8fe5-64efd8c625c9-config" (OuterVolumeSpecName: "config") pod "cea02ea0-10d7-4b65-8fe5-64efd8c625c9" (UID: "cea02ea0-10d7-4b65-8fe5-64efd8c625c9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.512339 4975 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0c86030-ed6d-4757-90ab-0716d93d4513-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.512367 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0c86030-ed6d-4757-90ab-0716d93d4513-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.512381 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cea02ea0-10d7-4b65-8fe5-64efd8c625c9-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.514982 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cea02ea0-10d7-4b65-8fe5-64efd8c625c9-kube-api-access-6vgmp" (OuterVolumeSpecName: "kube-api-access-6vgmp") pod "cea02ea0-10d7-4b65-8fe5-64efd8c625c9" (UID: "cea02ea0-10d7-4b65-8fe5-64efd8c625c9"). InnerVolumeSpecName "kube-api-access-6vgmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.515648 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0c86030-ed6d-4757-90ab-0716d93d4513-kube-api-access-hrdzs" (OuterVolumeSpecName: "kube-api-access-hrdzs") pod "e0c86030-ed6d-4757-90ab-0716d93d4513" (UID: "e0c86030-ed6d-4757-90ab-0716d93d4513"). InnerVolumeSpecName "kube-api-access-hrdzs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.614270 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrdzs\" (UniqueName: \"kubernetes.io/projected/e0c86030-ed6d-4757-90ab-0716d93d4513-kube-api-access-hrdzs\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.614306 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vgmp\" (UniqueName: \"kubernetes.io/projected/cea02ea0-10d7-4b65-8fe5-64efd8c625c9-kube-api-access-6vgmp\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.703398 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 12:32:16 crc kubenswrapper[4975]: W0318 12:32:16.833223 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd63df8b_b3ee_49dc_b36b_157bb71ac6d5.slice/crio-5ec93018b661a344d6c2d250a80282395d8c1bc8ab9b78e1c91c1b10de70fac9 WatchSource:0}: Error finding container 5ec93018b661a344d6c2d250a80282395d8c1bc8ab9b78e1c91c1b10de70fac9: Status 404 returned error can't find the container with id 5ec93018b661a344d6c2d250a80282395d8c1bc8ab9b78e1c91c1b10de70fac9 Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.907627 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-km58k" event={"ID":"cea02ea0-10d7-4b65-8fe5-64efd8c625c9","Type":"ContainerDied","Data":"712966b1b6f9235ac5504851d9c0683c15a0adbd4a8ed3382d6b888c25fe7b5d"} Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.907707 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-km58k" Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.910102 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5","Type":"ContainerStarted","Data":"5ec93018b661a344d6c2d250a80282395d8c1bc8ab9b78e1c91c1b10de70fac9"} Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.912221 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jp47c" event={"ID":"2e52b66d-395e-4753-9c2b-b271c2dedf36","Type":"ContainerStarted","Data":"e3b35411eef529a5c14514cc07f0f7048c30c4a62a7ee80164fa29051cf3639d"} Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.912899 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-jp47c" Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.913784 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9f6a63f7-645b-44df-aea0-787e5596aecc","Type":"ContainerStarted","Data":"20fb4c53f1e229dbb17d2f787d35843bf767aee1f59a57ba30d24913dd3eb563"} Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.914918 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qpqb7" event={"ID":"bc8e272b-99c9-460e-9b72-a031478baf07","Type":"ContainerStarted","Data":"f322dc9f9b8678c09dab993fa53352fcf0d13759a47d51ffec9bccd735692587"} Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.917714 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-dm8xw" event={"ID":"e0c86030-ed6d-4757-90ab-0716d93d4513","Type":"ContainerDied","Data":"c8966b5c0726f5f199f2656508cc67fcf7b9478f295688c74981891e4b17570a"} Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.917776 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-dm8xw" Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.931402 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5","Type":"ContainerStarted","Data":"5e5f7b4ae1b56fb6fd09f9d532cbfe95e79fa5c50ac23f84a724403ed2e0d0fe"} Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.932636 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zgkvs" event={"ID":"f3553d46-cacf-43e1-886a-44c17ed9a6c5","Type":"ContainerStarted","Data":"2b12efe9041fd03df645c1cc4bd689f631b645304f9dab92e2f0de7bf74004e2"} Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.936489 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-jp47c" podStartSLOduration=6.708904812 podStartE2EDuration="17.936466791s" podCreationTimestamp="2026-03-18 12:31:59 +0000 UTC" firstStartedPulling="2026-03-18 12:32:03.992447663 +0000 UTC m=+1309.706848242" lastFinishedPulling="2026-03-18 12:32:15.220009642 +0000 UTC m=+1320.934410221" observedRunningTime="2026-03-18 12:32:16.930444056 +0000 UTC m=+1322.644844635" watchObservedRunningTime="2026-03-18 12:32:16.936466791 +0000 UTC m=+1322.650867370" Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.947381 4975 generic.go:334] "Generic (PLEG): container finished" podID="23a3a6b1-d7fd-4d05-a075-19aa9585b87d" containerID="7092497515fb9f1f8f2eacfb88960fe8f9b45454769a5e47d8591f9e8531e8f6" exitCode=0 Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.947482 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563952-vwzs4" event={"ID":"23a3a6b1-d7fd-4d05-a075-19aa9585b87d","Type":"ContainerDied","Data":"7092497515fb9f1f8f2eacfb88960fe8f9b45454769a5e47d8591f9e8531e8f6"} Mar 18 12:32:16 crc kubenswrapper[4975]: I0318 12:32:16.975318 4975 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-km58k"] Mar 18 12:32:17 crc kubenswrapper[4975]: I0318 12:32:17.000680 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-km58k"] Mar 18 12:32:17 crc kubenswrapper[4975]: I0318 12:32:17.072485 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cea02ea0-10d7-4b65-8fe5-64efd8c625c9" path="/var/lib/kubelet/pods/cea02ea0-10d7-4b65-8fe5-64efd8c625c9/volumes" Mar 18 12:32:17 crc kubenswrapper[4975]: I0318 12:32:17.183025 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-dm8xw"] Mar 18 12:32:17 crc kubenswrapper[4975]: I0318 12:32:17.205854 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-dm8xw"] Mar 18 12:32:19 crc kubenswrapper[4975]: I0318 12:32:19.026491 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0c86030-ed6d-4757-90ab-0716d93d4513" path="/var/lib/kubelet/pods/e0c86030-ed6d-4757-90ab-0716d93d4513/volumes" Mar 18 12:32:19 crc kubenswrapper[4975]: I0318 12:32:19.590594 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563952-vwzs4" Mar 18 12:32:19 crc kubenswrapper[4975]: I0318 12:32:19.701052 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xl66\" (UniqueName: \"kubernetes.io/projected/23a3a6b1-d7fd-4d05-a075-19aa9585b87d-kube-api-access-7xl66\") pod \"23a3a6b1-d7fd-4d05-a075-19aa9585b87d\" (UID: \"23a3a6b1-d7fd-4d05-a075-19aa9585b87d\") " Mar 18 12:32:19 crc kubenswrapper[4975]: I0318 12:32:19.706483 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23a3a6b1-d7fd-4d05-a075-19aa9585b87d-kube-api-access-7xl66" (OuterVolumeSpecName: "kube-api-access-7xl66") pod "23a3a6b1-d7fd-4d05-a075-19aa9585b87d" (UID: "23a3a6b1-d7fd-4d05-a075-19aa9585b87d"). 
InnerVolumeSpecName "kube-api-access-7xl66". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:19 crc kubenswrapper[4975]: I0318 12:32:19.802918 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xl66\" (UniqueName: \"kubernetes.io/projected/23a3a6b1-d7fd-4d05-a075-19aa9585b87d-kube-api-access-7xl66\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:19 crc kubenswrapper[4975]: I0318 12:32:19.970254 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563952-vwzs4" event={"ID":"23a3a6b1-d7fd-4d05-a075-19aa9585b87d","Type":"ContainerDied","Data":"eb58713a97cf0f5486ea88e448b595a2a1a3e0970224a18135c5774685f4b9ea"} Mar 18 12:32:19 crc kubenswrapper[4975]: I0318 12:32:19.970302 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb58713a97cf0f5486ea88e448b595a2a1a3e0970224a18135c5774685f4b9ea" Mar 18 12:32:19 crc kubenswrapper[4975]: I0318 12:32:19.970364 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563952-vwzs4" Mar 18 12:32:20 crc kubenswrapper[4975]: I0318 12:32:20.880639 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563946-p6cjf"] Mar 18 12:32:20 crc kubenswrapper[4975]: I0318 12:32:20.918184 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563946-p6cjf"] Mar 18 12:32:21 crc kubenswrapper[4975]: I0318 12:32:21.025993 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20335645-e96e-4255-b3c9-1f7da0921398" path="/var/lib/kubelet/pods/20335645-e96e-4255-b3c9-1f7da0921398/volumes" Mar 18 12:32:24 crc kubenswrapper[4975]: I0318 12:32:24.637150 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-jp47c" Mar 18 12:32:24 crc kubenswrapper[4975]: I0318 12:32:24.701124 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-xmlfk"] Mar 18 12:32:25 crc kubenswrapper[4975]: I0318 12:32:25.538735 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:32:25 crc kubenswrapper[4975]: I0318 12:32:25.539110 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:32:27 crc kubenswrapper[4975]: I0318 12:32:27.028409 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-xmlfk" podUID="490b6ddf-67eb-4b14-bce3-c07798f6b03a" 
containerName="dnsmasq-dns" containerID="cri-o://5c9df58bc49e7ac0c069a78fa7e45dea3374781954110622bf8d3cb7c914b966" gracePeriod=10 Mar 18 12:32:27 crc kubenswrapper[4975]: I0318 12:32:27.031097 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-xmlfk" Mar 18 12:32:27 crc kubenswrapper[4975]: I0318 12:32:27.031211 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-xmlfk" event={"ID":"490b6ddf-67eb-4b14-bce3-c07798f6b03a","Type":"ContainerStarted","Data":"5c9df58bc49e7ac0c069a78fa7e45dea3374781954110622bf8d3cb7c914b966"} Mar 18 12:32:27 crc kubenswrapper[4975]: I0318 12:32:27.059443 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-xmlfk" podStartSLOduration=13.486305813 podStartE2EDuration="29.059419267s" podCreationTimestamp="2026-03-18 12:31:58 +0000 UTC" firstStartedPulling="2026-03-18 12:31:59.673714542 +0000 UTC m=+1305.388115121" lastFinishedPulling="2026-03-18 12:32:15.246827996 +0000 UTC m=+1320.961228575" observedRunningTime="2026-03-18 12:32:27.051185652 +0000 UTC m=+1332.765586251" watchObservedRunningTime="2026-03-18 12:32:27.059419267 +0000 UTC m=+1332.773819866" Mar 18 12:32:27 crc kubenswrapper[4975]: I0318 12:32:27.431708 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-xmlfk" Mar 18 12:32:27 crc kubenswrapper[4975]: I0318 12:32:27.537459 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/490b6ddf-67eb-4b14-bce3-c07798f6b03a-dns-svc\") pod \"490b6ddf-67eb-4b14-bce3-c07798f6b03a\" (UID: \"490b6ddf-67eb-4b14-bce3-c07798f6b03a\") " Mar 18 12:32:27 crc kubenswrapper[4975]: I0318 12:32:27.538163 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/490b6ddf-67eb-4b14-bce3-c07798f6b03a-config\") pod \"490b6ddf-67eb-4b14-bce3-c07798f6b03a\" (UID: \"490b6ddf-67eb-4b14-bce3-c07798f6b03a\") " Mar 18 12:32:27 crc kubenswrapper[4975]: I0318 12:32:27.538266 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws99j\" (UniqueName: \"kubernetes.io/projected/490b6ddf-67eb-4b14-bce3-c07798f6b03a-kube-api-access-ws99j\") pod \"490b6ddf-67eb-4b14-bce3-c07798f6b03a\" (UID: \"490b6ddf-67eb-4b14-bce3-c07798f6b03a\") " Mar 18 12:32:27 crc kubenswrapper[4975]: I0318 12:32:27.561897 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/490b6ddf-67eb-4b14-bce3-c07798f6b03a-kube-api-access-ws99j" (OuterVolumeSpecName: "kube-api-access-ws99j") pod "490b6ddf-67eb-4b14-bce3-c07798f6b03a" (UID: "490b6ddf-67eb-4b14-bce3-c07798f6b03a"). InnerVolumeSpecName "kube-api-access-ws99j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:27 crc kubenswrapper[4975]: I0318 12:32:27.639829 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws99j\" (UniqueName: \"kubernetes.io/projected/490b6ddf-67eb-4b14-bce3-c07798f6b03a-kube-api-access-ws99j\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:27 crc kubenswrapper[4975]: I0318 12:32:27.979617 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/490b6ddf-67eb-4b14-bce3-c07798f6b03a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "490b6ddf-67eb-4b14-bce3-c07798f6b03a" (UID: "490b6ddf-67eb-4b14-bce3-c07798f6b03a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:27 crc kubenswrapper[4975]: I0318 12:32:27.995052 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/490b6ddf-67eb-4b14-bce3-c07798f6b03a-config" (OuterVolumeSpecName: "config") pod "490b6ddf-67eb-4b14-bce3-c07798f6b03a" (UID: "490b6ddf-67eb-4b14-bce3-c07798f6b03a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.040023 4975 generic.go:334] "Generic (PLEG): container finished" podID="490b6ddf-67eb-4b14-bce3-c07798f6b03a" containerID="5c9df58bc49e7ac0c069a78fa7e45dea3374781954110622bf8d3cb7c914b966" exitCode=0 Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.040117 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-xmlfk" Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.041262 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-xmlfk" event={"ID":"490b6ddf-67eb-4b14-bce3-c07798f6b03a","Type":"ContainerDied","Data":"5c9df58bc49e7ac0c069a78fa7e45dea3374781954110622bf8d3cb7c914b966"} Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.041294 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-xmlfk" event={"ID":"490b6ddf-67eb-4b14-bce3-c07798f6b03a","Type":"ContainerDied","Data":"f062420ec67dbf9958a699aea1597b7116418333d969725d004cc7332e6322c8"} Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.041312 4975 scope.go:117] "RemoveContainer" containerID="5c9df58bc49e7ac0c069a78fa7e45dea3374781954110622bf8d3cb7c914b966" Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.045253 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"52cabd28-7357-4e96-b812-637660ce5cd1","Type":"ContainerStarted","Data":"b076c598d954e515a8982df1f567681a61864029346b4bc8e9f619e4d43ed67d"} Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.045843 4975 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/490b6ddf-67eb-4b14-bce3-c07798f6b03a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.046063 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/490b6ddf-67eb-4b14-bce3-c07798f6b03a-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.048043 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"56b76c27-2e96-46ad-a648-705b85b28bd6","Type":"ContainerStarted","Data":"e24b5ecce0e8d6db26cce3988b70b75af1330352222ea389cb1136f56c9fa9bd"} Mar 18 
12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.048943 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.050756 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1e74832f-4c39-440a-b1b7-d87c9916fc94","Type":"ContainerStarted","Data":"1c1030fbb596f29bc3231abc7eb7114a80dd1ef168931bd4854f91a4c9cf5a6b"} Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.051003 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.053266 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5","Type":"ContainerStarted","Data":"b71b7b7f61c6d28e5ba7e2a2bcad4d7cb5ff0eeec83344a5306cb9477b98e3dc"} Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.056124 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9f6a63f7-645b-44df-aea0-787e5596aecc","Type":"ContainerStarted","Data":"1bfeeea7675e314d76e2f6e22e46913e1b67efaf0f395f83e8449c97861bcf2a"} Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.058211 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qpqb7" event={"ID":"bc8e272b-99c9-460e-9b72-a031478baf07","Type":"ContainerStarted","Data":"486641807f86cafffe5924d9b217aea7c6a4dd44d3807a0ac664a35a6153ac30"} Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.066951 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5","Type":"ContainerStarted","Data":"593d8e21a04405a96f7601c6d9e04bc7ca7920d74e754470dec6f28b650c85f9"} Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.069597 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a68f98b5-0226-4b20-a767-ead5e0af066e","Type":"ContainerStarted","Data":"f117753601c23acff04c84a90faed4c89539cd6b2706e3826b8124d9fc1ce0a0"} Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.077135 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"bcac25a5-0552-488f-b468-ffea7a442115","Type":"ContainerStarted","Data":"469dd5f93f76ac44ef46173fd56a7c5296f4c3b62fa4f4b7eb3ae330faa769e0"} Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.081985 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zgkvs" event={"ID":"f3553d46-cacf-43e1-886a-44c17ed9a6c5","Type":"ContainerStarted","Data":"9218cad62dbf60c5c84ad35c59ca7691d2ce146a2cc094e162373186644d3667"} Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.089037 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-zgkvs" Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.100647 4975 scope.go:117] "RemoveContainer" containerID="0b32e2615590f4f17db637d217e7b6fcce20aacab2abe286a54b55bcacf10aff" Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.140841 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=15.259630536 podStartE2EDuration="25.140813835s" podCreationTimestamp="2026-03-18 12:32:03 +0000 UTC" firstStartedPulling="2026-03-18 12:32:15.639292468 +0000 UTC m=+1321.353693057" lastFinishedPulling="2026-03-18 12:32:25.520475777 +0000 UTC m=+1331.234876356" observedRunningTime="2026-03-18 12:32:28.116791007 +0000 UTC m=+1333.831191596" watchObservedRunningTime="2026-03-18 12:32:28.140813835 +0000 UTC m=+1333.855214414" Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.157772 4975 scope.go:117] "RemoveContainer" containerID="5c9df58bc49e7ac0c069a78fa7e45dea3374781954110622bf8d3cb7c914b966" Mar 18 12:32:28 crc kubenswrapper[4975]: E0318 
12:32:28.163520 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c9df58bc49e7ac0c069a78fa7e45dea3374781954110622bf8d3cb7c914b966\": container with ID starting with 5c9df58bc49e7ac0c069a78fa7e45dea3374781954110622bf8d3cb7c914b966 not found: ID does not exist" containerID="5c9df58bc49e7ac0c069a78fa7e45dea3374781954110622bf8d3cb7c914b966" Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.163631 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c9df58bc49e7ac0c069a78fa7e45dea3374781954110622bf8d3cb7c914b966"} err="failed to get container status \"5c9df58bc49e7ac0c069a78fa7e45dea3374781954110622bf8d3cb7c914b966\": rpc error: code = NotFound desc = could not find container \"5c9df58bc49e7ac0c069a78fa7e45dea3374781954110622bf8d3cb7c914b966\": container with ID starting with 5c9df58bc49e7ac0c069a78fa7e45dea3374781954110622bf8d3cb7c914b966 not found: ID does not exist" Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.163661 4975 scope.go:117] "RemoveContainer" containerID="0b32e2615590f4f17db637d217e7b6fcce20aacab2abe286a54b55bcacf10aff" Mar 18 12:32:28 crc kubenswrapper[4975]: E0318 12:32:28.164033 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b32e2615590f4f17db637d217e7b6fcce20aacab2abe286a54b55bcacf10aff\": container with ID starting with 0b32e2615590f4f17db637d217e7b6fcce20aacab2abe286a54b55bcacf10aff not found: ID does not exist" containerID="0b32e2615590f4f17db637d217e7b6fcce20aacab2abe286a54b55bcacf10aff" Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.164061 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b32e2615590f4f17db637d217e7b6fcce20aacab2abe286a54b55bcacf10aff"} err="failed to get container status \"0b32e2615590f4f17db637d217e7b6fcce20aacab2abe286a54b55bcacf10aff\": rpc 
error: code = NotFound desc = could not find container \"0b32e2615590f4f17db637d217e7b6fcce20aacab2abe286a54b55bcacf10aff\": container with ID starting with 0b32e2615590f4f17db637d217e7b6fcce20aacab2abe286a54b55bcacf10aff not found: ID does not exist" Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.168944 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=12.247259657 podStartE2EDuration="23.168919954s" podCreationTimestamp="2026-03-18 12:32:05 +0000 UTC" firstStartedPulling="2026-03-18 12:32:15.650289129 +0000 UTC m=+1321.364689708" lastFinishedPulling="2026-03-18 12:32:26.571949426 +0000 UTC m=+1332.286350005" observedRunningTime="2026-03-18 12:32:28.145517703 +0000 UTC m=+1333.859918292" watchObservedRunningTime="2026-03-18 12:32:28.168919954 +0000 UTC m=+1333.883320533" Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.202000 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-zgkvs" podStartSLOduration=10.57404091 podStartE2EDuration="20.201978318s" podCreationTimestamp="2026-03-18 12:32:08 +0000 UTC" firstStartedPulling="2026-03-18 12:32:15.89257821 +0000 UTC m=+1321.606978789" lastFinishedPulling="2026-03-18 12:32:25.520515618 +0000 UTC m=+1331.234916197" observedRunningTime="2026-03-18 12:32:28.194338359 +0000 UTC m=+1333.908738958" watchObservedRunningTime="2026-03-18 12:32:28.201978318 +0000 UTC m=+1333.916378897" Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.346980 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-xmlfk"] Mar 18 12:32:28 crc kubenswrapper[4975]: I0318 12:32:28.353933 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-xmlfk"] Mar 18 12:32:29 crc kubenswrapper[4975]: I0318 12:32:29.027025 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="490b6ddf-67eb-4b14-bce3-c07798f6b03a" 
path="/var/lib/kubelet/pods/490b6ddf-67eb-4b14-bce3-c07798f6b03a/volumes" Mar 18 12:32:29 crc kubenswrapper[4975]: I0318 12:32:29.098735 4975 generic.go:334] "Generic (PLEG): container finished" podID="bc8e272b-99c9-460e-9b72-a031478baf07" containerID="486641807f86cafffe5924d9b217aea7c6a4dd44d3807a0ac664a35a6153ac30" exitCode=0 Mar 18 12:32:29 crc kubenswrapper[4975]: I0318 12:32:29.100649 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qpqb7" event={"ID":"bc8e272b-99c9-460e-9b72-a031478baf07","Type":"ContainerDied","Data":"486641807f86cafffe5924d9b217aea7c6a4dd44d3807a0ac664a35a6153ac30"} Mar 18 12:32:29 crc kubenswrapper[4975]: I0318 12:32:29.926985 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-dtpb5"] Mar 18 12:32:29 crc kubenswrapper[4975]: E0318 12:32:29.927370 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a3a6b1-d7fd-4d05-a075-19aa9585b87d" containerName="oc" Mar 18 12:32:29 crc kubenswrapper[4975]: I0318 12:32:29.927391 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a3a6b1-d7fd-4d05-a075-19aa9585b87d" containerName="oc" Mar 18 12:32:29 crc kubenswrapper[4975]: E0318 12:32:29.927410 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="490b6ddf-67eb-4b14-bce3-c07798f6b03a" containerName="dnsmasq-dns" Mar 18 12:32:29 crc kubenswrapper[4975]: I0318 12:32:29.927419 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="490b6ddf-67eb-4b14-bce3-c07798f6b03a" containerName="dnsmasq-dns" Mar 18 12:32:29 crc kubenswrapper[4975]: E0318 12:32:29.927438 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="490b6ddf-67eb-4b14-bce3-c07798f6b03a" containerName="init" Mar 18 12:32:29 crc kubenswrapper[4975]: I0318 12:32:29.927446 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="490b6ddf-67eb-4b14-bce3-c07798f6b03a" containerName="init" Mar 18 12:32:29 crc kubenswrapper[4975]: I0318 12:32:29.927638 
4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="490b6ddf-67eb-4b14-bce3-c07798f6b03a" containerName="dnsmasq-dns" Mar 18 12:32:29 crc kubenswrapper[4975]: I0318 12:32:29.927662 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="23a3a6b1-d7fd-4d05-a075-19aa9585b87d" containerName="oc" Mar 18 12:32:29 crc kubenswrapper[4975]: I0318 12:32:29.928305 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-dtpb5" Mar 18 12:32:29 crc kubenswrapper[4975]: I0318 12:32:29.932379 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 18 12:32:29 crc kubenswrapper[4975]: I0318 12:32:29.958731 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-dtpb5"] Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.084920 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ce2e6c-7802-4318-89e7-f9f40bd5369f-combined-ca-bundle\") pod \"ovn-controller-metrics-dtpb5\" (UID: \"25ce2e6c-7802-4318-89e7-f9f40bd5369f\") " pod="openstack/ovn-controller-metrics-dtpb5" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.084972 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghqrg\" (UniqueName: \"kubernetes.io/projected/25ce2e6c-7802-4318-89e7-f9f40bd5369f-kube-api-access-ghqrg\") pod \"ovn-controller-metrics-dtpb5\" (UID: \"25ce2e6c-7802-4318-89e7-f9f40bd5369f\") " pod="openstack/ovn-controller-metrics-dtpb5" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.085027 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/25ce2e6c-7802-4318-89e7-f9f40bd5369f-metrics-certs-tls-certs\") pod 
\"ovn-controller-metrics-dtpb5\" (UID: \"25ce2e6c-7802-4318-89e7-f9f40bd5369f\") " pod="openstack/ovn-controller-metrics-dtpb5" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.085047 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/25ce2e6c-7802-4318-89e7-f9f40bd5369f-ovs-rundir\") pod \"ovn-controller-metrics-dtpb5\" (UID: \"25ce2e6c-7802-4318-89e7-f9f40bd5369f\") " pod="openstack/ovn-controller-metrics-dtpb5" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.085102 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ce2e6c-7802-4318-89e7-f9f40bd5369f-config\") pod \"ovn-controller-metrics-dtpb5\" (UID: \"25ce2e6c-7802-4318-89e7-f9f40bd5369f\") " pod="openstack/ovn-controller-metrics-dtpb5" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.085124 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/25ce2e6c-7802-4318-89e7-f9f40bd5369f-ovn-rundir\") pod \"ovn-controller-metrics-dtpb5\" (UID: \"25ce2e6c-7802-4318-89e7-f9f40bd5369f\") " pod="openstack/ovn-controller-metrics-dtpb5" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.113977 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-8wkkx"] Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.115245 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-8wkkx" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.117945 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.122741 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-8wkkx"] Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.186153 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ce2e6c-7802-4318-89e7-f9f40bd5369f-combined-ca-bundle\") pod \"ovn-controller-metrics-dtpb5\" (UID: \"25ce2e6c-7802-4318-89e7-f9f40bd5369f\") " pod="openstack/ovn-controller-metrics-dtpb5" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.187285 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghqrg\" (UniqueName: \"kubernetes.io/projected/25ce2e6c-7802-4318-89e7-f9f40bd5369f-kube-api-access-ghqrg\") pod \"ovn-controller-metrics-dtpb5\" (UID: \"25ce2e6c-7802-4318-89e7-f9f40bd5369f\") " pod="openstack/ovn-controller-metrics-dtpb5" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.187350 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/25ce2e6c-7802-4318-89e7-f9f40bd5369f-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-dtpb5\" (UID: \"25ce2e6c-7802-4318-89e7-f9f40bd5369f\") " pod="openstack/ovn-controller-metrics-dtpb5" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.187370 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/25ce2e6c-7802-4318-89e7-f9f40bd5369f-ovs-rundir\") pod \"ovn-controller-metrics-dtpb5\" (UID: \"25ce2e6c-7802-4318-89e7-f9f40bd5369f\") " pod="openstack/ovn-controller-metrics-dtpb5" Mar 18 12:32:30 crc 
kubenswrapper[4975]: I0318 12:32:30.187433 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ce2e6c-7802-4318-89e7-f9f40bd5369f-config\") pod \"ovn-controller-metrics-dtpb5\" (UID: \"25ce2e6c-7802-4318-89e7-f9f40bd5369f\") " pod="openstack/ovn-controller-metrics-dtpb5" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.187452 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/25ce2e6c-7802-4318-89e7-f9f40bd5369f-ovn-rundir\") pod \"ovn-controller-metrics-dtpb5\" (UID: \"25ce2e6c-7802-4318-89e7-f9f40bd5369f\") " pod="openstack/ovn-controller-metrics-dtpb5" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.187643 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/25ce2e6c-7802-4318-89e7-f9f40bd5369f-ovn-rundir\") pod \"ovn-controller-metrics-dtpb5\" (UID: \"25ce2e6c-7802-4318-89e7-f9f40bd5369f\") " pod="openstack/ovn-controller-metrics-dtpb5" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.188299 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ce2e6c-7802-4318-89e7-f9f40bd5369f-config\") pod \"ovn-controller-metrics-dtpb5\" (UID: \"25ce2e6c-7802-4318-89e7-f9f40bd5369f\") " pod="openstack/ovn-controller-metrics-dtpb5" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.188436 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/25ce2e6c-7802-4318-89e7-f9f40bd5369f-ovs-rundir\") pod \"ovn-controller-metrics-dtpb5\" (UID: \"25ce2e6c-7802-4318-89e7-f9f40bd5369f\") " pod="openstack/ovn-controller-metrics-dtpb5" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.195454 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/25ce2e6c-7802-4318-89e7-f9f40bd5369f-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-dtpb5\" (UID: \"25ce2e6c-7802-4318-89e7-f9f40bd5369f\") " pod="openstack/ovn-controller-metrics-dtpb5" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.195540 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ce2e6c-7802-4318-89e7-f9f40bd5369f-combined-ca-bundle\") pod \"ovn-controller-metrics-dtpb5\" (UID: \"25ce2e6c-7802-4318-89e7-f9f40bd5369f\") " pod="openstack/ovn-controller-metrics-dtpb5" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.204447 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghqrg\" (UniqueName: \"kubernetes.io/projected/25ce2e6c-7802-4318-89e7-f9f40bd5369f-kube-api-access-ghqrg\") pod \"ovn-controller-metrics-dtpb5\" (UID: \"25ce2e6c-7802-4318-89e7-f9f40bd5369f\") " pod="openstack/ovn-controller-metrics-dtpb5" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.248493 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-dtpb5" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.289517 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3095d0a9-61b7-4c7b-9873-d89fb199265e-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-8wkkx\" (UID: \"3095d0a9-61b7-4c7b-9873-d89fb199265e\") " pod="openstack/dnsmasq-dns-7fd796d7df-8wkkx" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.289608 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dt9f\" (UniqueName: \"kubernetes.io/projected/3095d0a9-61b7-4c7b-9873-d89fb199265e-kube-api-access-7dt9f\") pod \"dnsmasq-dns-7fd796d7df-8wkkx\" (UID: \"3095d0a9-61b7-4c7b-9873-d89fb199265e\") " pod="openstack/dnsmasq-dns-7fd796d7df-8wkkx" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.289683 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3095d0a9-61b7-4c7b-9873-d89fb199265e-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-8wkkx\" (UID: \"3095d0a9-61b7-4c7b-9873-d89fb199265e\") " pod="openstack/dnsmasq-dns-7fd796d7df-8wkkx" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.290053 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3095d0a9-61b7-4c7b-9873-d89fb199265e-config\") pod \"dnsmasq-dns-7fd796d7df-8wkkx\" (UID: \"3095d0a9-61b7-4c7b-9873-d89fb199265e\") " pod="openstack/dnsmasq-dns-7fd796d7df-8wkkx" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.303188 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-8wkkx"] Mar 18 12:32:30 crc kubenswrapper[4975]: E0318 12:32:30.303768 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config 
dns-svc kube-api-access-7dt9f ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7fd796d7df-8wkkx" podUID="3095d0a9-61b7-4c7b-9873-d89fb199265e" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.388647 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-jlxjq"] Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.390520 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.392921 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.392991 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3095d0a9-61b7-4c7b-9873-d89fb199265e-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-8wkkx\" (UID: \"3095d0a9-61b7-4c7b-9873-d89fb199265e\") " pod="openstack/dnsmasq-dns-7fd796d7df-8wkkx" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.393041 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dt9f\" (UniqueName: \"kubernetes.io/projected/3095d0a9-61b7-4c7b-9873-d89fb199265e-kube-api-access-7dt9f\") pod \"dnsmasq-dns-7fd796d7df-8wkkx\" (UID: \"3095d0a9-61b7-4c7b-9873-d89fb199265e\") " pod="openstack/dnsmasq-dns-7fd796d7df-8wkkx" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.393075 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3095d0a9-61b7-4c7b-9873-d89fb199265e-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-8wkkx\" (UID: \"3095d0a9-61b7-4c7b-9873-d89fb199265e\") " pod="openstack/dnsmasq-dns-7fd796d7df-8wkkx" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.393131 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3095d0a9-61b7-4c7b-9873-d89fb199265e-config\") pod \"dnsmasq-dns-7fd796d7df-8wkkx\" (UID: \"3095d0a9-61b7-4c7b-9873-d89fb199265e\") " pod="openstack/dnsmasq-dns-7fd796d7df-8wkkx" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.394199 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3095d0a9-61b7-4c7b-9873-d89fb199265e-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-8wkkx\" (UID: \"3095d0a9-61b7-4c7b-9873-d89fb199265e\") " pod="openstack/dnsmasq-dns-7fd796d7df-8wkkx" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.395027 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3095d0a9-61b7-4c7b-9873-d89fb199265e-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-8wkkx\" (UID: \"3095d0a9-61b7-4c7b-9873-d89fb199265e\") " pod="openstack/dnsmasq-dns-7fd796d7df-8wkkx" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.395095 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3095d0a9-61b7-4c7b-9873-d89fb199265e-config\") pod \"dnsmasq-dns-7fd796d7df-8wkkx\" (UID: \"3095d0a9-61b7-4c7b-9873-d89fb199265e\") " pod="openstack/dnsmasq-dns-7fd796d7df-8wkkx" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.400754 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-jlxjq"] Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.413405 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dt9f\" (UniqueName: \"kubernetes.io/projected/3095d0a9-61b7-4c7b-9873-d89fb199265e-kube-api-access-7dt9f\") pod \"dnsmasq-dns-7fd796d7df-8wkkx\" (UID: \"3095d0a9-61b7-4c7b-9873-d89fb199265e\") " pod="openstack/dnsmasq-dns-7fd796d7df-8wkkx" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 
12:32:30.494090 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba758ff3-5254-4e8b-97c8-e73570be9078-config\") pod \"dnsmasq-dns-86db49b7ff-jlxjq\" (UID: \"ba758ff3-5254-4e8b-97c8-e73570be9078\") " pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.494547 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4bbc\" (UniqueName: \"kubernetes.io/projected/ba758ff3-5254-4e8b-97c8-e73570be9078-kube-api-access-t4bbc\") pod \"dnsmasq-dns-86db49b7ff-jlxjq\" (UID: \"ba758ff3-5254-4e8b-97c8-e73570be9078\") " pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.494576 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba758ff3-5254-4e8b-97c8-e73570be9078-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-jlxjq\" (UID: \"ba758ff3-5254-4e8b-97c8-e73570be9078\") " pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.494628 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba758ff3-5254-4e8b-97c8-e73570be9078-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-jlxjq\" (UID: \"ba758ff3-5254-4e8b-97c8-e73570be9078\") " pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.494670 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba758ff3-5254-4e8b-97c8-e73570be9078-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-jlxjq\" (UID: \"ba758ff3-5254-4e8b-97c8-e73570be9078\") " pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" Mar 18 12:32:30 crc 
kubenswrapper[4975]: I0318 12:32:30.596087 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4bbc\" (UniqueName: \"kubernetes.io/projected/ba758ff3-5254-4e8b-97c8-e73570be9078-kube-api-access-t4bbc\") pod \"dnsmasq-dns-86db49b7ff-jlxjq\" (UID: \"ba758ff3-5254-4e8b-97c8-e73570be9078\") " pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.596142 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba758ff3-5254-4e8b-97c8-e73570be9078-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-jlxjq\" (UID: \"ba758ff3-5254-4e8b-97c8-e73570be9078\") " pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.596166 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba758ff3-5254-4e8b-97c8-e73570be9078-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-jlxjq\" (UID: \"ba758ff3-5254-4e8b-97c8-e73570be9078\") " pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.596191 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba758ff3-5254-4e8b-97c8-e73570be9078-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-jlxjq\" (UID: \"ba758ff3-5254-4e8b-97c8-e73570be9078\") " pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.596306 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba758ff3-5254-4e8b-97c8-e73570be9078-config\") pod \"dnsmasq-dns-86db49b7ff-jlxjq\" (UID: \"ba758ff3-5254-4e8b-97c8-e73570be9078\") " pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.597271 4975 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba758ff3-5254-4e8b-97c8-e73570be9078-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-jlxjq\" (UID: \"ba758ff3-5254-4e8b-97c8-e73570be9078\") " pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.597298 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba758ff3-5254-4e8b-97c8-e73570be9078-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-jlxjq\" (UID: \"ba758ff3-5254-4e8b-97c8-e73570be9078\") " pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.597312 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba758ff3-5254-4e8b-97c8-e73570be9078-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-jlxjq\" (UID: \"ba758ff3-5254-4e8b-97c8-e73570be9078\") " pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.597392 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba758ff3-5254-4e8b-97c8-e73570be9078-config\") pod \"dnsmasq-dns-86db49b7ff-jlxjq\" (UID: \"ba758ff3-5254-4e8b-97c8-e73570be9078\") " pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.617319 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4bbc\" (UniqueName: \"kubernetes.io/projected/ba758ff3-5254-4e8b-97c8-e73570be9078-kube-api-access-t4bbc\") pod \"dnsmasq-dns-86db49b7ff-jlxjq\" (UID: \"ba758ff3-5254-4e8b-97c8-e73570be9078\") " pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" Mar 18 12:32:30 crc kubenswrapper[4975]: I0318 12:32:30.745562 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" Mar 18 12:32:31 crc kubenswrapper[4975]: I0318 12:32:31.127587 4975 generic.go:334] "Generic (PLEG): container finished" podID="bcac25a5-0552-488f-b468-ffea7a442115" containerID="469dd5f93f76ac44ef46173fd56a7c5296f4c3b62fa4f4b7eb3ae330faa769e0" exitCode=0 Mar 18 12:32:31 crc kubenswrapper[4975]: I0318 12:32:31.127855 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"bcac25a5-0552-488f-b468-ffea7a442115","Type":"ContainerDied","Data":"469dd5f93f76ac44ef46173fd56a7c5296f4c3b62fa4f4b7eb3ae330faa769e0"} Mar 18 12:32:31 crc kubenswrapper[4975]: I0318 12:32:31.135028 4975 generic.go:334] "Generic (PLEG): container finished" podID="52cabd28-7357-4e96-b812-637660ce5cd1" containerID="b076c598d954e515a8982df1f567681a61864029346b4bc8e9f619e4d43ed67d" exitCode=0 Mar 18 12:32:31 crc kubenswrapper[4975]: I0318 12:32:31.135122 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-8wkkx" Mar 18 12:32:31 crc kubenswrapper[4975]: I0318 12:32:31.135614 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"52cabd28-7357-4e96-b812-637660ce5cd1","Type":"ContainerDied","Data":"b076c598d954e515a8982df1f567681a61864029346b4bc8e9f619e4d43ed67d"} Mar 18 12:32:31 crc kubenswrapper[4975]: I0318 12:32:31.167100 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-8wkkx" Mar 18 12:32:31 crc kubenswrapper[4975]: I0318 12:32:31.307349 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3095d0a9-61b7-4c7b-9873-d89fb199265e-dns-svc\") pod \"3095d0a9-61b7-4c7b-9873-d89fb199265e\" (UID: \"3095d0a9-61b7-4c7b-9873-d89fb199265e\") " Mar 18 12:32:31 crc kubenswrapper[4975]: I0318 12:32:31.307659 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3095d0a9-61b7-4c7b-9873-d89fb199265e-ovsdbserver-nb\") pod \"3095d0a9-61b7-4c7b-9873-d89fb199265e\" (UID: \"3095d0a9-61b7-4c7b-9873-d89fb199265e\") " Mar 18 12:32:31 crc kubenswrapper[4975]: I0318 12:32:31.307737 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3095d0a9-61b7-4c7b-9873-d89fb199265e-config\") pod \"3095d0a9-61b7-4c7b-9873-d89fb199265e\" (UID: \"3095d0a9-61b7-4c7b-9873-d89fb199265e\") " Mar 18 12:32:31 crc kubenswrapper[4975]: I0318 12:32:31.307756 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dt9f\" (UniqueName: \"kubernetes.io/projected/3095d0a9-61b7-4c7b-9873-d89fb199265e-kube-api-access-7dt9f\") pod \"3095d0a9-61b7-4c7b-9873-d89fb199265e\" (UID: \"3095d0a9-61b7-4c7b-9873-d89fb199265e\") " Mar 18 12:32:31 crc kubenswrapper[4975]: I0318 12:32:31.308573 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3095d0a9-61b7-4c7b-9873-d89fb199265e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3095d0a9-61b7-4c7b-9873-d89fb199265e" (UID: "3095d0a9-61b7-4c7b-9873-d89fb199265e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:31 crc kubenswrapper[4975]: I0318 12:32:31.309248 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3095d0a9-61b7-4c7b-9873-d89fb199265e-config" (OuterVolumeSpecName: "config") pod "3095d0a9-61b7-4c7b-9873-d89fb199265e" (UID: "3095d0a9-61b7-4c7b-9873-d89fb199265e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:31 crc kubenswrapper[4975]: I0318 12:32:31.309734 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3095d0a9-61b7-4c7b-9873-d89fb199265e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3095d0a9-61b7-4c7b-9873-d89fb199265e" (UID: "3095d0a9-61b7-4c7b-9873-d89fb199265e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:31 crc kubenswrapper[4975]: I0318 12:32:31.312811 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3095d0a9-61b7-4c7b-9873-d89fb199265e-kube-api-access-7dt9f" (OuterVolumeSpecName: "kube-api-access-7dt9f") pod "3095d0a9-61b7-4c7b-9873-d89fb199265e" (UID: "3095d0a9-61b7-4c7b-9873-d89fb199265e"). InnerVolumeSpecName "kube-api-access-7dt9f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:31 crc kubenswrapper[4975]: I0318 12:32:31.410932 4975 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3095d0a9-61b7-4c7b-9873-d89fb199265e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:31 crc kubenswrapper[4975]: I0318 12:32:31.410980 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dt9f\" (UniqueName: \"kubernetes.io/projected/3095d0a9-61b7-4c7b-9873-d89fb199265e-kube-api-access-7dt9f\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:31 crc kubenswrapper[4975]: I0318 12:32:31.410997 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3095d0a9-61b7-4c7b-9873-d89fb199265e-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:31 crc kubenswrapper[4975]: I0318 12:32:31.411007 4975 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3095d0a9-61b7-4c7b-9873-d89fb199265e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:31 crc kubenswrapper[4975]: I0318 12:32:31.529357 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-jlxjq"] Mar 18 12:32:31 crc kubenswrapper[4975]: I0318 12:32:31.535734 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-dtpb5"] Mar 18 12:32:31 crc kubenswrapper[4975]: W0318 12:32:31.546458 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba758ff3_5254_4e8b_97c8_e73570be9078.slice/crio-37311b0d26be11ae275b89527b1c0a25f7a6c34fdd7d5c1c50436289c668708f WatchSource:0}: Error finding container 37311b0d26be11ae275b89527b1c0a25f7a6c34fdd7d5c1c50436289c668708f: Status 404 returned error can't find the container with id 37311b0d26be11ae275b89527b1c0a25f7a6c34fdd7d5c1c50436289c668708f Mar 18 12:32:31 
crc kubenswrapper[4975]: W0318 12:32:31.556725 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25ce2e6c_7802_4318_89e7_f9f40bd5369f.slice/crio-59de9c499c4846426bc0354753423a03651dde45196c77109595b91dd5ef1b67 WatchSource:0}: Error finding container 59de9c499c4846426bc0354753423a03651dde45196c77109595b91dd5ef1b67: Status 404 returned error can't find the container with id 59de9c499c4846426bc0354753423a03651dde45196c77109595b91dd5ef1b67 Mar 18 12:32:32 crc kubenswrapper[4975]: I0318 12:32:32.144466 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"bcac25a5-0552-488f-b468-ffea7a442115","Type":"ContainerStarted","Data":"5426890aca59a7a7868985e2093a56087268e2e8dc8afa55fa8b5cf0e5749b87"} Mar 18 12:32:32 crc kubenswrapper[4975]: I0318 12:32:32.146105 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"dd63df8b-b3ee-49dc-b36b-157bb71ac6d5","Type":"ContainerStarted","Data":"76e5a19bc6589a9459a86bb1d08ae173f62b8d76e45265796afa49f2c433cbe8"} Mar 18 12:32:32 crc kubenswrapper[4975]: I0318 12:32:32.147830 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9f6a63f7-645b-44df-aea0-787e5596aecc","Type":"ContainerStarted","Data":"8c0f255a926104cc953bfd2f2424381b44e4c1327ed2982cafd4959e3ad60be4"} Mar 18 12:32:32 crc kubenswrapper[4975]: I0318 12:32:32.150160 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qpqb7" event={"ID":"bc8e272b-99c9-460e-9b72-a031478baf07","Type":"ContainerStarted","Data":"c9c350dfb325b66afae41c189a690247332a3d6f7907dd80b1dcea6c71290f8f"} Mar 18 12:32:32 crc kubenswrapper[4975]: I0318 12:32:32.150200 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qpqb7" 
event={"ID":"bc8e272b-99c9-460e-9b72-a031478baf07","Type":"ContainerStarted","Data":"f3b7ddbd230fe884c35ded1d264765aaf1d0304e06c55baef2758c391c95a785"} Mar 18 12:32:32 crc kubenswrapper[4975]: I0318 12:32:32.150304 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-qpqb7" Mar 18 12:32:32 crc kubenswrapper[4975]: I0318 12:32:32.150329 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-qpqb7" Mar 18 12:32:32 crc kubenswrapper[4975]: I0318 12:32:32.151573 4975 generic.go:334] "Generic (PLEG): container finished" podID="ba758ff3-5254-4e8b-97c8-e73570be9078" containerID="fc83145ec1c95ab153c89ec55b35c4b52316c65fb199d979e488af706faf6a7a" exitCode=0 Mar 18 12:32:32 crc kubenswrapper[4975]: I0318 12:32:32.151667 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" event={"ID":"ba758ff3-5254-4e8b-97c8-e73570be9078","Type":"ContainerDied","Data":"fc83145ec1c95ab153c89ec55b35c4b52316c65fb199d979e488af706faf6a7a"} Mar 18 12:32:32 crc kubenswrapper[4975]: I0318 12:32:32.151915 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" event={"ID":"ba758ff3-5254-4e8b-97c8-e73570be9078","Type":"ContainerStarted","Data":"37311b0d26be11ae275b89527b1c0a25f7a6c34fdd7d5c1c50436289c668708f"} Mar 18 12:32:32 crc kubenswrapper[4975]: I0318 12:32:32.153413 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-dtpb5" event={"ID":"25ce2e6c-7802-4318-89e7-f9f40bd5369f","Type":"ContainerStarted","Data":"36cb559ae54dd73eb665fd1d357e624e9f1d98b3189b933d2506713bc7b5ef70"} Mar 18 12:32:32 crc kubenswrapper[4975]: I0318 12:32:32.153471 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-dtpb5" 
event={"ID":"25ce2e6c-7802-4318-89e7-f9f40bd5369f","Type":"ContainerStarted","Data":"59de9c499c4846426bc0354753423a03651dde45196c77109595b91dd5ef1b67"} Mar 18 12:32:32 crc kubenswrapper[4975]: I0318 12:32:32.166469 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"52cabd28-7357-4e96-b812-637660ce5cd1","Type":"ContainerStarted","Data":"1bdadfd3dd69524a7d705fac2d6585726f1af801cebc763916c48c5a6aadb5ab"} Mar 18 12:32:32 crc kubenswrapper[4975]: I0318 12:32:32.166619 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-8wkkx" Mar 18 12:32:32 crc kubenswrapper[4975]: I0318 12:32:32.183601 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=22.336357091 podStartE2EDuration="32.183577401s" podCreationTimestamp="2026-03-18 12:32:00 +0000 UTC" firstStartedPulling="2026-03-18 12:32:15.669388011 +0000 UTC m=+1321.383788590" lastFinishedPulling="2026-03-18 12:32:25.516608311 +0000 UTC m=+1331.231008900" observedRunningTime="2026-03-18 12:32:32.176686452 +0000 UTC m=+1337.891087031" watchObservedRunningTime="2026-03-18 12:32:32.183577401 +0000 UTC m=+1337.897977980" Mar 18 12:32:32 crc kubenswrapper[4975]: I0318 12:32:32.208363 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-dtpb5" podStartSLOduration=3.208340549 podStartE2EDuration="3.208340549s" podCreationTimestamp="2026-03-18 12:32:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:32.19962334 +0000 UTC m=+1337.914023919" watchObservedRunningTime="2026-03-18 12:32:32.208340549 +0000 UTC m=+1337.922741128" Mar 18 12:32:32 crc kubenswrapper[4975]: I0318 12:32:32.239539 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-controller-ovs-qpqb7" podStartSLOduration=13.877508632 podStartE2EDuration="24.239522202s" podCreationTimestamp="2026-03-18 12:32:08 +0000 UTC" firstStartedPulling="2026-03-18 12:32:16.037237089 +0000 UTC m=+1321.751637668" lastFinishedPulling="2026-03-18 12:32:26.399250659 +0000 UTC m=+1332.113651238" observedRunningTime="2026-03-18 12:32:32.23505888 +0000 UTC m=+1337.949459479" watchObservedRunningTime="2026-03-18 12:32:32.239522202 +0000 UTC m=+1337.953922781" Mar 18 12:32:32 crc kubenswrapper[4975]: I0318 12:32:32.269014 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=9.333333602 podStartE2EDuration="24.268988299s" podCreationTimestamp="2026-03-18 12:32:08 +0000 UTC" firstStartedPulling="2026-03-18 12:32:16.182893026 +0000 UTC m=+1321.897293605" lastFinishedPulling="2026-03-18 12:32:31.118547723 +0000 UTC m=+1336.832948302" observedRunningTime="2026-03-18 12:32:32.255816158 +0000 UTC m=+1337.970216737" watchObservedRunningTime="2026-03-18 12:32:32.268988299 +0000 UTC m=+1337.983388878" Mar 18 12:32:32 crc kubenswrapper[4975]: I0318 12:32:32.327568 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=7.05706683 podStartE2EDuration="21.327547521s" podCreationTimestamp="2026-03-18 12:32:11 +0000 UTC" firstStartedPulling="2026-03-18 12:32:16.83558577 +0000 UTC m=+1322.549986349" lastFinishedPulling="2026-03-18 12:32:31.106066451 +0000 UTC m=+1336.820467040" observedRunningTime="2026-03-18 12:32:32.316077348 +0000 UTC m=+1338.030477957" watchObservedRunningTime="2026-03-18 12:32:32.327547521 +0000 UTC m=+1338.041948110" Mar 18 12:32:32 crc kubenswrapper[4975]: I0318 12:32:32.334066 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=21.222997869 podStartE2EDuration="31.334039819s" podCreationTimestamp="2026-03-18 
12:32:01 +0000 UTC" firstStartedPulling="2026-03-18 12:32:15.618574521 +0000 UTC m=+1321.332975100" lastFinishedPulling="2026-03-18 12:32:25.729616471 +0000 UTC m=+1331.444017050" observedRunningTime="2026-03-18 12:32:32.290365704 +0000 UTC m=+1338.004766283" watchObservedRunningTime="2026-03-18 12:32:32.334039819 +0000 UTC m=+1338.048440408" Mar 18 12:32:32 crc kubenswrapper[4975]: I0318 12:32:32.384982 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-8wkkx"] Mar 18 12:32:32 crc kubenswrapper[4975]: I0318 12:32:32.386350 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-8wkkx"] Mar 18 12:32:32 crc kubenswrapper[4975]: I0318 12:32:32.657853 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:33 crc kubenswrapper[4975]: I0318 12:32:33.035347 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3095d0a9-61b7-4c7b-9873-d89fb199265e" path="/var/lib/kubelet/pods/3095d0a9-61b7-4c7b-9873-d89fb199265e/volumes" Mar 18 12:32:33 crc kubenswrapper[4975]: I0318 12:32:33.177251 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:33 crc kubenswrapper[4975]: I0318 12:32:33.177548 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:33 crc kubenswrapper[4975]: I0318 12:32:33.232584 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" event={"ID":"ba758ff3-5254-4e8b-97c8-e73570be9078","Type":"ContainerStarted","Data":"cdf1b7568a709edd3a0e9a43affb9ef2cc5091e16a58264e3cca8d135ff5b746"} Mar 18 12:32:33 crc kubenswrapper[4975]: I0318 12:32:33.250093 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" podStartSLOduration=3.25006917 
podStartE2EDuration="3.25006917s" podCreationTimestamp="2026-03-18 12:32:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:33.248154508 +0000 UTC m=+1338.962555087" watchObservedRunningTime="2026-03-18 12:32:33.25006917 +0000 UTC m=+1338.964469759" Mar 18 12:32:33 crc kubenswrapper[4975]: I0318 12:32:33.417429 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 18 12:32:33 crc kubenswrapper[4975]: I0318 12:32:33.658370 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:33 crc kubenswrapper[4975]: I0318 12:32:33.710020 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:33 crc kubenswrapper[4975]: I0318 12:32:33.995085 4975 scope.go:117] "RemoveContainer" containerID="ec430e8f9e4bb7f1a402c32a2ed89e973adb7d2b9982d7b8a9483181a25360ca" Mar 18 12:32:34 crc kubenswrapper[4975]: I0318 12:32:34.238971 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" Mar 18 12:32:34 crc kubenswrapper[4975]: I0318 12:32:34.281420 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 18 12:32:34 crc kubenswrapper[4975]: I0318 12:32:34.350256 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:34 crc kubenswrapper[4975]: I0318 12:32:34.423221 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.246496 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.288612 4975 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.480775 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.482305 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.484525 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.485534 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.485737 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.486791 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-xb97w" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.500150 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.562094 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30f134c7-8393-40ca-8c2b-1070ea5ec68c-scripts\") pod \"ovn-northd-0\" (UID: \"30f134c7-8393-40ca-8c2b-1070ea5ec68c\") " pod="openstack/ovn-northd-0" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.562154 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/30f134c7-8393-40ca-8c2b-1070ea5ec68c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"30f134c7-8393-40ca-8c2b-1070ea5ec68c\") " pod="openstack/ovn-northd-0" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.562223 4975 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30f134c7-8393-40ca-8c2b-1070ea5ec68c-config\") pod \"ovn-northd-0\" (UID: \"30f134c7-8393-40ca-8c2b-1070ea5ec68c\") " pod="openstack/ovn-northd-0" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.562346 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqwfx\" (UniqueName: \"kubernetes.io/projected/30f134c7-8393-40ca-8c2b-1070ea5ec68c-kube-api-access-fqwfx\") pod \"ovn-northd-0\" (UID: \"30f134c7-8393-40ca-8c2b-1070ea5ec68c\") " pod="openstack/ovn-northd-0" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.562425 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f134c7-8393-40ca-8c2b-1070ea5ec68c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"30f134c7-8393-40ca-8c2b-1070ea5ec68c\") " pod="openstack/ovn-northd-0" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.562472 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f134c7-8393-40ca-8c2b-1070ea5ec68c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"30f134c7-8393-40ca-8c2b-1070ea5ec68c\") " pod="openstack/ovn-northd-0" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.562518 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f134c7-8393-40ca-8c2b-1070ea5ec68c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"30f134c7-8393-40ca-8c2b-1070ea5ec68c\") " pod="openstack/ovn-northd-0" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.664370 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/30f134c7-8393-40ca-8c2b-1070ea5ec68c-scripts\") pod \"ovn-northd-0\" (UID: \"30f134c7-8393-40ca-8c2b-1070ea5ec68c\") " pod="openstack/ovn-northd-0" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.664435 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/30f134c7-8393-40ca-8c2b-1070ea5ec68c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"30f134c7-8393-40ca-8c2b-1070ea5ec68c\") " pod="openstack/ovn-northd-0" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.664474 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30f134c7-8393-40ca-8c2b-1070ea5ec68c-config\") pod \"ovn-northd-0\" (UID: \"30f134c7-8393-40ca-8c2b-1070ea5ec68c\") " pod="openstack/ovn-northd-0" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.664511 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqwfx\" (UniqueName: \"kubernetes.io/projected/30f134c7-8393-40ca-8c2b-1070ea5ec68c-kube-api-access-fqwfx\") pod \"ovn-northd-0\" (UID: \"30f134c7-8393-40ca-8c2b-1070ea5ec68c\") " pod="openstack/ovn-northd-0" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.664535 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f134c7-8393-40ca-8c2b-1070ea5ec68c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"30f134c7-8393-40ca-8c2b-1070ea5ec68c\") " pod="openstack/ovn-northd-0" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.664558 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f134c7-8393-40ca-8c2b-1070ea5ec68c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"30f134c7-8393-40ca-8c2b-1070ea5ec68c\") " pod="openstack/ovn-northd-0" Mar 18 12:32:35 
crc kubenswrapper[4975]: I0318 12:32:35.664593 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f134c7-8393-40ca-8c2b-1070ea5ec68c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"30f134c7-8393-40ca-8c2b-1070ea5ec68c\") " pod="openstack/ovn-northd-0" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.665025 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/30f134c7-8393-40ca-8c2b-1070ea5ec68c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"30f134c7-8393-40ca-8c2b-1070ea5ec68c\") " pod="openstack/ovn-northd-0" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.665305 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30f134c7-8393-40ca-8c2b-1070ea5ec68c-scripts\") pod \"ovn-northd-0\" (UID: \"30f134c7-8393-40ca-8c2b-1070ea5ec68c\") " pod="openstack/ovn-northd-0" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.665456 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30f134c7-8393-40ca-8c2b-1070ea5ec68c-config\") pod \"ovn-northd-0\" (UID: \"30f134c7-8393-40ca-8c2b-1070ea5ec68c\") " pod="openstack/ovn-northd-0" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.670975 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f134c7-8393-40ca-8c2b-1070ea5ec68c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"30f134c7-8393-40ca-8c2b-1070ea5ec68c\") " pod="openstack/ovn-northd-0" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.671263 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f134c7-8393-40ca-8c2b-1070ea5ec68c-combined-ca-bundle\") pod \"ovn-northd-0\" 
(UID: \"30f134c7-8393-40ca-8c2b-1070ea5ec68c\") " pod="openstack/ovn-northd-0" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.674628 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f134c7-8393-40ca-8c2b-1070ea5ec68c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"30f134c7-8393-40ca-8c2b-1070ea5ec68c\") " pod="openstack/ovn-northd-0" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.685593 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqwfx\" (UniqueName: \"kubernetes.io/projected/30f134c7-8393-40ca-8c2b-1070ea5ec68c-kube-api-access-fqwfx\") pod \"ovn-northd-0\" (UID: \"30f134c7-8393-40ca-8c2b-1070ea5ec68c\") " pod="openstack/ovn-northd-0" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.823297 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.825563 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.951836 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-jlxjq"] Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.989807 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-c56q4"] Mar 18 12:32:35 crc kubenswrapper[4975]: I0318 12:32:35.991704 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-c56q4" Mar 18 12:32:36 crc kubenswrapper[4975]: I0318 12:32:36.010272 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-c56q4"] Mar 18 12:32:36 crc kubenswrapper[4975]: I0318 12:32:36.071248 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/337ec784-956c-49f3-a71b-93bdd815f447-config\") pod \"dnsmasq-dns-698758b865-c56q4\" (UID: \"337ec784-956c-49f3-a71b-93bdd815f447\") " pod="openstack/dnsmasq-dns-698758b865-c56q4" Mar 18 12:32:36 crc kubenswrapper[4975]: I0318 12:32:36.071336 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/337ec784-956c-49f3-a71b-93bdd815f447-dns-svc\") pod \"dnsmasq-dns-698758b865-c56q4\" (UID: \"337ec784-956c-49f3-a71b-93bdd815f447\") " pod="openstack/dnsmasq-dns-698758b865-c56q4" Mar 18 12:32:36 crc kubenswrapper[4975]: I0318 12:32:36.071355 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvspw\" (UniqueName: \"kubernetes.io/projected/337ec784-956c-49f3-a71b-93bdd815f447-kube-api-access-dvspw\") pod \"dnsmasq-dns-698758b865-c56q4\" (UID: \"337ec784-956c-49f3-a71b-93bdd815f447\") " pod="openstack/dnsmasq-dns-698758b865-c56q4" Mar 18 12:32:36 crc kubenswrapper[4975]: I0318 12:32:36.071418 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/337ec784-956c-49f3-a71b-93bdd815f447-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-c56q4\" (UID: \"337ec784-956c-49f3-a71b-93bdd815f447\") " pod="openstack/dnsmasq-dns-698758b865-c56q4" Mar 18 12:32:36 crc kubenswrapper[4975]: I0318 12:32:36.071623 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/337ec784-956c-49f3-a71b-93bdd815f447-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-c56q4\" (UID: \"337ec784-956c-49f3-a71b-93bdd815f447\") " pod="openstack/dnsmasq-dns-698758b865-c56q4" Mar 18 12:32:36 crc kubenswrapper[4975]: I0318 12:32:36.173600 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/337ec784-956c-49f3-a71b-93bdd815f447-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-c56q4\" (UID: \"337ec784-956c-49f3-a71b-93bdd815f447\") " pod="openstack/dnsmasq-dns-698758b865-c56q4" Mar 18 12:32:36 crc kubenswrapper[4975]: I0318 12:32:36.173666 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/337ec784-956c-49f3-a71b-93bdd815f447-config\") pod \"dnsmasq-dns-698758b865-c56q4\" (UID: \"337ec784-956c-49f3-a71b-93bdd815f447\") " pod="openstack/dnsmasq-dns-698758b865-c56q4" Mar 18 12:32:36 crc kubenswrapper[4975]: I0318 12:32:36.173733 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/337ec784-956c-49f3-a71b-93bdd815f447-dns-svc\") pod \"dnsmasq-dns-698758b865-c56q4\" (UID: \"337ec784-956c-49f3-a71b-93bdd815f447\") " pod="openstack/dnsmasq-dns-698758b865-c56q4" Mar 18 12:32:36 crc kubenswrapper[4975]: I0318 12:32:36.173749 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvspw\" (UniqueName: \"kubernetes.io/projected/337ec784-956c-49f3-a71b-93bdd815f447-kube-api-access-dvspw\") pod \"dnsmasq-dns-698758b865-c56q4\" (UID: \"337ec784-956c-49f3-a71b-93bdd815f447\") " pod="openstack/dnsmasq-dns-698758b865-c56q4" Mar 18 12:32:36 crc kubenswrapper[4975]: I0318 12:32:36.173767 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/337ec784-956c-49f3-a71b-93bdd815f447-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-c56q4\" (UID: \"337ec784-956c-49f3-a71b-93bdd815f447\") " pod="openstack/dnsmasq-dns-698758b865-c56q4" Mar 18 12:32:36 crc kubenswrapper[4975]: I0318 12:32:36.174820 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/337ec784-956c-49f3-a71b-93bdd815f447-config\") pod \"dnsmasq-dns-698758b865-c56q4\" (UID: \"337ec784-956c-49f3-a71b-93bdd815f447\") " pod="openstack/dnsmasq-dns-698758b865-c56q4" Mar 18 12:32:36 crc kubenswrapper[4975]: I0318 12:32:36.175113 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/337ec784-956c-49f3-a71b-93bdd815f447-dns-svc\") pod \"dnsmasq-dns-698758b865-c56q4\" (UID: \"337ec784-956c-49f3-a71b-93bdd815f447\") " pod="openstack/dnsmasq-dns-698758b865-c56q4" Mar 18 12:32:36 crc kubenswrapper[4975]: I0318 12:32:36.178116 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/337ec784-956c-49f3-a71b-93bdd815f447-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-c56q4\" (UID: \"337ec784-956c-49f3-a71b-93bdd815f447\") " pod="openstack/dnsmasq-dns-698758b865-c56q4" Mar 18 12:32:36 crc kubenswrapper[4975]: I0318 12:32:36.178479 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/337ec784-956c-49f3-a71b-93bdd815f447-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-c56q4\" (UID: \"337ec784-956c-49f3-a71b-93bdd815f447\") " pod="openstack/dnsmasq-dns-698758b865-c56q4" Mar 18 12:32:36 crc kubenswrapper[4975]: I0318 12:32:36.201129 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvspw\" (UniqueName: \"kubernetes.io/projected/337ec784-956c-49f3-a71b-93bdd815f447-kube-api-access-dvspw\") pod 
\"dnsmasq-dns-698758b865-c56q4\" (UID: \"337ec784-956c-49f3-a71b-93bdd815f447\") " pod="openstack/dnsmasq-dns-698758b865-c56q4" Mar 18 12:32:36 crc kubenswrapper[4975]: I0318 12:32:36.254192 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" podUID="ba758ff3-5254-4e8b-97c8-e73570be9078" containerName="dnsmasq-dns" containerID="cri-o://cdf1b7568a709edd3a0e9a43affb9ef2cc5091e16a58264e3cca8d135ff5b746" gracePeriod=10 Mar 18 12:32:36 crc kubenswrapper[4975]: I0318 12:32:36.327761 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-c56q4" Mar 18 12:32:36 crc kubenswrapper[4975]: I0318 12:32:36.414023 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 12:32:36 crc kubenswrapper[4975]: W0318 12:32:36.842017 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod337ec784_956c_49f3_a71b_93bdd815f447.slice/crio-7e1bfd11542b73478044fdc17964d40bbadd30900e3c372c7a520e8168853c8e WatchSource:0}: Error finding container 7e1bfd11542b73478044fdc17964d40bbadd30900e3c372c7a520e8168853c8e: Status 404 returned error can't find the container with id 7e1bfd11542b73478044fdc17964d40bbadd30900e3c372c7a520e8168853c8e Mar 18 12:32:36 crc kubenswrapper[4975]: I0318 12:32:36.858091 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-c56q4"] Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.030555 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.049019 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.052376 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-s6t55" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.052602 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.052696 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.052769 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.052977 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.054083 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.196829 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba758ff3-5254-4e8b-97c8-e73570be9078-dns-svc\") pod \"ba758ff3-5254-4e8b-97c8-e73570be9078\" (UID: \"ba758ff3-5254-4e8b-97c8-e73570be9078\") " Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.197327 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba758ff3-5254-4e8b-97c8-e73570be9078-config\") pod \"ba758ff3-5254-4e8b-97c8-e73570be9078\" (UID: \"ba758ff3-5254-4e8b-97c8-e73570be9078\") " Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.197435 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba758ff3-5254-4e8b-97c8-e73570be9078-ovsdbserver-nb\") 
pod \"ba758ff3-5254-4e8b-97c8-e73570be9078\" (UID: \"ba758ff3-5254-4e8b-97c8-e73570be9078\") " Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.197491 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4bbc\" (UniqueName: \"kubernetes.io/projected/ba758ff3-5254-4e8b-97c8-e73570be9078-kube-api-access-t4bbc\") pod \"ba758ff3-5254-4e8b-97c8-e73570be9078\" (UID: \"ba758ff3-5254-4e8b-97c8-e73570be9078\") " Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.197547 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba758ff3-5254-4e8b-97c8-e73570be9078-ovsdbserver-sb\") pod \"ba758ff3-5254-4e8b-97c8-e73570be9078\" (UID: \"ba758ff3-5254-4e8b-97c8-e73570be9078\") " Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.197803 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-cache\") pod \"swift-storage-0\" (UID: \"18526bd6-7184-4e92-8bb6-f85ec1aa3f30\") " pod="openstack/swift-storage-0" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.197857 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"18526bd6-7184-4e92-8bb6-f85ec1aa3f30\") " pod="openstack/swift-storage-0" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.197993 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"18526bd6-7184-4e92-8bb6-f85ec1aa3f30\") " pod="openstack/swift-storage-0" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.198034 4975 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9ftl\" (UniqueName: \"kubernetes.io/projected/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-kube-api-access-d9ftl\") pod \"swift-storage-0\" (UID: \"18526bd6-7184-4e92-8bb6-f85ec1aa3f30\") " pod="openstack/swift-storage-0" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.198088 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-lock\") pod \"swift-storage-0\" (UID: \"18526bd6-7184-4e92-8bb6-f85ec1aa3f30\") " pod="openstack/swift-storage-0" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.198178 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-etc-swift\") pod \"swift-storage-0\" (UID: \"18526bd6-7184-4e92-8bb6-f85ec1aa3f30\") " pod="openstack/swift-storage-0" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.210072 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba758ff3-5254-4e8b-97c8-e73570be9078-kube-api-access-t4bbc" (OuterVolumeSpecName: "kube-api-access-t4bbc") pod "ba758ff3-5254-4e8b-97c8-e73570be9078" (UID: "ba758ff3-5254-4e8b-97c8-e73570be9078"). InnerVolumeSpecName "kube-api-access-t4bbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.259344 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba758ff3-5254-4e8b-97c8-e73570be9078-config" (OuterVolumeSpecName: "config") pod "ba758ff3-5254-4e8b-97c8-e73570be9078" (UID: "ba758ff3-5254-4e8b-97c8-e73570be9078"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.266324 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba758ff3-5254-4e8b-97c8-e73570be9078-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ba758ff3-5254-4e8b-97c8-e73570be9078" (UID: "ba758ff3-5254-4e8b-97c8-e73570be9078"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.266371 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba758ff3-5254-4e8b-97c8-e73570be9078-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ba758ff3-5254-4e8b-97c8-e73570be9078" (UID: "ba758ff3-5254-4e8b-97c8-e73570be9078"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.266778 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba758ff3-5254-4e8b-97c8-e73570be9078-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ba758ff3-5254-4e8b-97c8-e73570be9078" (UID: "ba758ff3-5254-4e8b-97c8-e73570be9078"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.269997 4975 generic.go:334] "Generic (PLEG): container finished" podID="337ec784-956c-49f3-a71b-93bdd815f447" containerID="1717431bdef2e1ed9d2e923c72aac56021c20bbab74d29fb46a0a02ed944a7ca" exitCode=0 Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.270069 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-c56q4" event={"ID":"337ec784-956c-49f3-a71b-93bdd815f447","Type":"ContainerDied","Data":"1717431bdef2e1ed9d2e923c72aac56021c20bbab74d29fb46a0a02ed944a7ca"} Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.270096 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-c56q4" event={"ID":"337ec784-956c-49f3-a71b-93bdd815f447","Type":"ContainerStarted","Data":"7e1bfd11542b73478044fdc17964d40bbadd30900e3c372c7a520e8168853c8e"} Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.272395 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"30f134c7-8393-40ca-8c2b-1070ea5ec68c","Type":"ContainerStarted","Data":"c8fe44e359c94642737ec1412f5493f188e24bede995c82209b6828d4c015e57"} Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.275740 4975 generic.go:334] "Generic (PLEG): container finished" podID="ba758ff3-5254-4e8b-97c8-e73570be9078" containerID="cdf1b7568a709edd3a0e9a43affb9ef2cc5091e16a58264e3cca8d135ff5b746" exitCode=0 Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.276788 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.277728 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" event={"ID":"ba758ff3-5254-4e8b-97c8-e73570be9078","Type":"ContainerDied","Data":"cdf1b7568a709edd3a0e9a43affb9ef2cc5091e16a58264e3cca8d135ff5b746"} Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.277786 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-jlxjq" event={"ID":"ba758ff3-5254-4e8b-97c8-e73570be9078","Type":"ContainerDied","Data":"37311b0d26be11ae275b89527b1c0a25f7a6c34fdd7d5c1c50436289c668708f"} Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.277805 4975 scope.go:117] "RemoveContainer" containerID="cdf1b7568a709edd3a0e9a43affb9ef2cc5091e16a58264e3cca8d135ff5b746" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.299765 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-etc-swift\") pod \"swift-storage-0\" (UID: \"18526bd6-7184-4e92-8bb6-f85ec1aa3f30\") " pod="openstack/swift-storage-0" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.299827 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-cache\") pod \"swift-storage-0\" (UID: \"18526bd6-7184-4e92-8bb6-f85ec1aa3f30\") " pod="openstack/swift-storage-0" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.299852 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"18526bd6-7184-4e92-8bb6-f85ec1aa3f30\") " pod="openstack/swift-storage-0" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.299903 4975 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"18526bd6-7184-4e92-8bb6-f85ec1aa3f30\") " pod="openstack/swift-storage-0" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.299934 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9ftl\" (UniqueName: \"kubernetes.io/projected/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-kube-api-access-d9ftl\") pod \"swift-storage-0\" (UID: \"18526bd6-7184-4e92-8bb6-f85ec1aa3f30\") " pod="openstack/swift-storage-0" Mar 18 12:32:37 crc kubenswrapper[4975]: E0318 12:32:37.299955 4975 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 12:32:37 crc kubenswrapper[4975]: E0318 12:32:37.299982 4975 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 12:32:37 crc kubenswrapper[4975]: E0318 12:32:37.300037 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-etc-swift podName:18526bd6-7184-4e92-8bb6-f85ec1aa3f30 nodeName:}" failed. No retries permitted until 2026-03-18 12:32:37.800018864 +0000 UTC m=+1343.514419443 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-etc-swift") pod "swift-storage-0" (UID: "18526bd6-7184-4e92-8bb6-f85ec1aa3f30") : configmap "swift-ring-files" not found Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.300295 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-cache\") pod \"swift-storage-0\" (UID: \"18526bd6-7184-4e92-8bb6-f85ec1aa3f30\") " pod="openstack/swift-storage-0" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.300535 4975 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"18526bd6-7184-4e92-8bb6-f85ec1aa3f30\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/swift-storage-0" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.303174 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-lock\") pod \"swift-storage-0\" (UID: \"18526bd6-7184-4e92-8bb6-f85ec1aa3f30\") " pod="openstack/swift-storage-0" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.303317 4975 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba758ff3-5254-4e8b-97c8-e73570be9078-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.303335 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba758ff3-5254-4e8b-97c8-e73570be9078-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.303347 4975 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/ba758ff3-5254-4e8b-97c8-e73570be9078-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.303359 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4bbc\" (UniqueName: \"kubernetes.io/projected/ba758ff3-5254-4e8b-97c8-e73570be9078-kube-api-access-t4bbc\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.303370 4975 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba758ff3-5254-4e8b-97c8-e73570be9078-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.310263 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-lock\") pod \"swift-storage-0\" (UID: \"18526bd6-7184-4e92-8bb6-f85ec1aa3f30\") " pod="openstack/swift-storage-0" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.315527 4975 scope.go:117] "RemoveContainer" containerID="fc83145ec1c95ab153c89ec55b35c4b52316c65fb199d979e488af706faf6a7a" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.316063 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"18526bd6-7184-4e92-8bb6-f85ec1aa3f30\") " pod="openstack/swift-storage-0" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.322664 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9ftl\" (UniqueName: \"kubernetes.io/projected/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-kube-api-access-d9ftl\") pod \"swift-storage-0\" (UID: \"18526bd6-7184-4e92-8bb6-f85ec1aa3f30\") " pod="openstack/swift-storage-0" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.323707 4975 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"18526bd6-7184-4e92-8bb6-f85ec1aa3f30\") " pod="openstack/swift-storage-0" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.329622 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-jlxjq"] Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.339336 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-jlxjq"] Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.352102 4975 scope.go:117] "RemoveContainer" containerID="cdf1b7568a709edd3a0e9a43affb9ef2cc5091e16a58264e3cca8d135ff5b746" Mar 18 12:32:37 crc kubenswrapper[4975]: E0318 12:32:37.352577 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdf1b7568a709edd3a0e9a43affb9ef2cc5091e16a58264e3cca8d135ff5b746\": container with ID starting with cdf1b7568a709edd3a0e9a43affb9ef2cc5091e16a58264e3cca8d135ff5b746 not found: ID does not exist" containerID="cdf1b7568a709edd3a0e9a43affb9ef2cc5091e16a58264e3cca8d135ff5b746" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.352621 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdf1b7568a709edd3a0e9a43affb9ef2cc5091e16a58264e3cca8d135ff5b746"} err="failed to get container status \"cdf1b7568a709edd3a0e9a43affb9ef2cc5091e16a58264e3cca8d135ff5b746\": rpc error: code = NotFound desc = could not find container \"cdf1b7568a709edd3a0e9a43affb9ef2cc5091e16a58264e3cca8d135ff5b746\": container with ID starting with cdf1b7568a709edd3a0e9a43affb9ef2cc5091e16a58264e3cca8d135ff5b746 not found: ID does not exist" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.352646 4975 scope.go:117] "RemoveContainer" 
containerID="fc83145ec1c95ab153c89ec55b35c4b52316c65fb199d979e488af706faf6a7a" Mar 18 12:32:37 crc kubenswrapper[4975]: E0318 12:32:37.353563 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc83145ec1c95ab153c89ec55b35c4b52316c65fb199d979e488af706faf6a7a\": container with ID starting with fc83145ec1c95ab153c89ec55b35c4b52316c65fb199d979e488af706faf6a7a not found: ID does not exist" containerID="fc83145ec1c95ab153c89ec55b35c4b52316c65fb199d979e488af706faf6a7a" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.353586 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc83145ec1c95ab153c89ec55b35c4b52316c65fb199d979e488af706faf6a7a"} err="failed to get container status \"fc83145ec1c95ab153c89ec55b35c4b52316c65fb199d979e488af706faf6a7a\": rpc error: code = NotFound desc = could not find container \"fc83145ec1c95ab153c89ec55b35c4b52316c65fb199d979e488af706faf6a7a\": container with ID starting with fc83145ec1c95ab153c89ec55b35c4b52316c65fb199d979e488af706faf6a7a not found: ID does not exist" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.587989 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-nbnw7"] Mar 18 12:32:37 crc kubenswrapper[4975]: E0318 12:32:37.588307 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba758ff3-5254-4e8b-97c8-e73570be9078" containerName="dnsmasq-dns" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.588320 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba758ff3-5254-4e8b-97c8-e73570be9078" containerName="dnsmasq-dns" Mar 18 12:32:37 crc kubenswrapper[4975]: E0318 12:32:37.588332 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba758ff3-5254-4e8b-97c8-e73570be9078" containerName="init" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.588337 4975 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ba758ff3-5254-4e8b-97c8-e73570be9078" containerName="init" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.588479 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba758ff3-5254-4e8b-97c8-e73570be9078" containerName="dnsmasq-dns" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.589158 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-nbnw7" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.591705 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.591911 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.592596 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.607114 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-nbnw7"] Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.634437 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-nbnw7"] Mar 18 12:32:37 crc kubenswrapper[4975]: E0318 12:32:37.635110 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-jqv7x ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-nbnw7" podUID="f632e592-0e17-4d01-8b88-1a4d59e76ebe" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.642986 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-g5bjt"] Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.644324 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-g5bjt" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.661747 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-g5bjt"] Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.721100 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f632e592-0e17-4d01-8b88-1a4d59e76ebe-scripts\") pod \"swift-ring-rebalance-nbnw7\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " pod="openstack/swift-ring-rebalance-nbnw7" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.721230 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f632e592-0e17-4d01-8b88-1a4d59e76ebe-combined-ca-bundle\") pod \"swift-ring-rebalance-nbnw7\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " pod="openstack/swift-ring-rebalance-nbnw7" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.721316 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f632e592-0e17-4d01-8b88-1a4d59e76ebe-ring-data-devices\") pod \"swift-ring-rebalance-nbnw7\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " pod="openstack/swift-ring-rebalance-nbnw7" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.721393 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqv7x\" (UniqueName: \"kubernetes.io/projected/f632e592-0e17-4d01-8b88-1a4d59e76ebe-kube-api-access-jqv7x\") pod \"swift-ring-rebalance-nbnw7\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " pod="openstack/swift-ring-rebalance-nbnw7" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.721483 4975 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f632e592-0e17-4d01-8b88-1a4d59e76ebe-etc-swift\") pod \"swift-ring-rebalance-nbnw7\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " pod="openstack/swift-ring-rebalance-nbnw7" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.721588 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f632e592-0e17-4d01-8b88-1a4d59e76ebe-swiftconf\") pod \"swift-ring-rebalance-nbnw7\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " pod="openstack/swift-ring-rebalance-nbnw7" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.721625 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f632e592-0e17-4d01-8b88-1a4d59e76ebe-dispersionconf\") pod \"swift-ring-rebalance-nbnw7\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " pod="openstack/swift-ring-rebalance-nbnw7" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.822832 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcm2l\" (UniqueName: \"kubernetes.io/projected/f0f58451-e968-467f-8d95-7a4c5104ce12-kube-api-access-tcm2l\") pod \"swift-ring-rebalance-g5bjt\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " pod="openstack/swift-ring-rebalance-g5bjt" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.822910 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f632e592-0e17-4d01-8b88-1a4d59e76ebe-dispersionconf\") pod \"swift-ring-rebalance-nbnw7\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " pod="openstack/swift-ring-rebalance-nbnw7" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.822938 4975 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f58451-e968-467f-8d95-7a4c5104ce12-combined-ca-bundle\") pod \"swift-ring-rebalance-g5bjt\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " pod="openstack/swift-ring-rebalance-g5bjt" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.822990 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f632e592-0e17-4d01-8b88-1a4d59e76ebe-scripts\") pod \"swift-ring-rebalance-nbnw7\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " pod="openstack/swift-ring-rebalance-nbnw7" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.823026 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f632e592-0e17-4d01-8b88-1a4d59e76ebe-combined-ca-bundle\") pod \"swift-ring-rebalance-nbnw7\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " pod="openstack/swift-ring-rebalance-nbnw7" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.823054 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f632e592-0e17-4d01-8b88-1a4d59e76ebe-ring-data-devices\") pod \"swift-ring-rebalance-nbnw7\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " pod="openstack/swift-ring-rebalance-nbnw7" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.823081 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f0f58451-e968-467f-8d95-7a4c5104ce12-dispersionconf\") pod \"swift-ring-rebalance-g5bjt\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " pod="openstack/swift-ring-rebalance-g5bjt" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.823116 4975 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f0f58451-e968-467f-8d95-7a4c5104ce12-swiftconf\") pod \"swift-ring-rebalance-g5bjt\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " pod="openstack/swift-ring-rebalance-g5bjt" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.823142 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-etc-swift\") pod \"swift-storage-0\" (UID: \"18526bd6-7184-4e92-8bb6-f85ec1aa3f30\") " pod="openstack/swift-storage-0" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.823177 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqv7x\" (UniqueName: \"kubernetes.io/projected/f632e592-0e17-4d01-8b88-1a4d59e76ebe-kube-api-access-jqv7x\") pod \"swift-ring-rebalance-nbnw7\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " pod="openstack/swift-ring-rebalance-nbnw7" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.823234 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f632e592-0e17-4d01-8b88-1a4d59e76ebe-etc-swift\") pod \"swift-ring-rebalance-nbnw7\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " pod="openstack/swift-ring-rebalance-nbnw7" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.823299 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f0f58451-e968-467f-8d95-7a4c5104ce12-etc-swift\") pod \"swift-ring-rebalance-g5bjt\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " pod="openstack/swift-ring-rebalance-g5bjt" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.823330 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f0f58451-e968-467f-8d95-7a4c5104ce12-ring-data-devices\") pod \"swift-ring-rebalance-g5bjt\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " pod="openstack/swift-ring-rebalance-g5bjt" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.823358 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0f58451-e968-467f-8d95-7a4c5104ce12-scripts\") pod \"swift-ring-rebalance-g5bjt\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " pod="openstack/swift-ring-rebalance-g5bjt" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.823396 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f632e592-0e17-4d01-8b88-1a4d59e76ebe-swiftconf\") pod \"swift-ring-rebalance-nbnw7\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " pod="openstack/swift-ring-rebalance-nbnw7" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.825107 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f632e592-0e17-4d01-8b88-1a4d59e76ebe-scripts\") pod \"swift-ring-rebalance-nbnw7\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " pod="openstack/swift-ring-rebalance-nbnw7" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.825348 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f632e592-0e17-4d01-8b88-1a4d59e76ebe-etc-swift\") pod \"swift-ring-rebalance-nbnw7\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " pod="openstack/swift-ring-rebalance-nbnw7" Mar 18 12:32:37 crc kubenswrapper[4975]: E0318 12:32:37.825453 4975 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 12:32:37 crc kubenswrapper[4975]: E0318 12:32:37.825472 4975 
projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 12:32:37 crc kubenswrapper[4975]: E0318 12:32:37.825520 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-etc-swift podName:18526bd6-7184-4e92-8bb6-f85ec1aa3f30 nodeName:}" failed. No retries permitted until 2026-03-18 12:32:38.825502827 +0000 UTC m=+1344.539903406 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-etc-swift") pod "swift-storage-0" (UID: "18526bd6-7184-4e92-8bb6-f85ec1aa3f30") : configmap "swift-ring-files" not found Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.826535 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f632e592-0e17-4d01-8b88-1a4d59e76ebe-ring-data-devices\") pod \"swift-ring-rebalance-nbnw7\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " pod="openstack/swift-ring-rebalance-nbnw7" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.830623 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f632e592-0e17-4d01-8b88-1a4d59e76ebe-dispersionconf\") pod \"swift-ring-rebalance-nbnw7\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " pod="openstack/swift-ring-rebalance-nbnw7" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.831332 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f632e592-0e17-4d01-8b88-1a4d59e76ebe-swiftconf\") pod \"swift-ring-rebalance-nbnw7\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " pod="openstack/swift-ring-rebalance-nbnw7" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.833577 4975 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f632e592-0e17-4d01-8b88-1a4d59e76ebe-combined-ca-bundle\") pod \"swift-ring-rebalance-nbnw7\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " pod="openstack/swift-ring-rebalance-nbnw7" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.849034 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqv7x\" (UniqueName: \"kubernetes.io/projected/f632e592-0e17-4d01-8b88-1a4d59e76ebe-kube-api-access-jqv7x\") pod \"swift-ring-rebalance-nbnw7\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " pod="openstack/swift-ring-rebalance-nbnw7" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.925011 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f0f58451-e968-467f-8d95-7a4c5104ce12-etc-swift\") pod \"swift-ring-rebalance-g5bjt\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " pod="openstack/swift-ring-rebalance-g5bjt" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.925066 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f0f58451-e968-467f-8d95-7a4c5104ce12-ring-data-devices\") pod \"swift-ring-rebalance-g5bjt\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " pod="openstack/swift-ring-rebalance-g5bjt" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.925094 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0f58451-e968-467f-8d95-7a4c5104ce12-scripts\") pod \"swift-ring-rebalance-g5bjt\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " pod="openstack/swift-ring-rebalance-g5bjt" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.925130 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcm2l\" (UniqueName: 
\"kubernetes.io/projected/f0f58451-e968-467f-8d95-7a4c5104ce12-kube-api-access-tcm2l\") pod \"swift-ring-rebalance-g5bjt\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " pod="openstack/swift-ring-rebalance-g5bjt" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.925167 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f58451-e968-467f-8d95-7a4c5104ce12-combined-ca-bundle\") pod \"swift-ring-rebalance-g5bjt\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " pod="openstack/swift-ring-rebalance-g5bjt" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.925249 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f0f58451-e968-467f-8d95-7a4c5104ce12-dispersionconf\") pod \"swift-ring-rebalance-g5bjt\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " pod="openstack/swift-ring-rebalance-g5bjt" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.925285 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f0f58451-e968-467f-8d95-7a4c5104ce12-swiftconf\") pod \"swift-ring-rebalance-g5bjt\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " pod="openstack/swift-ring-rebalance-g5bjt" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.927467 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f0f58451-e968-467f-8d95-7a4c5104ce12-etc-swift\") pod \"swift-ring-rebalance-g5bjt\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " pod="openstack/swift-ring-rebalance-g5bjt" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.927633 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0f58451-e968-467f-8d95-7a4c5104ce12-scripts\") pod \"swift-ring-rebalance-g5bjt\" 
(UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " pod="openstack/swift-ring-rebalance-g5bjt" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.928111 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f0f58451-e968-467f-8d95-7a4c5104ce12-ring-data-devices\") pod \"swift-ring-rebalance-g5bjt\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " pod="openstack/swift-ring-rebalance-g5bjt" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.929202 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f0f58451-e968-467f-8d95-7a4c5104ce12-swiftconf\") pod \"swift-ring-rebalance-g5bjt\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " pod="openstack/swift-ring-rebalance-g5bjt" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.930067 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f58451-e968-467f-8d95-7a4c5104ce12-combined-ca-bundle\") pod \"swift-ring-rebalance-g5bjt\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " pod="openstack/swift-ring-rebalance-g5bjt" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.936559 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f0f58451-e968-467f-8d95-7a4c5104ce12-dispersionconf\") pod \"swift-ring-rebalance-g5bjt\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " pod="openstack/swift-ring-rebalance-g5bjt" Mar 18 12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.961229 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcm2l\" (UniqueName: \"kubernetes.io/projected/f0f58451-e968-467f-8d95-7a4c5104ce12-kube-api-access-tcm2l\") pod \"swift-ring-rebalance-g5bjt\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " pod="openstack/swift-ring-rebalance-g5bjt" Mar 18 
12:32:37 crc kubenswrapper[4975]: I0318 12:32:37.965105 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-g5bjt" Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.285128 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"30f134c7-8393-40ca-8c2b-1070ea5ec68c","Type":"ContainerStarted","Data":"0d24c185ecf74debd2b56405027731dddb907d1f725c7bdb7c8d7b856a9d29ef"} Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.285446 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"30f134c7-8393-40ca-8c2b-1070ea5ec68c","Type":"ContainerStarted","Data":"089697c914b480bf104c431578be66355e319a510c2802b796dee98d38f0452d"} Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.288087 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-nbnw7" Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.289428 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-c56q4" event={"ID":"337ec784-956c-49f3-a71b-93bdd815f447","Type":"ContainerStarted","Data":"2dbd35310d405c295867572cf6ccec20f0019c8163a95de9a70ccdaebb7ab74f"} Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.289467 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-c56q4" Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.305181 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-nbnw7" Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.309689 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-c56q4" podStartSLOduration=3.309669638 podStartE2EDuration="3.309669638s" podCreationTimestamp="2026-03-18 12:32:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:38.308837465 +0000 UTC m=+1344.023238044" watchObservedRunningTime="2026-03-18 12:32:38.309669638 +0000 UTC m=+1344.024070217" Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.363459 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f632e592-0e17-4d01-8b88-1a4d59e76ebe-dispersionconf\") pod \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.363549 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f632e592-0e17-4d01-8b88-1a4d59e76ebe-ring-data-devices\") pod \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.363569 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f632e592-0e17-4d01-8b88-1a4d59e76ebe-scripts\") pod \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.363625 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f632e592-0e17-4d01-8b88-1a4d59e76ebe-swiftconf\") pod \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\" (UID: 
\"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.363653 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f632e592-0e17-4d01-8b88-1a4d59e76ebe-etc-swift\") pod \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.363729 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f632e592-0e17-4d01-8b88-1a4d59e76ebe-combined-ca-bundle\") pod \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.363812 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqv7x\" (UniqueName: \"kubernetes.io/projected/f632e592-0e17-4d01-8b88-1a4d59e76ebe-kube-api-access-jqv7x\") pod \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\" (UID: \"f632e592-0e17-4d01-8b88-1a4d59e76ebe\") " Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.364135 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f632e592-0e17-4d01-8b88-1a4d59e76ebe-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f632e592-0e17-4d01-8b88-1a4d59e76ebe" (UID: "f632e592-0e17-4d01-8b88-1a4d59e76ebe"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.364197 4975 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f632e592-0e17-4d01-8b88-1a4d59e76ebe-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.364326 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f632e592-0e17-4d01-8b88-1a4d59e76ebe-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f632e592-0e17-4d01-8b88-1a4d59e76ebe" (UID: "f632e592-0e17-4d01-8b88-1a4d59e76ebe"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.364517 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f632e592-0e17-4d01-8b88-1a4d59e76ebe-scripts" (OuterVolumeSpecName: "scripts") pod "f632e592-0e17-4d01-8b88-1a4d59e76ebe" (UID: "f632e592-0e17-4d01-8b88-1a4d59e76ebe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.368116 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f632e592-0e17-4d01-8b88-1a4d59e76ebe-kube-api-access-jqv7x" (OuterVolumeSpecName: "kube-api-access-jqv7x") pod "f632e592-0e17-4d01-8b88-1a4d59e76ebe" (UID: "f632e592-0e17-4d01-8b88-1a4d59e76ebe"). InnerVolumeSpecName "kube-api-access-jqv7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.368120 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f632e592-0e17-4d01-8b88-1a4d59e76ebe-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f632e592-0e17-4d01-8b88-1a4d59e76ebe" (UID: "f632e592-0e17-4d01-8b88-1a4d59e76ebe"). 
InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.368201 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f632e592-0e17-4d01-8b88-1a4d59e76ebe-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f632e592-0e17-4d01-8b88-1a4d59e76ebe" (UID: "f632e592-0e17-4d01-8b88-1a4d59e76ebe"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.368233 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f632e592-0e17-4d01-8b88-1a4d59e76ebe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f632e592-0e17-4d01-8b88-1a4d59e76ebe" (UID: "f632e592-0e17-4d01-8b88-1a4d59e76ebe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:38 crc kubenswrapper[4975]: W0318 12:32:38.416379 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0f58451_e968_467f_8d95_7a4c5104ce12.slice/crio-917625031b795ba1362b3c37aa959d0cf6b5b83587b095218c822ae7b15a0684 WatchSource:0}: Error finding container 917625031b795ba1362b3c37aa959d0cf6b5b83587b095218c822ae7b15a0684: Status 404 returned error can't find the container with id 917625031b795ba1362b3c37aa959d0cf6b5b83587b095218c822ae7b15a0684 Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.417036 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-g5bjt"] Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.465743 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqv7x\" (UniqueName: \"kubernetes.io/projected/f632e592-0e17-4d01-8b88-1a4d59e76ebe-kube-api-access-jqv7x\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 
12:32:38.466092 4975 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f632e592-0e17-4d01-8b88-1a4d59e76ebe-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.466105 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f632e592-0e17-4d01-8b88-1a4d59e76ebe-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.466114 4975 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f632e592-0e17-4d01-8b88-1a4d59e76ebe-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.466124 4975 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f632e592-0e17-4d01-8b88-1a4d59e76ebe-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.466134 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f632e592-0e17-4d01-8b88-1a4d59e76ebe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:38 crc kubenswrapper[4975]: I0318 12:32:38.872215 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-etc-swift\") pod \"swift-storage-0\" (UID: \"18526bd6-7184-4e92-8bb6-f85ec1aa3f30\") " pod="openstack/swift-storage-0" Mar 18 12:32:38 crc kubenswrapper[4975]: E0318 12:32:38.872394 4975 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 12:32:38 crc kubenswrapper[4975]: E0318 12:32:38.872589 4975 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found 
Mar 18 12:32:38 crc kubenswrapper[4975]: E0318 12:32:38.872642 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-etc-swift podName:18526bd6-7184-4e92-8bb6-f85ec1aa3f30 nodeName:}" failed. No retries permitted until 2026-03-18 12:32:40.872627726 +0000 UTC m=+1346.587028305 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-etc-swift") pod "swift-storage-0" (UID: "18526bd6-7184-4e92-8bb6-f85ec1aa3f30") : configmap "swift-ring-files" not found Mar 18 12:32:39 crc kubenswrapper[4975]: I0318 12:32:39.025230 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba758ff3-5254-4e8b-97c8-e73570be9078" path="/var/lib/kubelet/pods/ba758ff3-5254-4e8b-97c8-e73570be9078/volumes" Mar 18 12:32:39 crc kubenswrapper[4975]: I0318 12:32:39.298898 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-g5bjt" event={"ID":"f0f58451-e968-467f-8d95-7a4c5104ce12","Type":"ContainerStarted","Data":"917625031b795ba1362b3c37aa959d0cf6b5b83587b095218c822ae7b15a0684"} Mar 18 12:32:39 crc kubenswrapper[4975]: I0318 12:32:39.299671 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-nbnw7" Mar 18 12:32:39 crc kubenswrapper[4975]: I0318 12:32:39.300425 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 18 12:32:39 crc kubenswrapper[4975]: I0318 12:32:39.320124 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.857269145 podStartE2EDuration="4.320103652s" podCreationTimestamp="2026-03-18 12:32:35 +0000 UTC" firstStartedPulling="2026-03-18 12:32:36.435840792 +0000 UTC m=+1342.150241371" lastFinishedPulling="2026-03-18 12:32:37.898675299 +0000 UTC m=+1343.613075878" observedRunningTime="2026-03-18 12:32:39.318512208 +0000 UTC m=+1345.032912797" watchObservedRunningTime="2026-03-18 12:32:39.320103652 +0000 UTC m=+1345.034504231" Mar 18 12:32:39 crc kubenswrapper[4975]: I0318 12:32:39.333954 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:39 crc kubenswrapper[4975]: I0318 12:32:39.351129 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-nbnw7"] Mar 18 12:32:39 crc kubenswrapper[4975]: I0318 12:32:39.356257 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-nbnw7"] Mar 18 12:32:39 crc kubenswrapper[4975]: I0318 12:32:39.411937 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 18 12:32:40 crc kubenswrapper[4975]: I0318 12:32:40.903634 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-etc-swift\") pod \"swift-storage-0\" (UID: \"18526bd6-7184-4e92-8bb6-f85ec1aa3f30\") " pod="openstack/swift-storage-0" Mar 18 12:32:40 crc kubenswrapper[4975]: E0318 12:32:40.903853 4975 projected.go:288] Couldn't get configMap 
openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 12:32:40 crc kubenswrapper[4975]: E0318 12:32:40.904238 4975 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 12:32:40 crc kubenswrapper[4975]: E0318 12:32:40.904332 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-etc-swift podName:18526bd6-7184-4e92-8bb6-f85ec1aa3f30 nodeName:}" failed. No retries permitted until 2026-03-18 12:32:44.904307581 +0000 UTC m=+1350.618708160 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-etc-swift") pod "swift-storage-0" (UID: "18526bd6-7184-4e92-8bb6-f85ec1aa3f30") : configmap "swift-ring-files" not found Mar 18 12:32:41 crc kubenswrapper[4975]: I0318 12:32:41.028351 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f632e592-0e17-4d01-8b88-1a4d59e76ebe" path="/var/lib/kubelet/pods/f632e592-0e17-4d01-8b88-1a4d59e76ebe/volumes" Mar 18 12:32:41 crc kubenswrapper[4975]: I0318 12:32:41.889765 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-fxwlg"] Mar 18 12:32:41 crc kubenswrapper[4975]: I0318 12:32:41.891082 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-fxwlg" Mar 18 12:32:41 crc kubenswrapper[4975]: I0318 12:32:41.893294 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 18 12:32:41 crc kubenswrapper[4975]: I0318 12:32:41.897851 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fxwlg"] Mar 18 12:32:41 crc kubenswrapper[4975]: I0318 12:32:41.902609 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 18 12:32:41 crc kubenswrapper[4975]: I0318 12:32:41.902676 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 18 12:32:41 crc kubenswrapper[4975]: I0318 12:32:41.973961 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 18 12:32:42 crc kubenswrapper[4975]: I0318 12:32:42.020227 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4760d40e-f9c9-4952-9773-3af1740da52a-operator-scripts\") pod \"root-account-create-update-fxwlg\" (UID: \"4760d40e-f9c9-4952-9773-3af1740da52a\") " pod="openstack/root-account-create-update-fxwlg" Mar 18 12:32:42 crc kubenswrapper[4975]: I0318 12:32:42.020552 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7j7l\" (UniqueName: \"kubernetes.io/projected/4760d40e-f9c9-4952-9773-3af1740da52a-kube-api-access-x7j7l\") pod \"root-account-create-update-fxwlg\" (UID: \"4760d40e-f9c9-4952-9773-3af1740da52a\") " pod="openstack/root-account-create-update-fxwlg" Mar 18 12:32:42 crc kubenswrapper[4975]: I0318 12:32:42.122564 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7j7l\" (UniqueName: 
\"kubernetes.io/projected/4760d40e-f9c9-4952-9773-3af1740da52a-kube-api-access-x7j7l\") pod \"root-account-create-update-fxwlg\" (UID: \"4760d40e-f9c9-4952-9773-3af1740da52a\") " pod="openstack/root-account-create-update-fxwlg" Mar 18 12:32:42 crc kubenswrapper[4975]: I0318 12:32:42.122659 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4760d40e-f9c9-4952-9773-3af1740da52a-operator-scripts\") pod \"root-account-create-update-fxwlg\" (UID: \"4760d40e-f9c9-4952-9773-3af1740da52a\") " pod="openstack/root-account-create-update-fxwlg" Mar 18 12:32:42 crc kubenswrapper[4975]: I0318 12:32:42.124347 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4760d40e-f9c9-4952-9773-3af1740da52a-operator-scripts\") pod \"root-account-create-update-fxwlg\" (UID: \"4760d40e-f9c9-4952-9773-3af1740da52a\") " pod="openstack/root-account-create-update-fxwlg" Mar 18 12:32:42 crc kubenswrapper[4975]: I0318 12:32:42.146486 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7j7l\" (UniqueName: \"kubernetes.io/projected/4760d40e-f9c9-4952-9773-3af1740da52a-kube-api-access-x7j7l\") pod \"root-account-create-update-fxwlg\" (UID: \"4760d40e-f9c9-4952-9773-3af1740da52a\") " pod="openstack/root-account-create-update-fxwlg" Mar 18 12:32:42 crc kubenswrapper[4975]: I0318 12:32:42.213882 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-fxwlg" Mar 18 12:32:42 crc kubenswrapper[4975]: I0318 12:32:42.436905 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 18 12:32:42 crc kubenswrapper[4975]: I0318 12:32:42.667844 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fxwlg"] Mar 18 12:32:42 crc kubenswrapper[4975]: W0318 12:32:42.678452 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4760d40e_f9c9_4952_9773_3af1740da52a.slice/crio-2b3eb6479c4a7c976c1635439da644227d87bb1112edd02a191496b426deea2c WatchSource:0}: Error finding container 2b3eb6479c4a7c976c1635439da644227d87bb1112edd02a191496b426deea2c: Status 404 returned error can't find the container with id 2b3eb6479c4a7c976c1635439da644227d87bb1112edd02a191496b426deea2c Mar 18 12:32:43 crc kubenswrapper[4975]: I0318 12:32:43.335027 4975 generic.go:334] "Generic (PLEG): container finished" podID="4760d40e-f9c9-4952-9773-3af1740da52a" containerID="0c0bd55712354453b90292fa5a312a6eafba86a9a67e8a425b79178102fd066e" exitCode=0 Mar 18 12:32:43 crc kubenswrapper[4975]: I0318 12:32:43.335070 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fxwlg" event={"ID":"4760d40e-f9c9-4952-9773-3af1740da52a","Type":"ContainerDied","Data":"0c0bd55712354453b90292fa5a312a6eafba86a9a67e8a425b79178102fd066e"} Mar 18 12:32:43 crc kubenswrapper[4975]: I0318 12:32:43.335111 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fxwlg" event={"ID":"4760d40e-f9c9-4952-9773-3af1740da52a","Type":"ContainerStarted","Data":"2b3eb6479c4a7c976c1635439da644227d87bb1112edd02a191496b426deea2c"} Mar 18 12:32:43 crc kubenswrapper[4975]: I0318 12:32:43.557614 4975 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-db-create-s4mlz"] Mar 18 12:32:43 crc kubenswrapper[4975]: I0318 12:32:43.559046 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-s4mlz" Mar 18 12:32:43 crc kubenswrapper[4975]: I0318 12:32:43.564775 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-s4mlz"] Mar 18 12:32:43 crc kubenswrapper[4975]: I0318 12:32:43.661396 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3f9f8bf-5111-465b-b099-5ea2b374ddc9-operator-scripts\") pod \"glance-db-create-s4mlz\" (UID: \"b3f9f8bf-5111-465b-b099-5ea2b374ddc9\") " pod="openstack/glance-db-create-s4mlz" Mar 18 12:32:43 crc kubenswrapper[4975]: I0318 12:32:43.661807 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhvkz\" (UniqueName: \"kubernetes.io/projected/b3f9f8bf-5111-465b-b099-5ea2b374ddc9-kube-api-access-zhvkz\") pod \"glance-db-create-s4mlz\" (UID: \"b3f9f8bf-5111-465b-b099-5ea2b374ddc9\") " pod="openstack/glance-db-create-s4mlz" Mar 18 12:32:43 crc kubenswrapper[4975]: I0318 12:32:43.740100 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-2797-account-create-update-l44sp"] Mar 18 12:32:43 crc kubenswrapper[4975]: I0318 12:32:43.741384 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2797-account-create-update-l44sp" Mar 18 12:32:43 crc kubenswrapper[4975]: I0318 12:32:43.743934 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 18 12:32:43 crc kubenswrapper[4975]: I0318 12:32:43.746441 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2797-account-create-update-l44sp"] Mar 18 12:32:43 crc kubenswrapper[4975]: I0318 12:32:43.764086 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhvkz\" (UniqueName: \"kubernetes.io/projected/b3f9f8bf-5111-465b-b099-5ea2b374ddc9-kube-api-access-zhvkz\") pod \"glance-db-create-s4mlz\" (UID: \"b3f9f8bf-5111-465b-b099-5ea2b374ddc9\") " pod="openstack/glance-db-create-s4mlz" Mar 18 12:32:43 crc kubenswrapper[4975]: I0318 12:32:43.764139 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3f9f8bf-5111-465b-b099-5ea2b374ddc9-operator-scripts\") pod \"glance-db-create-s4mlz\" (UID: \"b3f9f8bf-5111-465b-b099-5ea2b374ddc9\") " pod="openstack/glance-db-create-s4mlz" Mar 18 12:32:43 crc kubenswrapper[4975]: I0318 12:32:43.764968 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3f9f8bf-5111-465b-b099-5ea2b374ddc9-operator-scripts\") pod \"glance-db-create-s4mlz\" (UID: \"b3f9f8bf-5111-465b-b099-5ea2b374ddc9\") " pod="openstack/glance-db-create-s4mlz" Mar 18 12:32:43 crc kubenswrapper[4975]: I0318 12:32:43.784267 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhvkz\" (UniqueName: \"kubernetes.io/projected/b3f9f8bf-5111-465b-b099-5ea2b374ddc9-kube-api-access-zhvkz\") pod \"glance-db-create-s4mlz\" (UID: \"b3f9f8bf-5111-465b-b099-5ea2b374ddc9\") " pod="openstack/glance-db-create-s4mlz" Mar 18 12:32:43 crc kubenswrapper[4975]: I0318 
12:32:43.873321 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vrtb\" (UniqueName: \"kubernetes.io/projected/fed45251-cc94-42f5-962c-d82dbd50b421-kube-api-access-9vrtb\") pod \"glance-2797-account-create-update-l44sp\" (UID: \"fed45251-cc94-42f5-962c-d82dbd50b421\") " pod="openstack/glance-2797-account-create-update-l44sp" Mar 18 12:32:43 crc kubenswrapper[4975]: I0318 12:32:43.873824 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fed45251-cc94-42f5-962c-d82dbd50b421-operator-scripts\") pod \"glance-2797-account-create-update-l44sp\" (UID: \"fed45251-cc94-42f5-962c-d82dbd50b421\") " pod="openstack/glance-2797-account-create-update-l44sp" Mar 18 12:32:43 crc kubenswrapper[4975]: I0318 12:32:43.877549 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-s4mlz" Mar 18 12:32:43 crc kubenswrapper[4975]: I0318 12:32:43.975479 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vrtb\" (UniqueName: \"kubernetes.io/projected/fed45251-cc94-42f5-962c-d82dbd50b421-kube-api-access-9vrtb\") pod \"glance-2797-account-create-update-l44sp\" (UID: \"fed45251-cc94-42f5-962c-d82dbd50b421\") " pod="openstack/glance-2797-account-create-update-l44sp" Mar 18 12:32:43 crc kubenswrapper[4975]: I0318 12:32:43.975619 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fed45251-cc94-42f5-962c-d82dbd50b421-operator-scripts\") pod \"glance-2797-account-create-update-l44sp\" (UID: \"fed45251-cc94-42f5-962c-d82dbd50b421\") " pod="openstack/glance-2797-account-create-update-l44sp" Mar 18 12:32:43 crc kubenswrapper[4975]: I0318 12:32:43.976321 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fed45251-cc94-42f5-962c-d82dbd50b421-operator-scripts\") pod \"glance-2797-account-create-update-l44sp\" (UID: \"fed45251-cc94-42f5-962c-d82dbd50b421\") " pod="openstack/glance-2797-account-create-update-l44sp" Mar 18 12:32:43 crc kubenswrapper[4975]: I0318 12:32:43.996663 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vrtb\" (UniqueName: \"kubernetes.io/projected/fed45251-cc94-42f5-962c-d82dbd50b421-kube-api-access-9vrtb\") pod \"glance-2797-account-create-update-l44sp\" (UID: \"fed45251-cc94-42f5-962c-d82dbd50b421\") " pod="openstack/glance-2797-account-create-update-l44sp" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.060981 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2797-account-create-update-l44sp" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.368250 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-259gn"] Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.369551 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-259gn" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.380088 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-259gn"] Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.475647 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-9e84-account-create-update-sn9bw"] Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.477007 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9e84-account-create-update-sn9bw" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.479896 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.483458 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/973c781e-29cc-4d02-a22e-1b13a1a18a94-operator-scripts\") pod \"keystone-db-create-259gn\" (UID: \"973c781e-29cc-4d02-a22e-1b13a1a18a94\") " pod="openstack/keystone-db-create-259gn" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.483548 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74gg8\" (UniqueName: \"kubernetes.io/projected/973c781e-29cc-4d02-a22e-1b13a1a18a94-kube-api-access-74gg8\") pod \"keystone-db-create-259gn\" (UID: \"973c781e-29cc-4d02-a22e-1b13a1a18a94\") " pod="openstack/keystone-db-create-259gn" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.484037 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9e84-account-create-update-sn9bw"] Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.585439 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74gg8\" (UniqueName: \"kubernetes.io/projected/973c781e-29cc-4d02-a22e-1b13a1a18a94-kube-api-access-74gg8\") pod \"keystone-db-create-259gn\" (UID: \"973c781e-29cc-4d02-a22e-1b13a1a18a94\") " pod="openstack/keystone-db-create-259gn" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.585596 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5797d8e6-add0-482e-ab94-24df08d4da60-operator-scripts\") pod \"keystone-9e84-account-create-update-sn9bw\" (UID: 
\"5797d8e6-add0-482e-ab94-24df08d4da60\") " pod="openstack/keystone-9e84-account-create-update-sn9bw" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.585655 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/973c781e-29cc-4d02-a22e-1b13a1a18a94-operator-scripts\") pod \"keystone-db-create-259gn\" (UID: \"973c781e-29cc-4d02-a22e-1b13a1a18a94\") " pod="openstack/keystone-db-create-259gn" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.585694 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5qcm\" (UniqueName: \"kubernetes.io/projected/5797d8e6-add0-482e-ab94-24df08d4da60-kube-api-access-n5qcm\") pod \"keystone-9e84-account-create-update-sn9bw\" (UID: \"5797d8e6-add0-482e-ab94-24df08d4da60\") " pod="openstack/keystone-9e84-account-create-update-sn9bw" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.586874 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/973c781e-29cc-4d02-a22e-1b13a1a18a94-operator-scripts\") pod \"keystone-db-create-259gn\" (UID: \"973c781e-29cc-4d02-a22e-1b13a1a18a94\") " pod="openstack/keystone-db-create-259gn" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.622984 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74gg8\" (UniqueName: \"kubernetes.io/projected/973c781e-29cc-4d02-a22e-1b13a1a18a94-kube-api-access-74gg8\") pod \"keystone-db-create-259gn\" (UID: \"973c781e-29cc-4d02-a22e-1b13a1a18a94\") " pod="openstack/keystone-db-create-259gn" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.678718 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-s8rhn"] Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.680095 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-s8rhn" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.686670 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5qcm\" (UniqueName: \"kubernetes.io/projected/5797d8e6-add0-482e-ab94-24df08d4da60-kube-api-access-n5qcm\") pod \"keystone-9e84-account-create-update-sn9bw\" (UID: \"5797d8e6-add0-482e-ab94-24df08d4da60\") " pod="openstack/keystone-9e84-account-create-update-sn9bw" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.686793 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5797d8e6-add0-482e-ab94-24df08d4da60-operator-scripts\") pod \"keystone-9e84-account-create-update-sn9bw\" (UID: \"5797d8e6-add0-482e-ab94-24df08d4da60\") " pod="openstack/keystone-9e84-account-create-update-sn9bw" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.687814 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5797d8e6-add0-482e-ab94-24df08d4da60-operator-scripts\") pod \"keystone-9e84-account-create-update-sn9bw\" (UID: \"5797d8e6-add0-482e-ab94-24df08d4da60\") " pod="openstack/keystone-9e84-account-create-update-sn9bw" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.700707 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-259gn" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.707320 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-3b17-account-create-update-rkbqb"] Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.708694 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3b17-account-create-update-rkbqb" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.710503 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.714555 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5qcm\" (UniqueName: \"kubernetes.io/projected/5797d8e6-add0-482e-ab94-24df08d4da60-kube-api-access-n5qcm\") pod \"keystone-9e84-account-create-update-sn9bw\" (UID: \"5797d8e6-add0-482e-ab94-24df08d4da60\") " pod="openstack/keystone-9e84-account-create-update-sn9bw" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.723365 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-s8rhn"] Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.732695 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3b17-account-create-update-rkbqb"] Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.788694 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtzwz\" (UniqueName: \"kubernetes.io/projected/226bdbb1-a2c1-4bdf-a509-2aaed024a33e-kube-api-access-wtzwz\") pod \"placement-db-create-s8rhn\" (UID: \"226bdbb1-a2c1-4bdf-a509-2aaed024a33e\") " pod="openstack/placement-db-create-s8rhn" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.788791 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/226bdbb1-a2c1-4bdf-a509-2aaed024a33e-operator-scripts\") pod \"placement-db-create-s8rhn\" (UID: \"226bdbb1-a2c1-4bdf-a509-2aaed024a33e\") " pod="openstack/placement-db-create-s8rhn" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.802065 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9e84-account-create-update-sn9bw" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.890348 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82dsz\" (UniqueName: \"kubernetes.io/projected/3a114ef7-ae0b-4502-86f6-5cbacd642fff-kube-api-access-82dsz\") pod \"placement-3b17-account-create-update-rkbqb\" (UID: \"3a114ef7-ae0b-4502-86f6-5cbacd642fff\") " pod="openstack/placement-3b17-account-create-update-rkbqb" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.890409 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtzwz\" (UniqueName: \"kubernetes.io/projected/226bdbb1-a2c1-4bdf-a509-2aaed024a33e-kube-api-access-wtzwz\") pod \"placement-db-create-s8rhn\" (UID: \"226bdbb1-a2c1-4bdf-a509-2aaed024a33e\") " pod="openstack/placement-db-create-s8rhn" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.890509 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/226bdbb1-a2c1-4bdf-a509-2aaed024a33e-operator-scripts\") pod \"placement-db-create-s8rhn\" (UID: \"226bdbb1-a2c1-4bdf-a509-2aaed024a33e\") " pod="openstack/placement-db-create-s8rhn" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.890566 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a114ef7-ae0b-4502-86f6-5cbacd642fff-operator-scripts\") pod \"placement-3b17-account-create-update-rkbqb\" (UID: \"3a114ef7-ae0b-4502-86f6-5cbacd642fff\") " pod="openstack/placement-3b17-account-create-update-rkbqb" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.891232 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/226bdbb1-a2c1-4bdf-a509-2aaed024a33e-operator-scripts\") pod \"placement-db-create-s8rhn\" (UID: \"226bdbb1-a2c1-4bdf-a509-2aaed024a33e\") " pod="openstack/placement-db-create-s8rhn" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.907063 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtzwz\" (UniqueName: \"kubernetes.io/projected/226bdbb1-a2c1-4bdf-a509-2aaed024a33e-kube-api-access-wtzwz\") pod \"placement-db-create-s8rhn\" (UID: \"226bdbb1-a2c1-4bdf-a509-2aaed024a33e\") " pod="openstack/placement-db-create-s8rhn" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.991963 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82dsz\" (UniqueName: \"kubernetes.io/projected/3a114ef7-ae0b-4502-86f6-5cbacd642fff-kube-api-access-82dsz\") pod \"placement-3b17-account-create-update-rkbqb\" (UID: \"3a114ef7-ae0b-4502-86f6-5cbacd642fff\") " pod="openstack/placement-3b17-account-create-update-rkbqb" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.992037 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a114ef7-ae0b-4502-86f6-5cbacd642fff-operator-scripts\") pod \"placement-3b17-account-create-update-rkbqb\" (UID: \"3a114ef7-ae0b-4502-86f6-5cbacd642fff\") " pod="openstack/placement-3b17-account-create-update-rkbqb" Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.992072 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-etc-swift\") pod \"swift-storage-0\" (UID: \"18526bd6-7184-4e92-8bb6-f85ec1aa3f30\") " pod="openstack/swift-storage-0" Mar 18 12:32:44 crc kubenswrapper[4975]: E0318 12:32:44.992294 4975 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 12:32:44 crc 
kubenswrapper[4975]: E0318 12:32:44.992308 4975 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 12:32:44 crc kubenswrapper[4975]: E0318 12:32:44.992341 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-etc-swift podName:18526bd6-7184-4e92-8bb6-f85ec1aa3f30 nodeName:}" failed. No retries permitted until 2026-03-18 12:32:52.992329417 +0000 UTC m=+1358.706729986 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-etc-swift") pod "swift-storage-0" (UID: "18526bd6-7184-4e92-8bb6-f85ec1aa3f30") : configmap "swift-ring-files" not found Mar 18 12:32:44 crc kubenswrapper[4975]: I0318 12:32:44.992952 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a114ef7-ae0b-4502-86f6-5cbacd642fff-operator-scripts\") pod \"placement-3b17-account-create-update-rkbqb\" (UID: \"3a114ef7-ae0b-4502-86f6-5cbacd642fff\") " pod="openstack/placement-3b17-account-create-update-rkbqb" Mar 18 12:32:45 crc kubenswrapper[4975]: I0318 12:32:45.000851 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-s8rhn" Mar 18 12:32:45 crc kubenswrapper[4975]: I0318 12:32:45.013240 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82dsz\" (UniqueName: \"kubernetes.io/projected/3a114ef7-ae0b-4502-86f6-5cbacd642fff-kube-api-access-82dsz\") pod \"placement-3b17-account-create-update-rkbqb\" (UID: \"3a114ef7-ae0b-4502-86f6-5cbacd642fff\") " pod="openstack/placement-3b17-account-create-update-rkbqb" Mar 18 12:32:45 crc kubenswrapper[4975]: I0318 12:32:45.057620 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3b17-account-create-update-rkbqb" Mar 18 12:32:45 crc kubenswrapper[4975]: I0318 12:32:45.912835 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fxwlg" Mar 18 12:32:46 crc kubenswrapper[4975]: I0318 12:32:46.011323 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4760d40e-f9c9-4952-9773-3af1740da52a-operator-scripts\") pod \"4760d40e-f9c9-4952-9773-3af1740da52a\" (UID: \"4760d40e-f9c9-4952-9773-3af1740da52a\") " Mar 18 12:32:46 crc kubenswrapper[4975]: I0318 12:32:46.011719 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7j7l\" (UniqueName: \"kubernetes.io/projected/4760d40e-f9c9-4952-9773-3af1740da52a-kube-api-access-x7j7l\") pod \"4760d40e-f9c9-4952-9773-3af1740da52a\" (UID: \"4760d40e-f9c9-4952-9773-3af1740da52a\") " Mar 18 12:32:46 crc kubenswrapper[4975]: I0318 12:32:46.012703 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4760d40e-f9c9-4952-9773-3af1740da52a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4760d40e-f9c9-4952-9773-3af1740da52a" (UID: "4760d40e-f9c9-4952-9773-3af1740da52a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:46 crc kubenswrapper[4975]: I0318 12:32:46.020168 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4760d40e-f9c9-4952-9773-3af1740da52a-kube-api-access-x7j7l" (OuterVolumeSpecName: "kube-api-access-x7j7l") pod "4760d40e-f9c9-4952-9773-3af1740da52a" (UID: "4760d40e-f9c9-4952-9773-3af1740da52a"). InnerVolumeSpecName "kube-api-access-x7j7l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:46 crc kubenswrapper[4975]: I0318 12:32:46.114514 4975 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4760d40e-f9c9-4952-9773-3af1740da52a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:46 crc kubenswrapper[4975]: I0318 12:32:46.114567 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7j7l\" (UniqueName: \"kubernetes.io/projected/4760d40e-f9c9-4952-9773-3af1740da52a-kube-api-access-x7j7l\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:46 crc kubenswrapper[4975]: I0318 12:32:46.329052 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-c56q4" Mar 18 12:32:46 crc kubenswrapper[4975]: I0318 12:32:46.361284 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-s4mlz"] Mar 18 12:32:46 crc kubenswrapper[4975]: W0318 12:32:46.369788 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3f9f8bf_5111_465b_b099_5ea2b374ddc9.slice/crio-95757178e5e2bfb82192d7039a16ed2ab9674782aa92671404e2fc5a60afe0d9 WatchSource:0}: Error finding container 95757178e5e2bfb82192d7039a16ed2ab9674782aa92671404e2fc5a60afe0d9: Status 404 returned error can't find the container with id 95757178e5e2bfb82192d7039a16ed2ab9674782aa92671404e2fc5a60afe0d9 Mar 18 12:32:46 crc kubenswrapper[4975]: I0318 12:32:46.428310 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fxwlg" event={"ID":"4760d40e-f9c9-4952-9773-3af1740da52a","Type":"ContainerDied","Data":"2b3eb6479c4a7c976c1635439da644227d87bb1112edd02a191496b426deea2c"} Mar 18 12:32:46 crc kubenswrapper[4975]: I0318 12:32:46.428358 4975 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="2b3eb6479c4a7c976c1635439da644227d87bb1112edd02a191496b426deea2c" Mar 18 12:32:46 crc kubenswrapper[4975]: I0318 12:32:46.428445 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fxwlg" Mar 18 12:32:46 crc kubenswrapper[4975]: I0318 12:32:46.441481 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-g5bjt" event={"ID":"f0f58451-e968-467f-8d95-7a4c5104ce12","Type":"ContainerStarted","Data":"24412bc7ab16647100cbf02dd9516a91a3c0232c945d1610ba6eb2e2364b9baf"} Mar 18 12:32:46 crc kubenswrapper[4975]: I0318 12:32:46.471436 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jp47c"] Mar 18 12:32:46 crc kubenswrapper[4975]: I0318 12:32:46.471737 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-jp47c" podUID="2e52b66d-395e-4753-9c2b-b271c2dedf36" containerName="dnsmasq-dns" containerID="cri-o://e3b35411eef529a5c14514cc07f0f7048c30c4a62a7ee80164fa29051cf3639d" gracePeriod=10 Mar 18 12:32:46 crc kubenswrapper[4975]: I0318 12:32:46.480535 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9e84-account-create-update-sn9bw"] Mar 18 12:32:46 crc kubenswrapper[4975]: I0318 12:32:46.495506 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-s8rhn"] Mar 18 12:32:46 crc kubenswrapper[4975]: I0318 12:32:46.503345 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3b17-account-create-update-rkbqb"] Mar 18 12:32:46 crc kubenswrapper[4975]: I0318 12:32:46.503605 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-g5bjt" podStartSLOduration=1.994578983 podStartE2EDuration="9.503592858s" podCreationTimestamp="2026-03-18 12:32:37 +0000 UTC" firstStartedPulling="2026-03-18 12:32:38.41934849 +0000 UTC 
m=+1344.133749069" lastFinishedPulling="2026-03-18 12:32:45.928362365 +0000 UTC m=+1351.642762944" observedRunningTime="2026-03-18 12:32:46.488263278 +0000 UTC m=+1352.202663857" watchObservedRunningTime="2026-03-18 12:32:46.503592858 +0000 UTC m=+1352.217993447" Mar 18 12:32:46 crc kubenswrapper[4975]: I0318 12:32:46.685019 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-259gn"] Mar 18 12:32:46 crc kubenswrapper[4975]: I0318 12:32:46.697164 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2797-account-create-update-l44sp"] Mar 18 12:32:46 crc kubenswrapper[4975]: W0318 12:32:46.704733 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfed45251_cc94_42f5_962c_d82dbd50b421.slice/crio-2d15c959618a8cfacc4fce141764c64bf2fdd589162e4506da0abfd9912dfe6d WatchSource:0}: Error finding container 2d15c959618a8cfacc4fce141764c64bf2fdd589162e4506da0abfd9912dfe6d: Status 404 returned error can't find the container with id 2d15c959618a8cfacc4fce141764c64bf2fdd589162e4506da0abfd9912dfe6d Mar 18 12:32:46 crc kubenswrapper[4975]: W0318 12:32:46.716459 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod973c781e_29cc_4d02_a22e_1b13a1a18a94.slice/crio-5fd0106b579f9e1fd31a8989e0dc5a6ccdfb6e01bc6433985950ac5ff340f54c WatchSource:0}: Error finding container 5fd0106b579f9e1fd31a8989e0dc5a6ccdfb6e01bc6433985950ac5ff340f54c: Status 404 returned error can't find the container with id 5fd0106b579f9e1fd31a8989e0dc5a6ccdfb6e01bc6433985950ac5ff340f54c Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.336334 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jp47c" Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.443698 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e52b66d-395e-4753-9c2b-b271c2dedf36-dns-svc\") pod \"2e52b66d-395e-4753-9c2b-b271c2dedf36\" (UID: \"2e52b66d-395e-4753-9c2b-b271c2dedf36\") " Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.443889 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xzft\" (UniqueName: \"kubernetes.io/projected/2e52b66d-395e-4753-9c2b-b271c2dedf36-kube-api-access-5xzft\") pod \"2e52b66d-395e-4753-9c2b-b271c2dedf36\" (UID: \"2e52b66d-395e-4753-9c2b-b271c2dedf36\") " Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.443937 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e52b66d-395e-4753-9c2b-b271c2dedf36-config\") pod \"2e52b66d-395e-4753-9c2b-b271c2dedf36\" (UID: \"2e52b66d-395e-4753-9c2b-b271c2dedf36\") " Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.450419 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e52b66d-395e-4753-9c2b-b271c2dedf36-kube-api-access-5xzft" (OuterVolumeSpecName: "kube-api-access-5xzft") pod "2e52b66d-395e-4753-9c2b-b271c2dedf36" (UID: "2e52b66d-395e-4753-9c2b-b271c2dedf36"). InnerVolumeSpecName "kube-api-access-5xzft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.460289 4975 generic.go:334] "Generic (PLEG): container finished" podID="fed45251-cc94-42f5-962c-d82dbd50b421" containerID="eb47ff94ddb709711a5e754dd2b67ab8628a9790f93a6ba0c102c759803cfc5d" exitCode=0 Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.460412 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2797-account-create-update-l44sp" event={"ID":"fed45251-cc94-42f5-962c-d82dbd50b421","Type":"ContainerDied","Data":"eb47ff94ddb709711a5e754dd2b67ab8628a9790f93a6ba0c102c759803cfc5d"} Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.460461 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2797-account-create-update-l44sp" event={"ID":"fed45251-cc94-42f5-962c-d82dbd50b421","Type":"ContainerStarted","Data":"2d15c959618a8cfacc4fce141764c64bf2fdd589162e4506da0abfd9912dfe6d"} Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.467105 4975 generic.go:334] "Generic (PLEG): container finished" podID="2e52b66d-395e-4753-9c2b-b271c2dedf36" containerID="e3b35411eef529a5c14514cc07f0f7048c30c4a62a7ee80164fa29051cf3639d" exitCode=0 Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.467184 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jp47c" event={"ID":"2e52b66d-395e-4753-9c2b-b271c2dedf36","Type":"ContainerDied","Data":"e3b35411eef529a5c14514cc07f0f7048c30c4a62a7ee80164fa29051cf3639d"} Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.467220 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jp47c" event={"ID":"2e52b66d-395e-4753-9c2b-b271c2dedf36","Type":"ContainerDied","Data":"98943a6bf34a1e9e517258d7ab9b3136e549a4f973ad1c2442c28fec96876439"} Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.467242 4975 scope.go:117] "RemoveContainer" 
containerID="e3b35411eef529a5c14514cc07f0f7048c30c4a62a7ee80164fa29051cf3639d" Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.467245 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jp47c" Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.474098 4975 generic.go:334] "Generic (PLEG): container finished" podID="973c781e-29cc-4d02-a22e-1b13a1a18a94" containerID="2b1e77e0028326720b11a64c53e4be5cf5ffa3a3c2f77b0e75c3f2d06507cfd7" exitCode=0 Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.474175 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-259gn" event={"ID":"973c781e-29cc-4d02-a22e-1b13a1a18a94","Type":"ContainerDied","Data":"2b1e77e0028326720b11a64c53e4be5cf5ffa3a3c2f77b0e75c3f2d06507cfd7"} Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.474207 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-259gn" event={"ID":"973c781e-29cc-4d02-a22e-1b13a1a18a94","Type":"ContainerStarted","Data":"5fd0106b579f9e1fd31a8989e0dc5a6ccdfb6e01bc6433985950ac5ff340f54c"} Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.483506 4975 generic.go:334] "Generic (PLEG): container finished" podID="5797d8e6-add0-482e-ab94-24df08d4da60" containerID="1b6219446d9d27ddd47965a350cbca26227136a905d64632202c3bd6d80d3a88" exitCode=0 Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.483631 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9e84-account-create-update-sn9bw" event={"ID":"5797d8e6-add0-482e-ab94-24df08d4da60","Type":"ContainerDied","Data":"1b6219446d9d27ddd47965a350cbca26227136a905d64632202c3bd6d80d3a88"} Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.483666 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9e84-account-create-update-sn9bw" 
event={"ID":"5797d8e6-add0-482e-ab94-24df08d4da60","Type":"ContainerStarted","Data":"8470b38b95e261e2d59ab265175b2d5e52284dac50dcfc740ca247f694767002"} Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.488562 4975 generic.go:334] "Generic (PLEG): container finished" podID="226bdbb1-a2c1-4bdf-a509-2aaed024a33e" containerID="559409cdd578ac7ebca41857ea3d8fe3faadf749b271bcaffe3bcc2c0c935d2d" exitCode=0 Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.488636 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-s8rhn" event={"ID":"226bdbb1-a2c1-4bdf-a509-2aaed024a33e","Type":"ContainerDied","Data":"559409cdd578ac7ebca41857ea3d8fe3faadf749b271bcaffe3bcc2c0c935d2d"} Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.488666 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-s8rhn" event={"ID":"226bdbb1-a2c1-4bdf-a509-2aaed024a33e","Type":"ContainerStarted","Data":"d22aa0aa85b2d997fc65d0f67e58a8cfca34cc20b23fbc9db110c5490299fc3e"} Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.490312 4975 generic.go:334] "Generic (PLEG): container finished" podID="b3f9f8bf-5111-465b-b099-5ea2b374ddc9" containerID="2a3c5218797b4769d79c2900fd0fd46380c935d0edaaa8f891be1220154b41d7" exitCode=0 Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.490396 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-s4mlz" event={"ID":"b3f9f8bf-5111-465b-b099-5ea2b374ddc9","Type":"ContainerDied","Data":"2a3c5218797b4769d79c2900fd0fd46380c935d0edaaa8f891be1220154b41d7"} Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.490463 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-s4mlz" event={"ID":"b3f9f8bf-5111-465b-b099-5ea2b374ddc9","Type":"ContainerStarted","Data":"95757178e5e2bfb82192d7039a16ed2ab9674782aa92671404e2fc5a60afe0d9"} Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.491919 4975 generic.go:334] "Generic 
(PLEG): container finished" podID="3a114ef7-ae0b-4502-86f6-5cbacd642fff" containerID="26b815d260ed6844b03fe269f50a147f3255454615c68a7e86e847c3f89df0e1" exitCode=0 Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.492942 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3b17-account-create-update-rkbqb" event={"ID":"3a114ef7-ae0b-4502-86f6-5cbacd642fff","Type":"ContainerDied","Data":"26b815d260ed6844b03fe269f50a147f3255454615c68a7e86e847c3f89df0e1"} Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.492969 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3b17-account-create-update-rkbqb" event={"ID":"3a114ef7-ae0b-4502-86f6-5cbacd642fff","Type":"ContainerStarted","Data":"f43e5ec9d1ac96ce9d260a6a2be4faa76f24ef8923e9755a5527d45df1987b4c"} Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.523023 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e52b66d-395e-4753-9c2b-b271c2dedf36-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2e52b66d-395e-4753-9c2b-b271c2dedf36" (UID: "2e52b66d-395e-4753-9c2b-b271c2dedf36"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.530437 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e52b66d-395e-4753-9c2b-b271c2dedf36-config" (OuterVolumeSpecName: "config") pod "2e52b66d-395e-4753-9c2b-b271c2dedf36" (UID: "2e52b66d-395e-4753-9c2b-b271c2dedf36"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.545727 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e52b66d-395e-4753-9c2b-b271c2dedf36-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.545766 4975 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e52b66d-395e-4753-9c2b-b271c2dedf36-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.545779 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xzft\" (UniqueName: \"kubernetes.io/projected/2e52b66d-395e-4753-9c2b-b271c2dedf36-kube-api-access-5xzft\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.550251 4975 scope.go:117] "RemoveContainer" containerID="bcf917c68bbda5cfa57843e462a3059dc2e5571b4c1df2edad7eea6f95e6fb34" Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.574273 4975 scope.go:117] "RemoveContainer" containerID="e3b35411eef529a5c14514cc07f0f7048c30c4a62a7ee80164fa29051cf3639d" Mar 18 12:32:47 crc kubenswrapper[4975]: E0318 12:32:47.575240 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3b35411eef529a5c14514cc07f0f7048c30c4a62a7ee80164fa29051cf3639d\": container with ID starting with e3b35411eef529a5c14514cc07f0f7048c30c4a62a7ee80164fa29051cf3639d not found: ID does not exist" containerID="e3b35411eef529a5c14514cc07f0f7048c30c4a62a7ee80164fa29051cf3639d" Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.575300 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3b35411eef529a5c14514cc07f0f7048c30c4a62a7ee80164fa29051cf3639d"} err="failed to get container status 
\"e3b35411eef529a5c14514cc07f0f7048c30c4a62a7ee80164fa29051cf3639d\": rpc error: code = NotFound desc = could not find container \"e3b35411eef529a5c14514cc07f0f7048c30c4a62a7ee80164fa29051cf3639d\": container with ID starting with e3b35411eef529a5c14514cc07f0f7048c30c4a62a7ee80164fa29051cf3639d not found: ID does not exist" Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.575344 4975 scope.go:117] "RemoveContainer" containerID="bcf917c68bbda5cfa57843e462a3059dc2e5571b4c1df2edad7eea6f95e6fb34" Mar 18 12:32:47 crc kubenswrapper[4975]: E0318 12:32:47.575595 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcf917c68bbda5cfa57843e462a3059dc2e5571b4c1df2edad7eea6f95e6fb34\": container with ID starting with bcf917c68bbda5cfa57843e462a3059dc2e5571b4c1df2edad7eea6f95e6fb34 not found: ID does not exist" containerID="bcf917c68bbda5cfa57843e462a3059dc2e5571b4c1df2edad7eea6f95e6fb34" Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.575620 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcf917c68bbda5cfa57843e462a3059dc2e5571b4c1df2edad7eea6f95e6fb34"} err="failed to get container status \"bcf917c68bbda5cfa57843e462a3059dc2e5571b4c1df2edad7eea6f95e6fb34\": rpc error: code = NotFound desc = could not find container \"bcf917c68bbda5cfa57843e462a3059dc2e5571b4c1df2edad7eea6f95e6fb34\": container with ID starting with bcf917c68bbda5cfa57843e462a3059dc2e5571b4c1df2edad7eea6f95e6fb34 not found: ID does not exist" Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.801508 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jp47c"] Mar 18 12:32:47 crc kubenswrapper[4975]: I0318 12:32:47.807827 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jp47c"] Mar 18 12:32:48 crc kubenswrapper[4975]: I0318 12:32:48.905607 4975 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/placement-db-create-s8rhn" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.031646 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e52b66d-395e-4753-9c2b-b271c2dedf36" path="/var/lib/kubelet/pods/2e52b66d-395e-4753-9c2b-b271c2dedf36/volumes" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.069759 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtzwz\" (UniqueName: \"kubernetes.io/projected/226bdbb1-a2c1-4bdf-a509-2aaed024a33e-kube-api-access-wtzwz\") pod \"226bdbb1-a2c1-4bdf-a509-2aaed024a33e\" (UID: \"226bdbb1-a2c1-4bdf-a509-2aaed024a33e\") " Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.070179 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/226bdbb1-a2c1-4bdf-a509-2aaed024a33e-operator-scripts\") pod \"226bdbb1-a2c1-4bdf-a509-2aaed024a33e\" (UID: \"226bdbb1-a2c1-4bdf-a509-2aaed024a33e\") " Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.071060 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/226bdbb1-a2c1-4bdf-a509-2aaed024a33e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "226bdbb1-a2c1-4bdf-a509-2aaed024a33e" (UID: "226bdbb1-a2c1-4bdf-a509-2aaed024a33e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.076628 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/226bdbb1-a2c1-4bdf-a509-2aaed024a33e-kube-api-access-wtzwz" (OuterVolumeSpecName: "kube-api-access-wtzwz") pod "226bdbb1-a2c1-4bdf-a509-2aaed024a33e" (UID: "226bdbb1-a2c1-4bdf-a509-2aaed024a33e"). InnerVolumeSpecName "kube-api-access-wtzwz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.172454 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtzwz\" (UniqueName: \"kubernetes.io/projected/226bdbb1-a2c1-4bdf-a509-2aaed024a33e-kube-api-access-wtzwz\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.172496 4975 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/226bdbb1-a2c1-4bdf-a509-2aaed024a33e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.206044 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9e84-account-create-update-sn9bw" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.214271 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3b17-account-create-update-rkbqb" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.226308 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2797-account-create-update-l44sp" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.233164 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-259gn" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.237369 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-s4mlz" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.374819 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a114ef7-ae0b-4502-86f6-5cbacd642fff-operator-scripts\") pod \"3a114ef7-ae0b-4502-86f6-5cbacd642fff\" (UID: \"3a114ef7-ae0b-4502-86f6-5cbacd642fff\") " Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.375143 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82dsz\" (UniqueName: \"kubernetes.io/projected/3a114ef7-ae0b-4502-86f6-5cbacd642fff-kube-api-access-82dsz\") pod \"3a114ef7-ae0b-4502-86f6-5cbacd642fff\" (UID: \"3a114ef7-ae0b-4502-86f6-5cbacd642fff\") " Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.375254 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74gg8\" (UniqueName: \"kubernetes.io/projected/973c781e-29cc-4d02-a22e-1b13a1a18a94-kube-api-access-74gg8\") pod \"973c781e-29cc-4d02-a22e-1b13a1a18a94\" (UID: \"973c781e-29cc-4d02-a22e-1b13a1a18a94\") " Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.375365 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vrtb\" (UniqueName: \"kubernetes.io/projected/fed45251-cc94-42f5-962c-d82dbd50b421-kube-api-access-9vrtb\") pod \"fed45251-cc94-42f5-962c-d82dbd50b421\" (UID: \"fed45251-cc94-42f5-962c-d82dbd50b421\") " Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.375439 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/973c781e-29cc-4d02-a22e-1b13a1a18a94-operator-scripts\") pod \"973c781e-29cc-4d02-a22e-1b13a1a18a94\" (UID: \"973c781e-29cc-4d02-a22e-1b13a1a18a94\") " Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.375581 4975 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-zhvkz\" (UniqueName: \"kubernetes.io/projected/b3f9f8bf-5111-465b-b099-5ea2b374ddc9-kube-api-access-zhvkz\") pod \"b3f9f8bf-5111-465b-b099-5ea2b374ddc9\" (UID: \"b3f9f8bf-5111-465b-b099-5ea2b374ddc9\") " Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.375702 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fed45251-cc94-42f5-962c-d82dbd50b421-operator-scripts\") pod \"fed45251-cc94-42f5-962c-d82dbd50b421\" (UID: \"fed45251-cc94-42f5-962c-d82dbd50b421\") " Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.375815 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3f9f8bf-5111-465b-b099-5ea2b374ddc9-operator-scripts\") pod \"b3f9f8bf-5111-465b-b099-5ea2b374ddc9\" (UID: \"b3f9f8bf-5111-465b-b099-5ea2b374ddc9\") " Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.375947 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5797d8e6-add0-482e-ab94-24df08d4da60-operator-scripts\") pod \"5797d8e6-add0-482e-ab94-24df08d4da60\" (UID: \"5797d8e6-add0-482e-ab94-24df08d4da60\") " Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.376063 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5qcm\" (UniqueName: \"kubernetes.io/projected/5797d8e6-add0-482e-ab94-24df08d4da60-kube-api-access-n5qcm\") pod \"5797d8e6-add0-482e-ab94-24df08d4da60\" (UID: \"5797d8e6-add0-482e-ab94-24df08d4da60\") " Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.375278 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a114ef7-ae0b-4502-86f6-5cbacd642fff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"3a114ef7-ae0b-4502-86f6-5cbacd642fff" (UID: "3a114ef7-ae0b-4502-86f6-5cbacd642fff"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.376710 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/973c781e-29cc-4d02-a22e-1b13a1a18a94-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "973c781e-29cc-4d02-a22e-1b13a1a18a94" (UID: "973c781e-29cc-4d02-a22e-1b13a1a18a94"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.376798 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fed45251-cc94-42f5-962c-d82dbd50b421-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fed45251-cc94-42f5-962c-d82dbd50b421" (UID: "fed45251-cc94-42f5-962c-d82dbd50b421"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.376826 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3f9f8bf-5111-465b-b099-5ea2b374ddc9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b3f9f8bf-5111-465b-b099-5ea2b374ddc9" (UID: "b3f9f8bf-5111-465b-b099-5ea2b374ddc9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.377365 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5797d8e6-add0-482e-ab94-24df08d4da60-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5797d8e6-add0-482e-ab94-24df08d4da60" (UID: "5797d8e6-add0-482e-ab94-24df08d4da60"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.380001 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fed45251-cc94-42f5-962c-d82dbd50b421-kube-api-access-9vrtb" (OuterVolumeSpecName: "kube-api-access-9vrtb") pod "fed45251-cc94-42f5-962c-d82dbd50b421" (UID: "fed45251-cc94-42f5-962c-d82dbd50b421"). InnerVolumeSpecName "kube-api-access-9vrtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.380522 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5797d8e6-add0-482e-ab94-24df08d4da60-kube-api-access-n5qcm" (OuterVolumeSpecName: "kube-api-access-n5qcm") pod "5797d8e6-add0-482e-ab94-24df08d4da60" (UID: "5797d8e6-add0-482e-ab94-24df08d4da60"). InnerVolumeSpecName "kube-api-access-n5qcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.381479 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/973c781e-29cc-4d02-a22e-1b13a1a18a94-kube-api-access-74gg8" (OuterVolumeSpecName: "kube-api-access-74gg8") pod "973c781e-29cc-4d02-a22e-1b13a1a18a94" (UID: "973c781e-29cc-4d02-a22e-1b13a1a18a94"). InnerVolumeSpecName "kube-api-access-74gg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.382351 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a114ef7-ae0b-4502-86f6-5cbacd642fff-kube-api-access-82dsz" (OuterVolumeSpecName: "kube-api-access-82dsz") pod "3a114ef7-ae0b-4502-86f6-5cbacd642fff" (UID: "3a114ef7-ae0b-4502-86f6-5cbacd642fff"). InnerVolumeSpecName "kube-api-access-82dsz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.382367 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3f9f8bf-5111-465b-b099-5ea2b374ddc9-kube-api-access-zhvkz" (OuterVolumeSpecName: "kube-api-access-zhvkz") pod "b3f9f8bf-5111-465b-b099-5ea2b374ddc9" (UID: "b3f9f8bf-5111-465b-b099-5ea2b374ddc9"). InnerVolumeSpecName "kube-api-access-zhvkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.478110 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vrtb\" (UniqueName: \"kubernetes.io/projected/fed45251-cc94-42f5-962c-d82dbd50b421-kube-api-access-9vrtb\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.478158 4975 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/973c781e-29cc-4d02-a22e-1b13a1a18a94-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.478168 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhvkz\" (UniqueName: \"kubernetes.io/projected/b3f9f8bf-5111-465b-b099-5ea2b374ddc9-kube-api-access-zhvkz\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.478179 4975 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fed45251-cc94-42f5-962c-d82dbd50b421-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.478187 4975 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5797d8e6-add0-482e-ab94-24df08d4da60-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.478242 4975 reconciler_common.go:293] 
"Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b3f9f8bf-5111-465b-b099-5ea2b374ddc9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.478253 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5qcm\" (UniqueName: \"kubernetes.io/projected/5797d8e6-add0-482e-ab94-24df08d4da60-kube-api-access-n5qcm\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.478264 4975 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a114ef7-ae0b-4502-86f6-5cbacd642fff-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.478273 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82dsz\" (UniqueName: \"kubernetes.io/projected/3a114ef7-ae0b-4502-86f6-5cbacd642fff-kube-api-access-82dsz\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.478281 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74gg8\" (UniqueName: \"kubernetes.io/projected/973c781e-29cc-4d02-a22e-1b13a1a18a94-kube-api-access-74gg8\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.511841 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3b17-account-create-update-rkbqb" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.511789 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3b17-account-create-update-rkbqb" event={"ID":"3a114ef7-ae0b-4502-86f6-5cbacd642fff","Type":"ContainerDied","Data":"f43e5ec9d1ac96ce9d260a6a2be4faa76f24ef8923e9755a5527d45df1987b4c"} Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.521218 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f43e5ec9d1ac96ce9d260a6a2be4faa76f24ef8923e9755a5527d45df1987b4c" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.522635 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2797-account-create-update-l44sp" event={"ID":"fed45251-cc94-42f5-962c-d82dbd50b421","Type":"ContainerDied","Data":"2d15c959618a8cfacc4fce141764c64bf2fdd589162e4506da0abfd9912dfe6d"} Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.522663 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d15c959618a8cfacc4fce141764c64bf2fdd589162e4506da0abfd9912dfe6d" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.522710 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2797-account-create-update-l44sp" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.526203 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-259gn" event={"ID":"973c781e-29cc-4d02-a22e-1b13a1a18a94","Type":"ContainerDied","Data":"5fd0106b579f9e1fd31a8989e0dc5a6ccdfb6e01bc6433985950ac5ff340f54c"} Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.526317 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fd0106b579f9e1fd31a8989e0dc5a6ccdfb6e01bc6433985950ac5ff340f54c" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.526411 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-259gn" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.535277 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9e84-account-create-update-sn9bw" event={"ID":"5797d8e6-add0-482e-ab94-24df08d4da60","Type":"ContainerDied","Data":"8470b38b95e261e2d59ab265175b2d5e52284dac50dcfc740ca247f694767002"} Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.535307 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8470b38b95e261e2d59ab265175b2d5e52284dac50dcfc740ca247f694767002" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.535416 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9e84-account-create-update-sn9bw" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.539185 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-s8rhn" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.539487 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-s8rhn" event={"ID":"226bdbb1-a2c1-4bdf-a509-2aaed024a33e","Type":"ContainerDied","Data":"d22aa0aa85b2d997fc65d0f67e58a8cfca34cc20b23fbc9db110c5490299fc3e"} Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.539514 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d22aa0aa85b2d997fc65d0f67e58a8cfca34cc20b23fbc9db110c5490299fc3e" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.540583 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-s4mlz" event={"ID":"b3f9f8bf-5111-465b-b099-5ea2b374ddc9","Type":"ContainerDied","Data":"95757178e5e2bfb82192d7039a16ed2ab9674782aa92671404e2fc5a60afe0d9"} Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.540605 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95757178e5e2bfb82192d7039a16ed2ab9674782aa92671404e2fc5a60afe0d9" Mar 18 12:32:49 crc kubenswrapper[4975]: I0318 12:32:49.540612 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-s4mlz" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.462638 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-fxwlg"] Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.472635 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-fxwlg"] Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.560158 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-mwfxc"] Mar 18 12:32:50 crc kubenswrapper[4975]: E0318 12:32:50.560600 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fed45251-cc94-42f5-962c-d82dbd50b421" containerName="mariadb-account-create-update" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.560640 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed45251-cc94-42f5-962c-d82dbd50b421" containerName="mariadb-account-create-update" Mar 18 12:32:50 crc kubenswrapper[4975]: E0318 12:32:50.560672 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a114ef7-ae0b-4502-86f6-5cbacd642fff" containerName="mariadb-account-create-update" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.560681 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a114ef7-ae0b-4502-86f6-5cbacd642fff" containerName="mariadb-account-create-update" Mar 18 12:32:50 crc kubenswrapper[4975]: E0318 12:32:50.560695 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4760d40e-f9c9-4952-9773-3af1740da52a" containerName="mariadb-account-create-update" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.560703 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="4760d40e-f9c9-4952-9773-3af1740da52a" containerName="mariadb-account-create-update" Mar 18 12:32:50 crc kubenswrapper[4975]: E0318 12:32:50.560720 4975 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="226bdbb1-a2c1-4bdf-a509-2aaed024a33e" containerName="mariadb-database-create" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.560727 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="226bdbb1-a2c1-4bdf-a509-2aaed024a33e" containerName="mariadb-database-create" Mar 18 12:32:50 crc kubenswrapper[4975]: E0318 12:32:50.560740 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3f9f8bf-5111-465b-b099-5ea2b374ddc9" containerName="mariadb-database-create" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.560747 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3f9f8bf-5111-465b-b099-5ea2b374ddc9" containerName="mariadb-database-create" Mar 18 12:32:50 crc kubenswrapper[4975]: E0318 12:32:50.560762 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e52b66d-395e-4753-9c2b-b271c2dedf36" containerName="init" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.560772 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e52b66d-395e-4753-9c2b-b271c2dedf36" containerName="init" Mar 18 12:32:50 crc kubenswrapper[4975]: E0318 12:32:50.560783 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e52b66d-395e-4753-9c2b-b271c2dedf36" containerName="dnsmasq-dns" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.560790 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e52b66d-395e-4753-9c2b-b271c2dedf36" containerName="dnsmasq-dns" Mar 18 12:32:50 crc kubenswrapper[4975]: E0318 12:32:50.560802 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="973c781e-29cc-4d02-a22e-1b13a1a18a94" containerName="mariadb-database-create" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.560811 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="973c781e-29cc-4d02-a22e-1b13a1a18a94" containerName="mariadb-database-create" Mar 18 12:32:50 crc kubenswrapper[4975]: E0318 12:32:50.560843 4975 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="5797d8e6-add0-482e-ab94-24df08d4da60" containerName="mariadb-account-create-update" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.560874 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="5797d8e6-add0-482e-ab94-24df08d4da60" containerName="mariadb-account-create-update" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.561104 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="973c781e-29cc-4d02-a22e-1b13a1a18a94" containerName="mariadb-database-create" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.561119 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a114ef7-ae0b-4502-86f6-5cbacd642fff" containerName="mariadb-account-create-update" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.561130 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="226bdbb1-a2c1-4bdf-a509-2aaed024a33e" containerName="mariadb-database-create" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.561143 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="fed45251-cc94-42f5-962c-d82dbd50b421" containerName="mariadb-account-create-update" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.561154 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="4760d40e-f9c9-4952-9773-3af1740da52a" containerName="mariadb-account-create-update" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.561167 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="5797d8e6-add0-482e-ab94-24df08d4da60" containerName="mariadb-account-create-update" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.561185 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3f9f8bf-5111-465b-b099-5ea2b374ddc9" containerName="mariadb-database-create" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.561197 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e52b66d-395e-4753-9c2b-b271c2dedf36" 
containerName="dnsmasq-dns" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.561909 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-mwfxc" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.565660 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.567464 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-mwfxc"] Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.695904 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d954ac44-bc3a-44bf-a690-e248ed937c76-operator-scripts\") pod \"root-account-create-update-mwfxc\" (UID: \"d954ac44-bc3a-44bf-a690-e248ed937c76\") " pod="openstack/root-account-create-update-mwfxc" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.696306 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq4gp\" (UniqueName: \"kubernetes.io/projected/d954ac44-bc3a-44bf-a690-e248ed937c76-kube-api-access-tq4gp\") pod \"root-account-create-update-mwfxc\" (UID: \"d954ac44-bc3a-44bf-a690-e248ed937c76\") " pod="openstack/root-account-create-update-mwfxc" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.797316 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq4gp\" (UniqueName: \"kubernetes.io/projected/d954ac44-bc3a-44bf-a690-e248ed937c76-kube-api-access-tq4gp\") pod \"root-account-create-update-mwfxc\" (UID: \"d954ac44-bc3a-44bf-a690-e248ed937c76\") " pod="openstack/root-account-create-update-mwfxc" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.797459 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d954ac44-bc3a-44bf-a690-e248ed937c76-operator-scripts\") pod \"root-account-create-update-mwfxc\" (UID: \"d954ac44-bc3a-44bf-a690-e248ed937c76\") " pod="openstack/root-account-create-update-mwfxc" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.798305 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d954ac44-bc3a-44bf-a690-e248ed937c76-operator-scripts\") pod \"root-account-create-update-mwfxc\" (UID: \"d954ac44-bc3a-44bf-a690-e248ed937c76\") " pod="openstack/root-account-create-update-mwfxc" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.814017 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq4gp\" (UniqueName: \"kubernetes.io/projected/d954ac44-bc3a-44bf-a690-e248ed937c76-kube-api-access-tq4gp\") pod \"root-account-create-update-mwfxc\" (UID: \"d954ac44-bc3a-44bf-a690-e248ed937c76\") " pod="openstack/root-account-create-update-mwfxc" Mar 18 12:32:50 crc kubenswrapper[4975]: I0318 12:32:50.887212 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-mwfxc" Mar 18 12:32:51 crc kubenswrapper[4975]: I0318 12:32:51.025593 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4760d40e-f9c9-4952-9773-3af1740da52a" path="/var/lib/kubelet/pods/4760d40e-f9c9-4952-9773-3af1740da52a/volumes" Mar 18 12:32:51 crc kubenswrapper[4975]: I0318 12:32:51.309050 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-mwfxc"] Mar 18 12:32:51 crc kubenswrapper[4975]: I0318 12:32:51.558471 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mwfxc" event={"ID":"d954ac44-bc3a-44bf-a690-e248ed937c76","Type":"ContainerStarted","Data":"e485a5545c716abfab08afcd79fe927a554e6600a0b95914b16c2582e0c78313"} Mar 18 12:32:51 crc kubenswrapper[4975]: I0318 12:32:51.558510 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mwfxc" event={"ID":"d954ac44-bc3a-44bf-a690-e248ed937c76","Type":"ContainerStarted","Data":"ecc1f920f3d7a6b6d98a049798e8ce1d8934e605a97a1d3c59974b8c7c9505b8"} Mar 18 12:32:51 crc kubenswrapper[4975]: I0318 12:32:51.572062 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-mwfxc" podStartSLOduration=1.572042568 podStartE2EDuration="1.572042568s" podCreationTimestamp="2026-03-18 12:32:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:32:51.571583745 +0000 UTC m=+1357.285984334" watchObservedRunningTime="2026-03-18 12:32:51.572042568 +0000 UTC m=+1357.286443147" Mar 18 12:32:52 crc kubenswrapper[4975]: I0318 12:32:52.568394 4975 generic.go:334] "Generic (PLEG): container finished" podID="d954ac44-bc3a-44bf-a690-e248ed937c76" containerID="e485a5545c716abfab08afcd79fe927a554e6600a0b95914b16c2582e0c78313" exitCode=0 Mar 18 12:32:52 crc 
kubenswrapper[4975]: I0318 12:32:52.568446 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mwfxc" event={"ID":"d954ac44-bc3a-44bf-a690-e248ed937c76","Type":"ContainerDied","Data":"e485a5545c716abfab08afcd79fe927a554e6600a0b95914b16c2582e0c78313"} Mar 18 12:32:53 crc kubenswrapper[4975]: I0318 12:32:53.031485 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-etc-swift\") pod \"swift-storage-0\" (UID: \"18526bd6-7184-4e92-8bb6-f85ec1aa3f30\") " pod="openstack/swift-storage-0" Mar 18 12:32:53 crc kubenswrapper[4975]: E0318 12:32:53.031654 4975 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 12:32:53 crc kubenswrapper[4975]: E0318 12:32:53.031671 4975 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 12:32:53 crc kubenswrapper[4975]: E0318 12:32:53.031717 4975 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-etc-swift podName:18526bd6-7184-4e92-8bb6-f85ec1aa3f30 nodeName:}" failed. No retries permitted until 2026-03-18 12:33:09.031703158 +0000 UTC m=+1374.746103737 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-etc-swift") pod "swift-storage-0" (UID: "18526bd6-7184-4e92-8bb6-f85ec1aa3f30") : configmap "swift-ring-files" not found Mar 18 12:32:53 crc kubenswrapper[4975]: I0318 12:32:53.578512 4975 generic.go:334] "Generic (PLEG): container finished" podID="f0f58451-e968-467f-8d95-7a4c5104ce12" containerID="24412bc7ab16647100cbf02dd9516a91a3c0232c945d1610ba6eb2e2364b9baf" exitCode=0 Mar 18 12:32:53 crc kubenswrapper[4975]: I0318 12:32:53.578593 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-g5bjt" event={"ID":"f0f58451-e968-467f-8d95-7a4c5104ce12","Type":"ContainerDied","Data":"24412bc7ab16647100cbf02dd9516a91a3c0232c945d1610ba6eb2e2364b9baf"} Mar 18 12:32:53 crc kubenswrapper[4975]: I0318 12:32:53.829140 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-2zdjk"] Mar 18 12:32:53 crc kubenswrapper[4975]: I0318 12:32:53.830331 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-2zdjk" Mar 18 12:32:53 crc kubenswrapper[4975]: I0318 12:32:53.839762 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-2zdjk"] Mar 18 12:32:53 crc kubenswrapper[4975]: I0318 12:32:53.840272 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 18 12:32:53 crc kubenswrapper[4975]: I0318 12:32:53.840388 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-dpz5h" Mar 18 12:32:53 crc kubenswrapper[4975]: I0318 12:32:53.945456 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e796a56-c2ec-40c2-8604-b14e71255013-config-data\") pod \"glance-db-sync-2zdjk\" (UID: \"6e796a56-c2ec-40c2-8604-b14e71255013\") " pod="openstack/glance-db-sync-2zdjk" Mar 18 12:32:53 crc kubenswrapper[4975]: I0318 12:32:53.945497 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e796a56-c2ec-40c2-8604-b14e71255013-combined-ca-bundle\") pod \"glance-db-sync-2zdjk\" (UID: \"6e796a56-c2ec-40c2-8604-b14e71255013\") " pod="openstack/glance-db-sync-2zdjk" Mar 18 12:32:53 crc kubenswrapper[4975]: I0318 12:32:53.945541 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6e796a56-c2ec-40c2-8604-b14e71255013-db-sync-config-data\") pod \"glance-db-sync-2zdjk\" (UID: \"6e796a56-c2ec-40c2-8604-b14e71255013\") " pod="openstack/glance-db-sync-2zdjk" Mar 18 12:32:53 crc kubenswrapper[4975]: I0318 12:32:53.945895 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9slb\" (UniqueName: 
\"kubernetes.io/projected/6e796a56-c2ec-40c2-8604-b14e71255013-kube-api-access-d9slb\") pod \"glance-db-sync-2zdjk\" (UID: \"6e796a56-c2ec-40c2-8604-b14e71255013\") " pod="openstack/glance-db-sync-2zdjk" Mar 18 12:32:53 crc kubenswrapper[4975]: I0318 12:32:53.983931 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-mwfxc" Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.047081 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq4gp\" (UniqueName: \"kubernetes.io/projected/d954ac44-bc3a-44bf-a690-e248ed937c76-kube-api-access-tq4gp\") pod \"d954ac44-bc3a-44bf-a690-e248ed937c76\" (UID: \"d954ac44-bc3a-44bf-a690-e248ed937c76\") " Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.047129 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d954ac44-bc3a-44bf-a690-e248ed937c76-operator-scripts\") pod \"d954ac44-bc3a-44bf-a690-e248ed937c76\" (UID: \"d954ac44-bc3a-44bf-a690-e248ed937c76\") " Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.047381 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9slb\" (UniqueName: \"kubernetes.io/projected/6e796a56-c2ec-40c2-8604-b14e71255013-kube-api-access-d9slb\") pod \"glance-db-sync-2zdjk\" (UID: \"6e796a56-c2ec-40c2-8604-b14e71255013\") " pod="openstack/glance-db-sync-2zdjk" Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.047428 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e796a56-c2ec-40c2-8604-b14e71255013-config-data\") pod \"glance-db-sync-2zdjk\" (UID: \"6e796a56-c2ec-40c2-8604-b14e71255013\") " pod="openstack/glance-db-sync-2zdjk" Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.047445 4975 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e796a56-c2ec-40c2-8604-b14e71255013-combined-ca-bundle\") pod \"glance-db-sync-2zdjk\" (UID: \"6e796a56-c2ec-40c2-8604-b14e71255013\") " pod="openstack/glance-db-sync-2zdjk" Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.047479 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6e796a56-c2ec-40c2-8604-b14e71255013-db-sync-config-data\") pod \"glance-db-sync-2zdjk\" (UID: \"6e796a56-c2ec-40c2-8604-b14e71255013\") " pod="openstack/glance-db-sync-2zdjk" Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.048958 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d954ac44-bc3a-44bf-a690-e248ed937c76-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d954ac44-bc3a-44bf-a690-e248ed937c76" (UID: "d954ac44-bc3a-44bf-a690-e248ed937c76"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.055213 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d954ac44-bc3a-44bf-a690-e248ed937c76-kube-api-access-tq4gp" (OuterVolumeSpecName: "kube-api-access-tq4gp") pod "d954ac44-bc3a-44bf-a690-e248ed937c76" (UID: "d954ac44-bc3a-44bf-a690-e248ed937c76"). InnerVolumeSpecName "kube-api-access-tq4gp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.058504 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e796a56-c2ec-40c2-8604-b14e71255013-config-data\") pod \"glance-db-sync-2zdjk\" (UID: \"6e796a56-c2ec-40c2-8604-b14e71255013\") " pod="openstack/glance-db-sync-2zdjk" Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.059091 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e796a56-c2ec-40c2-8604-b14e71255013-combined-ca-bundle\") pod \"glance-db-sync-2zdjk\" (UID: \"6e796a56-c2ec-40c2-8604-b14e71255013\") " pod="openstack/glance-db-sync-2zdjk" Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.065936 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6e796a56-c2ec-40c2-8604-b14e71255013-db-sync-config-data\") pod \"glance-db-sync-2zdjk\" (UID: \"6e796a56-c2ec-40c2-8604-b14e71255013\") " pod="openstack/glance-db-sync-2zdjk" Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.071442 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9slb\" (UniqueName: \"kubernetes.io/projected/6e796a56-c2ec-40c2-8604-b14e71255013-kube-api-access-d9slb\") pod \"glance-db-sync-2zdjk\" (UID: \"6e796a56-c2ec-40c2-8604-b14e71255013\") " pod="openstack/glance-db-sync-2zdjk" Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.149250 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq4gp\" (UniqueName: \"kubernetes.io/projected/d954ac44-bc3a-44bf-a690-e248ed937c76-kube-api-access-tq4gp\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.149293 4975 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d954ac44-bc3a-44bf-a690-e248ed937c76-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.155536 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-2zdjk" Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.588432 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-mwfxc" Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.593034 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mwfxc" event={"ID":"d954ac44-bc3a-44bf-a690-e248ed937c76","Type":"ContainerDied","Data":"ecc1f920f3d7a6b6d98a049798e8ce1d8934e605a97a1d3c59974b8c7c9505b8"} Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.593106 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecc1f920f3d7a6b6d98a049798e8ce1d8934e605a97a1d3c59974b8c7c9505b8" Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.755401 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-2zdjk"] Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.878012 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-g5bjt" Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.960547 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f58451-e968-467f-8d95-7a4c5104ce12-combined-ca-bundle\") pod \"f0f58451-e968-467f-8d95-7a4c5104ce12\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.960609 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0f58451-e968-467f-8d95-7a4c5104ce12-scripts\") pod \"f0f58451-e968-467f-8d95-7a4c5104ce12\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.960634 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f0f58451-e968-467f-8d95-7a4c5104ce12-swiftconf\") pod \"f0f58451-e968-467f-8d95-7a4c5104ce12\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.960677 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f0f58451-e968-467f-8d95-7a4c5104ce12-dispersionconf\") pod \"f0f58451-e968-467f-8d95-7a4c5104ce12\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.960708 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f0f58451-e968-467f-8d95-7a4c5104ce12-etc-swift\") pod \"f0f58451-e968-467f-8d95-7a4c5104ce12\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.960781 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcm2l\" (UniqueName: 
\"kubernetes.io/projected/f0f58451-e968-467f-8d95-7a4c5104ce12-kube-api-access-tcm2l\") pod \"f0f58451-e968-467f-8d95-7a4c5104ce12\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.960843 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f0f58451-e968-467f-8d95-7a4c5104ce12-ring-data-devices\") pod \"f0f58451-e968-467f-8d95-7a4c5104ce12\" (UID: \"f0f58451-e968-467f-8d95-7a4c5104ce12\") " Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.961700 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0f58451-e968-467f-8d95-7a4c5104ce12-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f0f58451-e968-467f-8d95-7a4c5104ce12" (UID: "f0f58451-e968-467f-8d95-7a4c5104ce12"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.961810 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0f58451-e968-467f-8d95-7a4c5104ce12-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f0f58451-e968-467f-8d95-7a4c5104ce12" (UID: "f0f58451-e968-467f-8d95-7a4c5104ce12"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.966417 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0f58451-e968-467f-8d95-7a4c5104ce12-kube-api-access-tcm2l" (OuterVolumeSpecName: "kube-api-access-tcm2l") pod "f0f58451-e968-467f-8d95-7a4c5104ce12" (UID: "f0f58451-e968-467f-8d95-7a4c5104ce12"). InnerVolumeSpecName "kube-api-access-tcm2l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.968701 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f58451-e968-467f-8d95-7a4c5104ce12-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f0f58451-e968-467f-8d95-7a4c5104ce12" (UID: "f0f58451-e968-467f-8d95-7a4c5104ce12"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.982808 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0f58451-e968-467f-8d95-7a4c5104ce12-scripts" (OuterVolumeSpecName: "scripts") pod "f0f58451-e968-467f-8d95-7a4c5104ce12" (UID: "f0f58451-e968-467f-8d95-7a4c5104ce12"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.985036 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f58451-e968-467f-8d95-7a4c5104ce12-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f0f58451-e968-467f-8d95-7a4c5104ce12" (UID: "f0f58451-e968-467f-8d95-7a4c5104ce12"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:54 crc kubenswrapper[4975]: I0318 12:32:54.987225 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f58451-e968-467f-8d95-7a4c5104ce12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0f58451-e968-467f-8d95-7a4c5104ce12" (UID: "f0f58451-e968-467f-8d95-7a4c5104ce12"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:32:55 crc kubenswrapper[4975]: I0318 12:32:55.063146 4975 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f0f58451-e968-467f-8d95-7a4c5104ce12-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:55 crc kubenswrapper[4975]: I0318 12:32:55.063451 4975 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f0f58451-e968-467f-8d95-7a4c5104ce12-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:55 crc kubenswrapper[4975]: I0318 12:32:55.063466 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcm2l\" (UniqueName: \"kubernetes.io/projected/f0f58451-e968-467f-8d95-7a4c5104ce12-kube-api-access-tcm2l\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:55 crc kubenswrapper[4975]: I0318 12:32:55.063478 4975 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f0f58451-e968-467f-8d95-7a4c5104ce12-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:55 crc kubenswrapper[4975]: I0318 12:32:55.063488 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f58451-e968-467f-8d95-7a4c5104ce12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:55 crc kubenswrapper[4975]: I0318 12:32:55.063498 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0f58451-e968-467f-8d95-7a4c5104ce12-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:55 crc kubenswrapper[4975]: I0318 12:32:55.063507 4975 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f0f58451-e968-467f-8d95-7a4c5104ce12-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 18 12:32:55 crc kubenswrapper[4975]: I0318 12:32:55.539260 4975 
patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:32:55 crc kubenswrapper[4975]: I0318 12:32:55.539330 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:32:55 crc kubenswrapper[4975]: I0318 12:32:55.595580 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-g5bjt" event={"ID":"f0f58451-e968-467f-8d95-7a4c5104ce12","Type":"ContainerDied","Data":"917625031b795ba1362b3c37aa959d0cf6b5b83587b095218c822ae7b15a0684"} Mar 18 12:32:55 crc kubenswrapper[4975]: I0318 12:32:55.595620 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="917625031b795ba1362b3c37aa959d0cf6b5b83587b095218c822ae7b15a0684" Mar 18 12:32:55 crc kubenswrapper[4975]: I0318 12:32:55.595669 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-g5bjt" Mar 18 12:32:55 crc kubenswrapper[4975]: I0318 12:32:55.598642 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2zdjk" event={"ID":"6e796a56-c2ec-40c2-8604-b14e71255013","Type":"ContainerStarted","Data":"a3de3462bfb98efca7c24b1be88995b1c4e9230fee9780cd764c65585db9c18f"} Mar 18 12:32:55 crc kubenswrapper[4975]: I0318 12:32:55.884773 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 18 12:32:56 crc kubenswrapper[4975]: I0318 12:32:56.929759 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-mwfxc"] Mar 18 12:32:56 crc kubenswrapper[4975]: I0318 12:32:56.938002 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-mwfxc"] Mar 18 12:32:57 crc kubenswrapper[4975]: I0318 12:32:57.040793 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d954ac44-bc3a-44bf-a690-e248ed937c76" path="/var/lib/kubelet/pods/d954ac44-bc3a-44bf-a690-e248ed937c76/volumes" Mar 18 12:32:58 crc kubenswrapper[4975]: I0318 12:32:58.491749 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zgkvs" podUID="f3553d46-cacf-43e1-886a-44c17ed9a6c5" containerName="ovn-controller" probeResult="failure" output=< Mar 18 12:32:58 crc kubenswrapper[4975]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 18 12:32:58 crc kubenswrapper[4975]: > Mar 18 12:33:00 crc kubenswrapper[4975]: I0318 12:33:00.637014 4975 generic.go:334] "Generic (PLEG): container finished" podID="c5f80aec-91ca-4aec-91b0-8e26f87ef0c5" containerID="593d8e21a04405a96f7601c6d9e04bc7ca7920d74e754470dec6f28b650c85f9" exitCode=0 Mar 18 12:33:00 crc kubenswrapper[4975]: I0318 12:33:00.637093 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5","Type":"ContainerDied","Data":"593d8e21a04405a96f7601c6d9e04bc7ca7920d74e754470dec6f28b650c85f9"} Mar 18 12:33:00 crc kubenswrapper[4975]: I0318 12:33:00.643607 4975 generic.go:334] "Generic (PLEG): container finished" podID="a68f98b5-0226-4b20-a767-ead5e0af066e" containerID="f117753601c23acff04c84a90faed4c89539cd6b2706e3826b8124d9fc1ce0a0" exitCode=0 Mar 18 12:33:00 crc kubenswrapper[4975]: I0318 12:33:00.643642 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a68f98b5-0226-4b20-a767-ead5e0af066e","Type":"ContainerDied","Data":"f117753601c23acff04c84a90faed4c89539cd6b2706e3826b8124d9fc1ce0a0"} Mar 18 12:33:01 crc kubenswrapper[4975]: I0318 12:33:01.956274 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-sxldc"] Mar 18 12:33:01 crc kubenswrapper[4975]: E0318 12:33:01.956692 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d954ac44-bc3a-44bf-a690-e248ed937c76" containerName="mariadb-account-create-update" Mar 18 12:33:01 crc kubenswrapper[4975]: I0318 12:33:01.956710 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="d954ac44-bc3a-44bf-a690-e248ed937c76" containerName="mariadb-account-create-update" Mar 18 12:33:01 crc kubenswrapper[4975]: E0318 12:33:01.956741 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0f58451-e968-467f-8d95-7a4c5104ce12" containerName="swift-ring-rebalance" Mar 18 12:33:01 crc kubenswrapper[4975]: I0318 12:33:01.956750 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0f58451-e968-467f-8d95-7a4c5104ce12" containerName="swift-ring-rebalance" Mar 18 12:33:01 crc kubenswrapper[4975]: I0318 12:33:01.956937 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0f58451-e968-467f-8d95-7a4c5104ce12" containerName="swift-ring-rebalance" Mar 18 12:33:01 crc kubenswrapper[4975]: I0318 12:33:01.956966 4975 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d954ac44-bc3a-44bf-a690-e248ed937c76" containerName="mariadb-account-create-update" Mar 18 12:33:01 crc kubenswrapper[4975]: I0318 12:33:01.957582 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-sxldc" Mar 18 12:33:01 crc kubenswrapper[4975]: I0318 12:33:01.959913 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 18 12:33:01 crc kubenswrapper[4975]: I0318 12:33:01.971052 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-sxldc"] Mar 18 12:33:02 crc kubenswrapper[4975]: I0318 12:33:02.086679 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lnvh\" (UniqueName: \"kubernetes.io/projected/8cb80719-3311-49dd-9dbf-2d0c40b3d17b-kube-api-access-7lnvh\") pod \"root-account-create-update-sxldc\" (UID: \"8cb80719-3311-49dd-9dbf-2d0c40b3d17b\") " pod="openstack/root-account-create-update-sxldc" Mar 18 12:33:02 crc kubenswrapper[4975]: I0318 12:33:02.087083 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cb80719-3311-49dd-9dbf-2d0c40b3d17b-operator-scripts\") pod \"root-account-create-update-sxldc\" (UID: \"8cb80719-3311-49dd-9dbf-2d0c40b3d17b\") " pod="openstack/root-account-create-update-sxldc" Mar 18 12:33:02 crc kubenswrapper[4975]: I0318 12:33:02.188221 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lnvh\" (UniqueName: \"kubernetes.io/projected/8cb80719-3311-49dd-9dbf-2d0c40b3d17b-kube-api-access-7lnvh\") pod \"root-account-create-update-sxldc\" (UID: \"8cb80719-3311-49dd-9dbf-2d0c40b3d17b\") " pod="openstack/root-account-create-update-sxldc" Mar 18 12:33:02 crc kubenswrapper[4975]: I0318 
12:33:02.188294 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cb80719-3311-49dd-9dbf-2d0c40b3d17b-operator-scripts\") pod \"root-account-create-update-sxldc\" (UID: \"8cb80719-3311-49dd-9dbf-2d0c40b3d17b\") " pod="openstack/root-account-create-update-sxldc" Mar 18 12:33:02 crc kubenswrapper[4975]: I0318 12:33:02.189154 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cb80719-3311-49dd-9dbf-2d0c40b3d17b-operator-scripts\") pod \"root-account-create-update-sxldc\" (UID: \"8cb80719-3311-49dd-9dbf-2d0c40b3d17b\") " pod="openstack/root-account-create-update-sxldc" Mar 18 12:33:02 crc kubenswrapper[4975]: I0318 12:33:02.209575 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lnvh\" (UniqueName: \"kubernetes.io/projected/8cb80719-3311-49dd-9dbf-2d0c40b3d17b-kube-api-access-7lnvh\") pod \"root-account-create-update-sxldc\" (UID: \"8cb80719-3311-49dd-9dbf-2d0c40b3d17b\") " pod="openstack/root-account-create-update-sxldc" Mar 18 12:33:02 crc kubenswrapper[4975]: I0318 12:33:02.283229 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-sxldc" Mar 18 12:33:03 crc kubenswrapper[4975]: I0318 12:33:03.494937 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zgkvs" podUID="f3553d46-cacf-43e1-886a-44c17ed9a6c5" containerName="ovn-controller" probeResult="failure" output=< Mar 18 12:33:03 crc kubenswrapper[4975]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 18 12:33:03 crc kubenswrapper[4975]: > Mar 18 12:33:03 crc kubenswrapper[4975]: I0318 12:33:03.519329 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-qpqb7" Mar 18 12:33:03 crc kubenswrapper[4975]: I0318 12:33:03.535612 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-qpqb7" Mar 18 12:33:03 crc kubenswrapper[4975]: I0318 12:33:03.766331 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-zgkvs-config-tr6fj"] Mar 18 12:33:03 crc kubenswrapper[4975]: I0318 12:33:03.767368 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zgkvs-config-tr6fj" Mar 18 12:33:03 crc kubenswrapper[4975]: I0318 12:33:03.773604 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 18 12:33:03 crc kubenswrapper[4975]: I0318 12:33:03.798545 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zgkvs-config-tr6fj"] Mar 18 12:33:03 crc kubenswrapper[4975]: I0318 12:33:03.818365 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktrjk\" (UniqueName: \"kubernetes.io/projected/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-kube-api-access-ktrjk\") pod \"ovn-controller-zgkvs-config-tr6fj\" (UID: \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\") " pod="openstack/ovn-controller-zgkvs-config-tr6fj" Mar 18 12:33:03 crc kubenswrapper[4975]: I0318 12:33:03.818530 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-scripts\") pod \"ovn-controller-zgkvs-config-tr6fj\" (UID: \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\") " pod="openstack/ovn-controller-zgkvs-config-tr6fj" Mar 18 12:33:03 crc kubenswrapper[4975]: I0318 12:33:03.818918 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-var-run-ovn\") pod \"ovn-controller-zgkvs-config-tr6fj\" (UID: \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\") " pod="openstack/ovn-controller-zgkvs-config-tr6fj" Mar 18 12:33:03 crc kubenswrapper[4975]: I0318 12:33:03.819180 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-var-run\") pod \"ovn-controller-zgkvs-config-tr6fj\" (UID: 
\"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\") " pod="openstack/ovn-controller-zgkvs-config-tr6fj"
Mar 18 12:33:03 crc kubenswrapper[4975]: I0318 12:33:03.819477 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-additional-scripts\") pod \"ovn-controller-zgkvs-config-tr6fj\" (UID: \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\") " pod="openstack/ovn-controller-zgkvs-config-tr6fj"
Mar 18 12:33:03 crc kubenswrapper[4975]: I0318 12:33:03.819524 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-var-log-ovn\") pod \"ovn-controller-zgkvs-config-tr6fj\" (UID: \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\") " pod="openstack/ovn-controller-zgkvs-config-tr6fj"
Mar 18 12:33:03 crc kubenswrapper[4975]: I0318 12:33:03.921652 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-var-run-ovn\") pod \"ovn-controller-zgkvs-config-tr6fj\" (UID: \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\") " pod="openstack/ovn-controller-zgkvs-config-tr6fj"
Mar 18 12:33:03 crc kubenswrapper[4975]: I0318 12:33:03.921776 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-var-run\") pod \"ovn-controller-zgkvs-config-tr6fj\" (UID: \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\") " pod="openstack/ovn-controller-zgkvs-config-tr6fj"
Mar 18 12:33:03 crc kubenswrapper[4975]: I0318 12:33:03.921845 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-additional-scripts\") pod \"ovn-controller-zgkvs-config-tr6fj\" (UID: \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\") " pod="openstack/ovn-controller-zgkvs-config-tr6fj"
Mar 18 12:33:03 crc kubenswrapper[4975]: I0318 12:33:03.921881 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-var-log-ovn\") pod \"ovn-controller-zgkvs-config-tr6fj\" (UID: \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\") " pod="openstack/ovn-controller-zgkvs-config-tr6fj"
Mar 18 12:33:03 crc kubenswrapper[4975]: I0318 12:33:03.921965 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktrjk\" (UniqueName: \"kubernetes.io/projected/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-kube-api-access-ktrjk\") pod \"ovn-controller-zgkvs-config-tr6fj\" (UID: \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\") " pod="openstack/ovn-controller-zgkvs-config-tr6fj"
Mar 18 12:33:03 crc kubenswrapper[4975]: I0318 12:33:03.922012 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-scripts\") pod \"ovn-controller-zgkvs-config-tr6fj\" (UID: \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\") " pod="openstack/ovn-controller-zgkvs-config-tr6fj"
Mar 18 12:33:03 crc kubenswrapper[4975]: I0318 12:33:03.922262 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-var-run\") pod \"ovn-controller-zgkvs-config-tr6fj\" (UID: \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\") " pod="openstack/ovn-controller-zgkvs-config-tr6fj"
Mar 18 12:33:03 crc kubenswrapper[4975]: I0318 12:33:03.922268 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-var-log-ovn\") pod \"ovn-controller-zgkvs-config-tr6fj\" (UID: \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\") " pod="openstack/ovn-controller-zgkvs-config-tr6fj"
Mar 18 12:33:03 crc kubenswrapper[4975]: I0318 12:33:03.922719 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-additional-scripts\") pod \"ovn-controller-zgkvs-config-tr6fj\" (UID: \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\") " pod="openstack/ovn-controller-zgkvs-config-tr6fj"
Mar 18 12:33:03 crc kubenswrapper[4975]: I0318 12:33:03.922951 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-var-run-ovn\") pod \"ovn-controller-zgkvs-config-tr6fj\" (UID: \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\") " pod="openstack/ovn-controller-zgkvs-config-tr6fj"
Mar 18 12:33:03 crc kubenswrapper[4975]: I0318 12:33:03.925011 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-scripts\") pod \"ovn-controller-zgkvs-config-tr6fj\" (UID: \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\") " pod="openstack/ovn-controller-zgkvs-config-tr6fj"
Mar 18 12:33:03 crc kubenswrapper[4975]: I0318 12:33:03.993258 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktrjk\" (UniqueName: \"kubernetes.io/projected/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-kube-api-access-ktrjk\") pod \"ovn-controller-zgkvs-config-tr6fj\" (UID: \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\") " pod="openstack/ovn-controller-zgkvs-config-tr6fj"
Mar 18 12:33:04 crc kubenswrapper[4975]: I0318 12:33:04.120102 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zgkvs-config-tr6fj"
Mar 18 12:33:07 crc kubenswrapper[4975]: I0318 12:33:07.984368 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-sxldc"]
Mar 18 12:33:07 crc kubenswrapper[4975]: I0318 12:33:07.996614 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zgkvs-config-tr6fj"]
Mar 18 12:33:08 crc kubenswrapper[4975]: I0318 12:33:08.492393 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-zgkvs"
Mar 18 12:33:08 crc kubenswrapper[4975]: I0318 12:33:08.705878 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5","Type":"ContainerStarted","Data":"5e61d8a65d5b622b8755f020d3edfd8731deb2b2bcf19a4b6760afd8bc529b60"}
Mar 18 12:33:08 crc kubenswrapper[4975]: I0318 12:33:08.706232 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 18 12:33:08 crc kubenswrapper[4975]: I0318 12:33:08.708524 4975 generic.go:334] "Generic (PLEG): container finished" podID="8cb80719-3311-49dd-9dbf-2d0c40b3d17b" containerID="a1db9db8ee85e571845d4f26ebe905bbf3c10d184fb10a02baee0eec0365c69a" exitCode=0
Mar 18 12:33:08 crc kubenswrapper[4975]: I0318 12:33:08.708582 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-sxldc" event={"ID":"8cb80719-3311-49dd-9dbf-2d0c40b3d17b","Type":"ContainerDied","Data":"a1db9db8ee85e571845d4f26ebe905bbf3c10d184fb10a02baee0eec0365c69a"}
Mar 18 12:33:08 crc kubenswrapper[4975]: I0318 12:33:08.708604 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-sxldc" event={"ID":"8cb80719-3311-49dd-9dbf-2d0c40b3d17b","Type":"ContainerStarted","Data":"0f2a35e16042df411579243d40110e2b39596e8c7b4c9fdf7bc1faa89f845c2a"}
Mar 18 12:33:08 crc kubenswrapper[4975]: I0318 12:33:08.711438 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a68f98b5-0226-4b20-a767-ead5e0af066e","Type":"ContainerStarted","Data":"5f6d0b95c66b9613d5726ca8079d04f6ce79946cd7d41024b803239fe09b32a1"}
Mar 18 12:33:08 crc kubenswrapper[4975]: I0318 12:33:08.711659 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 18 12:33:08 crc kubenswrapper[4975]: I0318 12:33:08.713811 4975 generic.go:334] "Generic (PLEG): container finished" podID="c03cf0ce-65de-4cad-a3b8-d421f8ac88b3" containerID="ce1114177ebe46aecae8af1d17b80f81d5eaea51273cd2f9feac414d88a4ce2c" exitCode=0
Mar 18 12:33:08 crc kubenswrapper[4975]: I0318 12:33:08.713997 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zgkvs-config-tr6fj" event={"ID":"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3","Type":"ContainerDied","Data":"ce1114177ebe46aecae8af1d17b80f81d5eaea51273cd2f9feac414d88a4ce2c"}
Mar 18 12:33:08 crc kubenswrapper[4975]: I0318 12:33:08.714034 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zgkvs-config-tr6fj" event={"ID":"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3","Type":"ContainerStarted","Data":"be915f3bb801e680bd319145810dad6311c3ec433508deb820b59adaa1b0287f"}
Mar 18 12:33:08 crc kubenswrapper[4975]: I0318 12:33:08.716724 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2zdjk" event={"ID":"6e796a56-c2ec-40c2-8604-b14e71255013","Type":"ContainerStarted","Data":"b04f1fba8268cb39373ff8520437c05fbbb63aedf49c5e72dd48f279c95e99a3"}
Mar 18 12:33:08 crc kubenswrapper[4975]: I0318 12:33:08.736880 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=59.826674271 podStartE2EDuration="1m9.736841934s" podCreationTimestamp="2026-03-18 12:31:59 +0000 UTC" firstStartedPulling="2026-03-18 12:32:15.88307018 +0000 UTC m=+1321.597470759" lastFinishedPulling="2026-03-18 12:32:25.793237843 +0000 UTC m=+1331.507638422" observedRunningTime="2026-03-18 12:33:08.728486854 +0000 UTC m=+1374.442887443" watchObservedRunningTime="2026-03-18 12:33:08.736841934 +0000 UTC m=+1374.451242513"
Mar 18 12:33:08 crc kubenswrapper[4975]: I0318 12:33:08.761479 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-2zdjk" podStartSLOduration=2.985816888 podStartE2EDuration="15.761449522s" podCreationTimestamp="2026-03-18 12:32:53 +0000 UTC" firstStartedPulling="2026-03-18 12:32:54.773135999 +0000 UTC m=+1360.487536578" lastFinishedPulling="2026-03-18 12:33:07.548768633 +0000 UTC m=+1373.263169212" observedRunningTime="2026-03-18 12:33:08.752937177 +0000 UTC m=+1374.467337766" watchObservedRunningTime="2026-03-18 12:33:08.761449522 +0000 UTC m=+1374.475850111"
Mar 18 12:33:08 crc kubenswrapper[4975]: I0318 12:33:08.820358 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=59.317517181 podStartE2EDuration="1m10.820342794s" podCreationTimestamp="2026-03-18 12:31:58 +0000 UTC" firstStartedPulling="2026-03-18 12:32:14.13660799 +0000 UTC m=+1319.851008569" lastFinishedPulling="2026-03-18 12:32:25.639433603 +0000 UTC m=+1331.353834182" observedRunningTime="2026-03-18 12:33:08.819933082 +0000 UTC m=+1374.534333681" watchObservedRunningTime="2026-03-18 12:33:08.820342794 +0000 UTC m=+1374.534743363"
Mar 18 12:33:09 crc kubenswrapper[4975]: I0318 12:33:09.113577 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-etc-swift\") pod \"swift-storage-0\" (UID: \"18526bd6-7184-4e92-8bb6-f85ec1aa3f30\") " pod="openstack/swift-storage-0"
Mar 18 12:33:09 crc kubenswrapper[4975]: I0318 12:33:09.120485 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18526bd6-7184-4e92-8bb6-f85ec1aa3f30-etc-swift\") pod \"swift-storage-0\" (UID: \"18526bd6-7184-4e92-8bb6-f85ec1aa3f30\") " pod="openstack/swift-storage-0"
Mar 18 12:33:09 crc kubenswrapper[4975]: I0318 12:33:09.181961 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 18 12:33:09 crc kubenswrapper[4975]: I0318 12:33:09.897394 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.058513 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-sxldc"
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.129730 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zgkvs-config-tr6fj"
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.134548 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-var-log-ovn\") pod \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\" (UID: \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\") "
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.134593 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lnvh\" (UniqueName: \"kubernetes.io/projected/8cb80719-3311-49dd-9dbf-2d0c40b3d17b-kube-api-access-7lnvh\") pod \"8cb80719-3311-49dd-9dbf-2d0c40b3d17b\" (UID: \"8cb80719-3311-49dd-9dbf-2d0c40b3d17b\") "
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.134689 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-var-run-ovn\") pod \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\" (UID: \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\") "
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.134691 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c03cf0ce-65de-4cad-a3b8-d421f8ac88b3" (UID: "c03cf0ce-65de-4cad-a3b8-d421f8ac88b3"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.134723 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cb80719-3311-49dd-9dbf-2d0c40b3d17b-operator-scripts\") pod \"8cb80719-3311-49dd-9dbf-2d0c40b3d17b\" (UID: \"8cb80719-3311-49dd-9dbf-2d0c40b3d17b\") "
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.134757 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktrjk\" (UniqueName: \"kubernetes.io/projected/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-kube-api-access-ktrjk\") pod \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\" (UID: \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\") "
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.134837 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-var-run\") pod \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\" (UID: \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\") "
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.134886 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-scripts\") pod \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\" (UID: \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\") "
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.135141 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c03cf0ce-65de-4cad-a3b8-d421f8ac88b3" (UID: "c03cf0ce-65de-4cad-a3b8-d421f8ac88b3"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.135601 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-var-run" (OuterVolumeSpecName: "var-run") pod "c03cf0ce-65de-4cad-a3b8-d421f8ac88b3" (UID: "c03cf0ce-65de-4cad-a3b8-d421f8ac88b3"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.135776 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cb80719-3311-49dd-9dbf-2d0c40b3d17b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8cb80719-3311-49dd-9dbf-2d0c40b3d17b" (UID: "8cb80719-3311-49dd-9dbf-2d0c40b3d17b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.135829 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-additional-scripts\") pod \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\" (UID: \"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3\") "
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.136486 4975 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-var-log-ovn\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.137668 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c03cf0ce-65de-4cad-a3b8-d421f8ac88b3" (UID: "c03cf0ce-65de-4cad-a3b8-d421f8ac88b3"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.138328 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-scripts" (OuterVolumeSpecName: "scripts") pod "c03cf0ce-65de-4cad-a3b8-d421f8ac88b3" (UID: "c03cf0ce-65de-4cad-a3b8-d421f8ac88b3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.142257 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-kube-api-access-ktrjk" (OuterVolumeSpecName: "kube-api-access-ktrjk") pod "c03cf0ce-65de-4cad-a3b8-d421f8ac88b3" (UID: "c03cf0ce-65de-4cad-a3b8-d421f8ac88b3"). InnerVolumeSpecName "kube-api-access-ktrjk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.142300 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cb80719-3311-49dd-9dbf-2d0c40b3d17b-kube-api-access-7lnvh" (OuterVolumeSpecName: "kube-api-access-7lnvh") pod "8cb80719-3311-49dd-9dbf-2d0c40b3d17b" (UID: "8cb80719-3311-49dd-9dbf-2d0c40b3d17b"). InnerVolumeSpecName "kube-api-access-7lnvh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.238821 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lnvh\" (UniqueName: \"kubernetes.io/projected/8cb80719-3311-49dd-9dbf-2d0c40b3d17b-kube-api-access-7lnvh\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.238894 4975 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-var-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.238907 4975 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cb80719-3311-49dd-9dbf-2d0c40b3d17b-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.238917 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktrjk\" (UniqueName: \"kubernetes.io/projected/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-kube-api-access-ktrjk\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.238928 4975 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-var-run\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.238943 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.238982 4975 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3-additional-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.733496 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18526bd6-7184-4e92-8bb6-f85ec1aa3f30","Type":"ContainerStarted","Data":"0ca159aee5c42ac70f878de3ec387558566607604b05593d496c2c5ba62567ac"}
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.736125 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-sxldc" event={"ID":"8cb80719-3311-49dd-9dbf-2d0c40b3d17b","Type":"ContainerDied","Data":"0f2a35e16042df411579243d40110e2b39596e8c7b4c9fdf7bc1faa89f845c2a"}
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.736153 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f2a35e16042df411579243d40110e2b39596e8c7b4c9fdf7bc1faa89f845c2a"
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.736209 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-sxldc"
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.739819 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zgkvs-config-tr6fj" event={"ID":"c03cf0ce-65de-4cad-a3b8-d421f8ac88b3","Type":"ContainerDied","Data":"be915f3bb801e680bd319145810dad6311c3ec433508deb820b59adaa1b0287f"}
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.739878 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be915f3bb801e680bd319145810dad6311c3ec433508deb820b59adaa1b0287f"
Mar 18 12:33:10 crc kubenswrapper[4975]: I0318 12:33:10.739947 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zgkvs-config-tr6fj"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.253473 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-zgkvs-config-tr6fj"]
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.259955 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-zgkvs-config-tr6fj"]
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.385041 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-zgkvs-config-9h94p"]
Mar 18 12:33:11 crc kubenswrapper[4975]: E0318 12:33:11.385378 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c03cf0ce-65de-4cad-a3b8-d421f8ac88b3" containerName="ovn-config"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.385395 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03cf0ce-65de-4cad-a3b8-d421f8ac88b3" containerName="ovn-config"
Mar 18 12:33:11 crc kubenswrapper[4975]: E0318 12:33:11.385417 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cb80719-3311-49dd-9dbf-2d0c40b3d17b" containerName="mariadb-account-create-update"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.385424 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cb80719-3311-49dd-9dbf-2d0c40b3d17b" containerName="mariadb-account-create-update"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.385574 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cb80719-3311-49dd-9dbf-2d0c40b3d17b" containerName="mariadb-account-create-update"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.385585 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="c03cf0ce-65de-4cad-a3b8-d421f8ac88b3" containerName="ovn-config"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.386108 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zgkvs-config-9h94p"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.388283 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.429597 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zgkvs-config-9h94p"]
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.559697 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9vv8\" (UniqueName: \"kubernetes.io/projected/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-kube-api-access-c9vv8\") pod \"ovn-controller-zgkvs-config-9h94p\" (UID: \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\") " pod="openstack/ovn-controller-zgkvs-config-9h94p"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.559778 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-var-run\") pod \"ovn-controller-zgkvs-config-9h94p\" (UID: \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\") " pod="openstack/ovn-controller-zgkvs-config-9h94p"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.559820 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-scripts\") pod \"ovn-controller-zgkvs-config-9h94p\" (UID: \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\") " pod="openstack/ovn-controller-zgkvs-config-9h94p"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.559889 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-var-log-ovn\") pod \"ovn-controller-zgkvs-config-9h94p\" (UID: \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\") " pod="openstack/ovn-controller-zgkvs-config-9h94p"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.559941 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-var-run-ovn\") pod \"ovn-controller-zgkvs-config-9h94p\" (UID: \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\") " pod="openstack/ovn-controller-zgkvs-config-9h94p"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.559984 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-additional-scripts\") pod \"ovn-controller-zgkvs-config-9h94p\" (UID: \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\") " pod="openstack/ovn-controller-zgkvs-config-9h94p"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.661302 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9vv8\" (UniqueName: \"kubernetes.io/projected/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-kube-api-access-c9vv8\") pod \"ovn-controller-zgkvs-config-9h94p\" (UID: \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\") " pod="openstack/ovn-controller-zgkvs-config-9h94p"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.661658 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-var-run\") pod \"ovn-controller-zgkvs-config-9h94p\" (UID: \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\") " pod="openstack/ovn-controller-zgkvs-config-9h94p"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.661688 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-scripts\") pod \"ovn-controller-zgkvs-config-9h94p\" (UID: \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\") " pod="openstack/ovn-controller-zgkvs-config-9h94p"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.661735 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-var-log-ovn\") pod \"ovn-controller-zgkvs-config-9h94p\" (UID: \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\") " pod="openstack/ovn-controller-zgkvs-config-9h94p"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.661773 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-var-run-ovn\") pod \"ovn-controller-zgkvs-config-9h94p\" (UID: \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\") " pod="openstack/ovn-controller-zgkvs-config-9h94p"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.661817 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-additional-scripts\") pod \"ovn-controller-zgkvs-config-9h94p\" (UID: \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\") " pod="openstack/ovn-controller-zgkvs-config-9h94p"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.662014 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-var-run\") pod \"ovn-controller-zgkvs-config-9h94p\" (UID: \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\") " pod="openstack/ovn-controller-zgkvs-config-9h94p"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.662091 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-var-run-ovn\") pod \"ovn-controller-zgkvs-config-9h94p\" (UID: \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\") " pod="openstack/ovn-controller-zgkvs-config-9h94p"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.662102 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-var-log-ovn\") pod \"ovn-controller-zgkvs-config-9h94p\" (UID: \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\") " pod="openstack/ovn-controller-zgkvs-config-9h94p"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.662684 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-additional-scripts\") pod \"ovn-controller-zgkvs-config-9h94p\" (UID: \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\") " pod="openstack/ovn-controller-zgkvs-config-9h94p"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.664154 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-scripts\") pod \"ovn-controller-zgkvs-config-9h94p\" (UID: \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\") " pod="openstack/ovn-controller-zgkvs-config-9h94p"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.696214 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9vv8\" (UniqueName: \"kubernetes.io/projected/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-kube-api-access-c9vv8\") pod \"ovn-controller-zgkvs-config-9h94p\" (UID: \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\") " pod="openstack/ovn-controller-zgkvs-config-9h94p"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.728595 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zgkvs-config-9h94p"
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.750905 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18526bd6-7184-4e92-8bb6-f85ec1aa3f30","Type":"ContainerStarted","Data":"02d06cbd6a8d0f7b3ff7b678dc8a679e3205b8525b993a5469d8b39e5c23e898"}
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.750962 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18526bd6-7184-4e92-8bb6-f85ec1aa3f30","Type":"ContainerStarted","Data":"a0122e4b3e501087ab7e56a8bad585c7d84bf68610ce51635d1c0f6671589e57"}
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.750979 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18526bd6-7184-4e92-8bb6-f85ec1aa3f30","Type":"ContainerStarted","Data":"6fc7059c0961dc150a24711261ceec4f3b0cb88d56b9f637e5b9179adc0393b3"}
Mar 18 12:33:11 crc kubenswrapper[4975]: I0318 12:33:11.750992 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18526bd6-7184-4e92-8bb6-f85ec1aa3f30","Type":"ContainerStarted","Data":"51719899423e2fdffa8f45417ca87526b59a78155d85b5da0c81a58d66ceae1e"}
Mar 18 12:33:12 crc kubenswrapper[4975]: I0318 12:33:12.047618 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zgkvs-config-9h94p"]
Mar 18 12:33:12 crc kubenswrapper[4975]: I0318 12:33:12.782993 4975 generic.go:334] "Generic (PLEG): container finished" podID="76a0c578-e44a-4cb6-a4a8-69f35dcfdaca" containerID="f9933a6af15b43d91daaeb254728f8dfe22668fead78e658541acd0a5d81bf9b" exitCode=0
Mar 18 12:33:12 crc kubenswrapper[4975]: I0318 12:33:12.783264 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zgkvs-config-9h94p" event={"ID":"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca","Type":"ContainerDied","Data":"f9933a6af15b43d91daaeb254728f8dfe22668fead78e658541acd0a5d81bf9b"}
Mar 18 12:33:12 crc kubenswrapper[4975]: I0318 12:33:12.783290 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zgkvs-config-9h94p" event={"ID":"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca","Type":"ContainerStarted","Data":"431f841b9272eafdea895c73e2715e99172d12acb0ede5b08bb622112005c80a"}
Mar 18 12:33:13 crc kubenswrapper[4975]: I0318 12:33:13.030497 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03cf0ce-65de-4cad-a3b8-d421f8ac88b3" path="/var/lib/kubelet/pods/c03cf0ce-65de-4cad-a3b8-d421f8ac88b3/volumes"
Mar 18 12:33:13 crc kubenswrapper[4975]: I0318 12:33:13.795204 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18526bd6-7184-4e92-8bb6-f85ec1aa3f30","Type":"ContainerStarted","Data":"774e5aa0c7a4215ccee3142423941055bae9bf037ba39c910cd36478daeb71d3"}
Mar 18 12:33:13 crc kubenswrapper[4975]: I0318 12:33:13.795561 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18526bd6-7184-4e92-8bb6-f85ec1aa3f30","Type":"ContainerStarted","Data":"e5d4db5c915184a14fba88bf639c35c3c6fc12dab23e69996144879e78bad48c"}
Mar 18 12:33:13 crc kubenswrapper[4975]: I0318 12:33:13.795577 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18526bd6-7184-4e92-8bb6-f85ec1aa3f30","Type":"ContainerStarted","Data":"e4db7614078a52e6401712b2cbacb161897a9280026aa4b8662887f15ca1dbb9"}
Mar 18 12:33:13 crc kubenswrapper[4975]: I0318 12:33:13.795589 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18526bd6-7184-4e92-8bb6-f85ec1aa3f30","Type":"ContainerStarted","Data":"7d1cd55374b6c4b482da7a946bac548a117484c65e745de3fe4b8ca4e7ee8279"}
Mar 18 12:33:14 crc kubenswrapper[4975]: I0318 12:33:14.080310 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zgkvs-config-9h94p"
Mar 18 12:33:14 crc kubenswrapper[4975]: I0318 12:33:14.123469 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-var-run\") pod \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\" (UID: \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\") "
Mar 18 12:33:14 crc kubenswrapper[4975]: I0318 12:33:14.123529 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9vv8\" (UniqueName: \"kubernetes.io/projected/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-kube-api-access-c9vv8\") pod \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\" (UID: \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\") "
Mar 18 12:33:14 crc kubenswrapper[4975]: I0318 12:33:14.123567 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-var-run-ovn\") pod \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\" (UID: \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\") "
Mar 18 12:33:14 crc kubenswrapper[4975]: I0318 12:33:14.123603 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-scripts\") pod \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\" (UID: \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\") "
Mar 18 12:33:14 crc kubenswrapper[4975]: I0318 12:33:14.123635 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-additional-scripts\") pod \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\" (UID: \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\") "
Mar 18 12:33:14 crc kubenswrapper[4975]: I0318 12:33:14.123630 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-var-run" (OuterVolumeSpecName: "var-run") pod "76a0c578-e44a-4cb6-a4a8-69f35dcfdaca" (UID: "76a0c578-e44a-4cb6-a4a8-69f35dcfdaca"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:33:14 crc kubenswrapper[4975]: I0318 12:33:14.123710 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "76a0c578-e44a-4cb6-a4a8-69f35dcfdaca" (UID: "76a0c578-e44a-4cb6-a4a8-69f35dcfdaca"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:33:14 crc kubenswrapper[4975]: I0318 12:33:14.123746 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-var-log-ovn\") pod \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\" (UID: \"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca\") "
Mar 18 12:33:14 crc kubenswrapper[4975]: I0318 12:33:14.124198 4975 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-var-run\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:14 crc kubenswrapper[4975]: I0318 12:33:14.124211 4975 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-var-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 18 12:33:14 crc kubenswrapper[4975]: I0318 12:33:14.125779 4975
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-scripts" (OuterVolumeSpecName: "scripts") pod "76a0c578-e44a-4cb6-a4a8-69f35dcfdaca" (UID: "76a0c578-e44a-4cb6-a4a8-69f35dcfdaca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:14 crc kubenswrapper[4975]: I0318 12:33:14.126172 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "76a0c578-e44a-4cb6-a4a8-69f35dcfdaca" (UID: "76a0c578-e44a-4cb6-a4a8-69f35dcfdaca"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:33:14 crc kubenswrapper[4975]: I0318 12:33:14.126294 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "76a0c578-e44a-4cb6-a4a8-69f35dcfdaca" (UID: "76a0c578-e44a-4cb6-a4a8-69f35dcfdaca"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:14 crc kubenswrapper[4975]: I0318 12:33:14.128775 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-kube-api-access-c9vv8" (OuterVolumeSpecName: "kube-api-access-c9vv8") pod "76a0c578-e44a-4cb6-a4a8-69f35dcfdaca" (UID: "76a0c578-e44a-4cb6-a4a8-69f35dcfdaca"). InnerVolumeSpecName "kube-api-access-c9vv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:14 crc kubenswrapper[4975]: I0318 12:33:14.226217 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9vv8\" (UniqueName: \"kubernetes.io/projected/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-kube-api-access-c9vv8\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:14 crc kubenswrapper[4975]: I0318 12:33:14.226255 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:14 crc kubenswrapper[4975]: I0318 12:33:14.226268 4975 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:14 crc kubenswrapper[4975]: I0318 12:33:14.226281 4975 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:14 crc kubenswrapper[4975]: I0318 12:33:14.813494 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zgkvs-config-9h94p" event={"ID":"76a0c578-e44a-4cb6-a4a8-69f35dcfdaca","Type":"ContainerDied","Data":"431f841b9272eafdea895c73e2715e99172d12acb0ede5b08bb622112005c80a"} Mar 18 12:33:14 crc kubenswrapper[4975]: I0318 12:33:14.813548 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="431f841b9272eafdea895c73e2715e99172d12acb0ede5b08bb622112005c80a" Mar 18 12:33:14 crc kubenswrapper[4975]: I0318 12:33:14.813632 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zgkvs-config-9h94p" Mar 18 12:33:15 crc kubenswrapper[4975]: I0318 12:33:15.182387 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-zgkvs-config-9h94p"] Mar 18 12:33:15 crc kubenswrapper[4975]: I0318 12:33:15.192348 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-zgkvs-config-9h94p"] Mar 18 12:33:15 crc kubenswrapper[4975]: I0318 12:33:15.825093 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18526bd6-7184-4e92-8bb6-f85ec1aa3f30","Type":"ContainerStarted","Data":"8c483a339294ba0db14ed2b613bb5900c85cca4c2129afabc7f344246859822f"} Mar 18 12:33:15 crc kubenswrapper[4975]: I0318 12:33:15.826425 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18526bd6-7184-4e92-8bb6-f85ec1aa3f30","Type":"ContainerStarted","Data":"3cb3c2dbbf46e63d62fcd106bd0df90fcb100abf2ce46c96f6fcd28cb2ef6313"} Mar 18 12:33:15 crc kubenswrapper[4975]: I0318 12:33:15.826511 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18526bd6-7184-4e92-8bb6-f85ec1aa3f30","Type":"ContainerStarted","Data":"95f97d0a057d4453cff9ab00a3b4dd944c980de1c1e04be764be4ca4b31680af"} Mar 18 12:33:15 crc kubenswrapper[4975]: I0318 12:33:15.826605 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18526bd6-7184-4e92-8bb6-f85ec1aa3f30","Type":"ContainerStarted","Data":"ba53af4dc12bf6282f43a2a33930852e47aea9df5be4bf882d41e727d52141a4"} Mar 18 12:33:16 crc kubenswrapper[4975]: I0318 12:33:16.834074 4975 generic.go:334] "Generic (PLEG): container finished" podID="6e796a56-c2ec-40c2-8604-b14e71255013" containerID="b04f1fba8268cb39373ff8520437c05fbbb63aedf49c5e72dd48f279c95e99a3" exitCode=0 Mar 18 12:33:16 crc kubenswrapper[4975]: I0318 12:33:16.834162 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-sync-2zdjk" event={"ID":"6e796a56-c2ec-40c2-8604-b14e71255013","Type":"ContainerDied","Data":"b04f1fba8268cb39373ff8520437c05fbbb63aedf49c5e72dd48f279c95e99a3"} Mar 18 12:33:16 crc kubenswrapper[4975]: I0318 12:33:16.840557 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18526bd6-7184-4e92-8bb6-f85ec1aa3f30","Type":"ContainerStarted","Data":"74b72217240f4613492062e75c55fdc64ca55de5b90536dd8de7e95326e04a8c"} Mar 18 12:33:16 crc kubenswrapper[4975]: I0318 12:33:16.840603 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18526bd6-7184-4e92-8bb6-f85ec1aa3f30","Type":"ContainerStarted","Data":"14c2610caa0f9e44fc0c01627eb9521d2c68d5ba53f218187deda214380a1241"} Mar 18 12:33:16 crc kubenswrapper[4975]: I0318 12:33:16.840626 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"18526bd6-7184-4e92-8bb6-f85ec1aa3f30","Type":"ContainerStarted","Data":"d1493cef604283ed64722d536faf26ee9d67e210fda9f091804ead622bef7266"} Mar 18 12:33:16 crc kubenswrapper[4975]: I0318 12:33:16.890190 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.067473898 podStartE2EDuration="41.890174194s" podCreationTimestamp="2026-03-18 12:32:35 +0000 UTC" firstStartedPulling="2026-03-18 12:33:09.923337513 +0000 UTC m=+1375.637738092" lastFinishedPulling="2026-03-18 12:33:14.746037809 +0000 UTC m=+1380.460438388" observedRunningTime="2026-03-18 12:33:16.887934532 +0000 UTC m=+1382.602335131" watchObservedRunningTime="2026-03-18 12:33:16.890174194 +0000 UTC m=+1382.604574773" Mar 18 12:33:17 crc kubenswrapper[4975]: I0318 12:33:17.025389 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76a0c578-e44a-4cb6-a4a8-69f35dcfdaca" path="/var/lib/kubelet/pods/76a0c578-e44a-4cb6-a4a8-69f35dcfdaca/volumes" Mar 18 12:33:17 crc kubenswrapper[4975]: 
I0318 12:33:17.172029 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-7hqbr"] Mar 18 12:33:17 crc kubenswrapper[4975]: E0318 12:33:17.172387 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a0c578-e44a-4cb6-a4a8-69f35dcfdaca" containerName="ovn-config" Mar 18 12:33:17 crc kubenswrapper[4975]: I0318 12:33:17.172411 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="76a0c578-e44a-4cb6-a4a8-69f35dcfdaca" containerName="ovn-config" Mar 18 12:33:17 crc kubenswrapper[4975]: I0318 12:33:17.172606 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="76a0c578-e44a-4cb6-a4a8-69f35dcfdaca" containerName="ovn-config" Mar 18 12:33:17 crc kubenswrapper[4975]: I0318 12:33:17.173762 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" Mar 18 12:33:17 crc kubenswrapper[4975]: I0318 12:33:17.190541 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 18 12:33:17 crc kubenswrapper[4975]: I0318 12:33:17.222043 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-7hqbr"] Mar 18 12:33:17 crc kubenswrapper[4975]: I0318 12:33:17.276798 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-7hqbr\" (UID: \"f9b62c54-4223-4370-bd91-e5a23eb874cf\") " pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" Mar 18 12:33:17 crc kubenswrapper[4975]: I0318 12:33:17.276855 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-config\") pod \"dnsmasq-dns-764c5664d7-7hqbr\" (UID: \"f9b62c54-4223-4370-bd91-e5a23eb874cf\") " 
pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" Mar 18 12:33:17 crc kubenswrapper[4975]: I0318 12:33:17.276973 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-dns-svc\") pod \"dnsmasq-dns-764c5664d7-7hqbr\" (UID: \"f9b62c54-4223-4370-bd91-e5a23eb874cf\") " pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" Mar 18 12:33:17 crc kubenswrapper[4975]: I0318 12:33:17.276994 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkl2q\" (UniqueName: \"kubernetes.io/projected/f9b62c54-4223-4370-bd91-e5a23eb874cf-kube-api-access-rkl2q\") pod \"dnsmasq-dns-764c5664d7-7hqbr\" (UID: \"f9b62c54-4223-4370-bd91-e5a23eb874cf\") " pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" Mar 18 12:33:17 crc kubenswrapper[4975]: I0318 12:33:17.277013 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-7hqbr\" (UID: \"f9b62c54-4223-4370-bd91-e5a23eb874cf\") " pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" Mar 18 12:33:17 crc kubenswrapper[4975]: I0318 12:33:17.277143 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-7hqbr\" (UID: \"f9b62c54-4223-4370-bd91-e5a23eb874cf\") " pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" Mar 18 12:33:17 crc kubenswrapper[4975]: I0318 12:33:17.378576 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-7hqbr\" (UID: 
\"f9b62c54-4223-4370-bd91-e5a23eb874cf\") " pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" Mar 18 12:33:17 crc kubenswrapper[4975]: I0318 12:33:17.379011 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-7hqbr\" (UID: \"f9b62c54-4223-4370-bd91-e5a23eb874cf\") " pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" Mar 18 12:33:17 crc kubenswrapper[4975]: I0318 12:33:17.379230 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-config\") pod \"dnsmasq-dns-764c5664d7-7hqbr\" (UID: \"f9b62c54-4223-4370-bd91-e5a23eb874cf\") " pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" Mar 18 12:33:17 crc kubenswrapper[4975]: I0318 12:33:17.379429 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-dns-svc\") pod \"dnsmasq-dns-764c5664d7-7hqbr\" (UID: \"f9b62c54-4223-4370-bd91-e5a23eb874cf\") " pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" Mar 18 12:33:17 crc kubenswrapper[4975]: I0318 12:33:17.379580 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkl2q\" (UniqueName: \"kubernetes.io/projected/f9b62c54-4223-4370-bd91-e5a23eb874cf-kube-api-access-rkl2q\") pod \"dnsmasq-dns-764c5664d7-7hqbr\" (UID: \"f9b62c54-4223-4370-bd91-e5a23eb874cf\") " pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" Mar 18 12:33:17 crc kubenswrapper[4975]: I0318 12:33:17.379723 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-7hqbr\" (UID: \"f9b62c54-4223-4370-bd91-e5a23eb874cf\") " 
pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" Mar 18 12:33:17 crc kubenswrapper[4975]: I0318 12:33:17.379892 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-7hqbr\" (UID: \"f9b62c54-4223-4370-bd91-e5a23eb874cf\") " pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" Mar 18 12:33:17 crc kubenswrapper[4975]: I0318 12:33:17.380531 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-config\") pod \"dnsmasq-dns-764c5664d7-7hqbr\" (UID: \"f9b62c54-4223-4370-bd91-e5a23eb874cf\") " pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" Mar 18 12:33:17 crc kubenswrapper[4975]: I0318 12:33:17.380679 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-dns-svc\") pod \"dnsmasq-dns-764c5664d7-7hqbr\" (UID: \"f9b62c54-4223-4370-bd91-e5a23eb874cf\") " pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" Mar 18 12:33:17 crc kubenswrapper[4975]: I0318 12:33:17.380934 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-7hqbr\" (UID: \"f9b62c54-4223-4370-bd91-e5a23eb874cf\") " pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" Mar 18 12:33:17 crc kubenswrapper[4975]: I0318 12:33:17.382088 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-7hqbr\" (UID: \"f9b62c54-4223-4370-bd91-e5a23eb874cf\") " pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" Mar 18 12:33:17 crc kubenswrapper[4975]: I0318 12:33:17.405504 4975 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkl2q\" (UniqueName: \"kubernetes.io/projected/f9b62c54-4223-4370-bd91-e5a23eb874cf-kube-api-access-rkl2q\") pod \"dnsmasq-dns-764c5664d7-7hqbr\" (UID: \"f9b62c54-4223-4370-bd91-e5a23eb874cf\") " pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" Mar 18 12:33:17 crc kubenswrapper[4975]: I0318 12:33:17.490757 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" Mar 18 12:33:17 crc kubenswrapper[4975]: I0318 12:33:17.949643 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-7hqbr"] Mar 18 12:33:17 crc kubenswrapper[4975]: W0318 12:33:17.953570 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9b62c54_4223_4370_bd91_e5a23eb874cf.slice/crio-c42492d1a7566498fe2d0c3da23f3a8ce4efb2350bc11ac3a47a4db26486bcc3 WatchSource:0}: Error finding container c42492d1a7566498fe2d0c3da23f3a8ce4efb2350bc11ac3a47a4db26486bcc3: Status 404 returned error can't find the container with id c42492d1a7566498fe2d0c3da23f3a8ce4efb2350bc11ac3a47a4db26486bcc3 Mar 18 12:33:18 crc kubenswrapper[4975]: I0318 12:33:18.147453 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-2zdjk" Mar 18 12:33:18 crc kubenswrapper[4975]: I0318 12:33:18.200771 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9slb\" (UniqueName: \"kubernetes.io/projected/6e796a56-c2ec-40c2-8604-b14e71255013-kube-api-access-d9slb\") pod \"6e796a56-c2ec-40c2-8604-b14e71255013\" (UID: \"6e796a56-c2ec-40c2-8604-b14e71255013\") " Mar 18 12:33:18 crc kubenswrapper[4975]: I0318 12:33:18.201013 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6e796a56-c2ec-40c2-8604-b14e71255013-db-sync-config-data\") pod \"6e796a56-c2ec-40c2-8604-b14e71255013\" (UID: \"6e796a56-c2ec-40c2-8604-b14e71255013\") " Mar 18 12:33:18 crc kubenswrapper[4975]: I0318 12:33:18.201855 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e796a56-c2ec-40c2-8604-b14e71255013-combined-ca-bundle\") pod \"6e796a56-c2ec-40c2-8604-b14e71255013\" (UID: \"6e796a56-c2ec-40c2-8604-b14e71255013\") " Mar 18 12:33:18 crc kubenswrapper[4975]: I0318 12:33:18.201962 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e796a56-c2ec-40c2-8604-b14e71255013-config-data\") pod \"6e796a56-c2ec-40c2-8604-b14e71255013\" (UID: \"6e796a56-c2ec-40c2-8604-b14e71255013\") " Mar 18 12:33:18 crc kubenswrapper[4975]: I0318 12:33:18.205657 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e796a56-c2ec-40c2-8604-b14e71255013-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6e796a56-c2ec-40c2-8604-b14e71255013" (UID: "6e796a56-c2ec-40c2-8604-b14e71255013"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:18 crc kubenswrapper[4975]: I0318 12:33:18.205812 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e796a56-c2ec-40c2-8604-b14e71255013-kube-api-access-d9slb" (OuterVolumeSpecName: "kube-api-access-d9slb") pod "6e796a56-c2ec-40c2-8604-b14e71255013" (UID: "6e796a56-c2ec-40c2-8604-b14e71255013"). InnerVolumeSpecName "kube-api-access-d9slb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:18 crc kubenswrapper[4975]: I0318 12:33:18.232267 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e796a56-c2ec-40c2-8604-b14e71255013-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e796a56-c2ec-40c2-8604-b14e71255013" (UID: "6e796a56-c2ec-40c2-8604-b14e71255013"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:18 crc kubenswrapper[4975]: I0318 12:33:18.248766 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e796a56-c2ec-40c2-8604-b14e71255013-config-data" (OuterVolumeSpecName: "config-data") pod "6e796a56-c2ec-40c2-8604-b14e71255013" (UID: "6e796a56-c2ec-40c2-8604-b14e71255013"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:18 crc kubenswrapper[4975]: I0318 12:33:18.304312 4975 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6e796a56-c2ec-40c2-8604-b14e71255013-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:18 crc kubenswrapper[4975]: I0318 12:33:18.304343 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e796a56-c2ec-40c2-8604-b14e71255013-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:18 crc kubenswrapper[4975]: I0318 12:33:18.304352 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e796a56-c2ec-40c2-8604-b14e71255013-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:18 crc kubenswrapper[4975]: I0318 12:33:18.304363 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9slb\" (UniqueName: \"kubernetes.io/projected/6e796a56-c2ec-40c2-8604-b14e71255013-kube-api-access-d9slb\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:18 crc kubenswrapper[4975]: I0318 12:33:18.858352 4975 generic.go:334] "Generic (PLEG): container finished" podID="f9b62c54-4223-4370-bd91-e5a23eb874cf" containerID="c0e3b10b80292a52da162f3054a1581450f685483611b71d17cdc654378a0b2a" exitCode=0 Mar 18 12:33:18 crc kubenswrapper[4975]: I0318 12:33:18.858412 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" event={"ID":"f9b62c54-4223-4370-bd91-e5a23eb874cf","Type":"ContainerDied","Data":"c0e3b10b80292a52da162f3054a1581450f685483611b71d17cdc654378a0b2a"} Mar 18 12:33:18 crc kubenswrapper[4975]: I0318 12:33:18.859056 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" 
event={"ID":"f9b62c54-4223-4370-bd91-e5a23eb874cf","Type":"ContainerStarted","Data":"c42492d1a7566498fe2d0c3da23f3a8ce4efb2350bc11ac3a47a4db26486bcc3"} Mar 18 12:33:18 crc kubenswrapper[4975]: I0318 12:33:18.860902 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2zdjk" event={"ID":"6e796a56-c2ec-40c2-8604-b14e71255013","Type":"ContainerDied","Data":"a3de3462bfb98efca7c24b1be88995b1c4e9230fee9780cd764c65585db9c18f"} Mar 18 12:33:18 crc kubenswrapper[4975]: I0318 12:33:18.860945 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3de3462bfb98efca7c24b1be88995b1c4e9230fee9780cd764c65585db9c18f" Mar 18 12:33:18 crc kubenswrapper[4975]: I0318 12:33:18.860999 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-2zdjk" Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.229925 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-7hqbr"] Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.284926 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-8wr26"] Mar 18 12:33:19 crc kubenswrapper[4975]: E0318 12:33:19.285732 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e796a56-c2ec-40c2-8604-b14e71255013" containerName="glance-db-sync" Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.285799 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e796a56-c2ec-40c2-8604-b14e71255013" containerName="glance-db-sync" Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.286057 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e796a56-c2ec-40c2-8604-b14e71255013" containerName="glance-db-sync" Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.286923 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.296637 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-8wr26"] Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.355030 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-config\") pod \"dnsmasq-dns-74f6bcbc87-8wr26\" (UID: \"42860e95-d4ae-438b-a858-a88e69058574\") " pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.355097 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-8wr26\" (UID: \"42860e95-d4ae-438b-a858-a88e69058574\") " pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.355316 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-8wr26\" (UID: \"42860e95-d4ae-438b-a858-a88e69058574\") " pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.355408 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-8wr26\" (UID: \"42860e95-d4ae-438b-a858-a88e69058574\") " pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.355469 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-8wr26\" (UID: \"42860e95-d4ae-438b-a858-a88e69058574\") " pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.355571 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-687lg\" (UniqueName: \"kubernetes.io/projected/42860e95-d4ae-438b-a858-a88e69058574-kube-api-access-687lg\") pod \"dnsmasq-dns-74f6bcbc87-8wr26\" (UID: \"42860e95-d4ae-438b-a858-a88e69058574\") " pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.456849 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-687lg\" (UniqueName: \"kubernetes.io/projected/42860e95-d4ae-438b-a858-a88e69058574-kube-api-access-687lg\") pod \"dnsmasq-dns-74f6bcbc87-8wr26\" (UID: \"42860e95-d4ae-438b-a858-a88e69058574\") " pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.456975 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-config\") pod \"dnsmasq-dns-74f6bcbc87-8wr26\" (UID: \"42860e95-d4ae-438b-a858-a88e69058574\") " pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.457011 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-8wr26\" (UID: \"42860e95-d4ae-438b-a858-a88e69058574\") " pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.457054 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-8wr26\" (UID: \"42860e95-d4ae-438b-a858-a88e69058574\") " pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.457074 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-8wr26\" (UID: \"42860e95-d4ae-438b-a858-a88e69058574\") " pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.457093 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-8wr26\" (UID: \"42860e95-d4ae-438b-a858-a88e69058574\") " pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.457950 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-8wr26\" (UID: \"42860e95-d4ae-438b-a858-a88e69058574\") " pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.458106 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-8wr26\" (UID: \"42860e95-d4ae-438b-a858-a88e69058574\") " pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.458240 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-8wr26\" (UID: \"42860e95-d4ae-438b-a858-a88e69058574\") " pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.458308 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-config\") pod \"dnsmasq-dns-74f6bcbc87-8wr26\" (UID: \"42860e95-d4ae-438b-a858-a88e69058574\") " pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.458687 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-8wr26\" (UID: \"42860e95-d4ae-438b-a858-a88e69058574\") " pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.472725 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-687lg\" (UniqueName: \"kubernetes.io/projected/42860e95-d4ae-438b-a858-a88e69058574-kube-api-access-687lg\") pod \"dnsmasq-dns-74f6bcbc87-8wr26\" (UID: \"42860e95-d4ae-438b-a858-a88e69058574\") " pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.603745 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.869129 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" event={"ID":"f9b62c54-4223-4370-bd91-e5a23eb874cf","Type":"ContainerStarted","Data":"ab87c56751c89be4c7872f1cbb2843537601f01a170ecbc0b25b8ec974ec1fc1"} Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.869404 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" Mar 18 12:33:19 crc kubenswrapper[4975]: I0318 12:33:19.889163 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" podStartSLOduration=2.889140201 podStartE2EDuration="2.889140201s" podCreationTimestamp="2026-03-18 12:33:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:19.886343804 +0000 UTC m=+1385.600744383" watchObservedRunningTime="2026-03-18 12:33:19.889140201 +0000 UTC m=+1385.603540780" Mar 18 12:33:20 crc kubenswrapper[4975]: I0318 12:33:20.006606 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-8wr26"] Mar 18 12:33:20 crc kubenswrapper[4975]: W0318 12:33:20.014846 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42860e95_d4ae_438b_a858_a88e69058574.slice/crio-493e45eea83cd75bf3afb807e619c71f7d80307f6ffd452c8d71fdc7c4f675bf WatchSource:0}: Error finding container 493e45eea83cd75bf3afb807e619c71f7d80307f6ffd452c8d71fdc7c4f675bf: Status 404 returned error can't find the container with id 493e45eea83cd75bf3afb807e619c71f7d80307f6ffd452c8d71fdc7c4f675bf Mar 18 12:33:20 crc kubenswrapper[4975]: I0318 12:33:20.321019 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:33:20 crc kubenswrapper[4975]: I0318 12:33:20.783126 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 18 12:33:20 crc kubenswrapper[4975]: I0318 12:33:20.879456 4975 generic.go:334] "Generic (PLEG): container finished" podID="42860e95-d4ae-438b-a858-a88e69058574" containerID="834759c41c13b167fbfb55bdc13c1d44699516e10324abbee15606cb67beb04d" exitCode=0 Mar 18 12:33:20 crc kubenswrapper[4975]: I0318 12:33:20.879671 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" podUID="f9b62c54-4223-4370-bd91-e5a23eb874cf" containerName="dnsmasq-dns" containerID="cri-o://ab87c56751c89be4c7872f1cbb2843537601f01a170ecbc0b25b8ec974ec1fc1" gracePeriod=10 Mar 18 12:33:20 crc kubenswrapper[4975]: I0318 12:33:20.881437 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" event={"ID":"42860e95-d4ae-438b-a858-a88e69058574","Type":"ContainerDied","Data":"834759c41c13b167fbfb55bdc13c1d44699516e10324abbee15606cb67beb04d"} Mar 18 12:33:20 crc kubenswrapper[4975]: I0318 12:33:20.881479 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" event={"ID":"42860e95-d4ae-438b-a858-a88e69058574","Type":"ContainerStarted","Data":"493e45eea83cd75bf3afb807e619c71f7d80307f6ffd452c8d71fdc7c4f675bf"} Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.224252 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.311523 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-dns-svc\") pod \"f9b62c54-4223-4370-bd91-e5a23eb874cf\" (UID: \"f9b62c54-4223-4370-bd91-e5a23eb874cf\") " Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.311569 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkl2q\" (UniqueName: \"kubernetes.io/projected/f9b62c54-4223-4370-bd91-e5a23eb874cf-kube-api-access-rkl2q\") pod \"f9b62c54-4223-4370-bd91-e5a23eb874cf\" (UID: \"f9b62c54-4223-4370-bd91-e5a23eb874cf\") " Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.311606 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-ovsdbserver-sb\") pod \"f9b62c54-4223-4370-bd91-e5a23eb874cf\" (UID: \"f9b62c54-4223-4370-bd91-e5a23eb874cf\") " Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.311647 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-config\") pod \"f9b62c54-4223-4370-bd91-e5a23eb874cf\" (UID: \"f9b62c54-4223-4370-bd91-e5a23eb874cf\") " Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.311714 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-dns-swift-storage-0\") pod \"f9b62c54-4223-4370-bd91-e5a23eb874cf\" (UID: \"f9b62c54-4223-4370-bd91-e5a23eb874cf\") " Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.311805 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-ovsdbserver-nb\") pod \"f9b62c54-4223-4370-bd91-e5a23eb874cf\" (UID: \"f9b62c54-4223-4370-bd91-e5a23eb874cf\") " Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.334536 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9b62c54-4223-4370-bd91-e5a23eb874cf-kube-api-access-rkl2q" (OuterVolumeSpecName: "kube-api-access-rkl2q") pod "f9b62c54-4223-4370-bd91-e5a23eb874cf" (UID: "f9b62c54-4223-4370-bd91-e5a23eb874cf"). InnerVolumeSpecName "kube-api-access-rkl2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.360152 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f9b62c54-4223-4370-bd91-e5a23eb874cf" (UID: "f9b62c54-4223-4370-bd91-e5a23eb874cf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.360431 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-config" (OuterVolumeSpecName: "config") pod "f9b62c54-4223-4370-bd91-e5a23eb874cf" (UID: "f9b62c54-4223-4370-bd91-e5a23eb874cf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.360997 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f9b62c54-4223-4370-bd91-e5a23eb874cf" (UID: "f9b62c54-4223-4370-bd91-e5a23eb874cf"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.363204 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f9b62c54-4223-4370-bd91-e5a23eb874cf" (UID: "f9b62c54-4223-4370-bd91-e5a23eb874cf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.379217 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f9b62c54-4223-4370-bd91-e5a23eb874cf" (UID: "f9b62c54-4223-4370-bd91-e5a23eb874cf"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.413287 4975 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.413329 4975 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.413341 4975 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.413353 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkl2q\" (UniqueName: 
\"kubernetes.io/projected/f9b62c54-4223-4370-bd91-e5a23eb874cf-kube-api-access-rkl2q\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.413366 4975 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.413377 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b62c54-4223-4370-bd91-e5a23eb874cf-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.888964 4975 generic.go:334] "Generic (PLEG): container finished" podID="f9b62c54-4223-4370-bd91-e5a23eb874cf" containerID="ab87c56751c89be4c7872f1cbb2843537601f01a170ecbc0b25b8ec974ec1fc1" exitCode=0 Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.889509 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.889532 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" event={"ID":"f9b62c54-4223-4370-bd91-e5a23eb874cf","Type":"ContainerDied","Data":"ab87c56751c89be4c7872f1cbb2843537601f01a170ecbc0b25b8ec974ec1fc1"} Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.889803 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-7hqbr" event={"ID":"f9b62c54-4223-4370-bd91-e5a23eb874cf","Type":"ContainerDied","Data":"c42492d1a7566498fe2d0c3da23f3a8ce4efb2350bc11ac3a47a4db26486bcc3"} Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.889841 4975 scope.go:117] "RemoveContainer" containerID="ab87c56751c89be4c7872f1cbb2843537601f01a170ecbc0b25b8ec974ec1fc1" Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.891109 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" event={"ID":"42860e95-d4ae-438b-a858-a88e69058574","Type":"ContainerStarted","Data":"3520ffb61ef42e02db934b0e54b706f21081f7d640fc5d002d78e673365101bf"} Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.891417 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.918379 4975 scope.go:117] "RemoveContainer" containerID="c0e3b10b80292a52da162f3054a1581450f685483611b71d17cdc654378a0b2a" Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.943158 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" podStartSLOduration=2.943130852 podStartE2EDuration="2.943130852s" podCreationTimestamp="2026-03-18 12:33:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:21.918643628 
+0000 UTC m=+1387.633044227" watchObservedRunningTime="2026-03-18 12:33:21.943130852 +0000 UTC m=+1387.657531431" Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.945350 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-7hqbr"] Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.949206 4975 scope.go:117] "RemoveContainer" containerID="ab87c56751c89be4c7872f1cbb2843537601f01a170ecbc0b25b8ec974ec1fc1" Mar 18 12:33:21 crc kubenswrapper[4975]: E0318 12:33:21.949653 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab87c56751c89be4c7872f1cbb2843537601f01a170ecbc0b25b8ec974ec1fc1\": container with ID starting with ab87c56751c89be4c7872f1cbb2843537601f01a170ecbc0b25b8ec974ec1fc1 not found: ID does not exist" containerID="ab87c56751c89be4c7872f1cbb2843537601f01a170ecbc0b25b8ec974ec1fc1" Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.949688 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab87c56751c89be4c7872f1cbb2843537601f01a170ecbc0b25b8ec974ec1fc1"} err="failed to get container status \"ab87c56751c89be4c7872f1cbb2843537601f01a170ecbc0b25b8ec974ec1fc1\": rpc error: code = NotFound desc = could not find container \"ab87c56751c89be4c7872f1cbb2843537601f01a170ecbc0b25b8ec974ec1fc1\": container with ID starting with ab87c56751c89be4c7872f1cbb2843537601f01a170ecbc0b25b8ec974ec1fc1 not found: ID does not exist" Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.949712 4975 scope.go:117] "RemoveContainer" containerID="c0e3b10b80292a52da162f3054a1581450f685483611b71d17cdc654378a0b2a" Mar 18 12:33:21 crc kubenswrapper[4975]: E0318 12:33:21.949975 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0e3b10b80292a52da162f3054a1581450f685483611b71d17cdc654378a0b2a\": container with ID starting with 
c0e3b10b80292a52da162f3054a1581450f685483611b71d17cdc654378a0b2a not found: ID does not exist" containerID="c0e3b10b80292a52da162f3054a1581450f685483611b71d17cdc654378a0b2a" Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.950002 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e3b10b80292a52da162f3054a1581450f685483611b71d17cdc654378a0b2a"} err="failed to get container status \"c0e3b10b80292a52da162f3054a1581450f685483611b71d17cdc654378a0b2a\": rpc error: code = NotFound desc = could not find container \"c0e3b10b80292a52da162f3054a1581450f685483611b71d17cdc654378a0b2a\": container with ID starting with c0e3b10b80292a52da162f3054a1581450f685483611b71d17cdc654378a0b2a not found: ID does not exist" Mar 18 12:33:21 crc kubenswrapper[4975]: I0318 12:33:21.952137 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-7hqbr"] Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.225068 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-gkcrs"] Mar 18 12:33:22 crc kubenswrapper[4975]: E0318 12:33:22.225523 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9b62c54-4223-4370-bd91-e5a23eb874cf" containerName="init" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.225546 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9b62c54-4223-4370-bd91-e5a23eb874cf" containerName="init" Mar 18 12:33:22 crc kubenswrapper[4975]: E0318 12:33:22.225583 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9b62c54-4223-4370-bd91-e5a23eb874cf" containerName="dnsmasq-dns" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.225593 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9b62c54-4223-4370-bd91-e5a23eb874cf" containerName="dnsmasq-dns" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.225773 4975 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f9b62c54-4223-4370-bd91-e5a23eb874cf" containerName="dnsmasq-dns" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.226397 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gkcrs" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.242381 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-gkcrs"] Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.326934 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c17426b-2e12-4e49-aaee-cb9d0f438552-operator-scripts\") pod \"cinder-db-create-gkcrs\" (UID: \"0c17426b-2e12-4e49-aaee-cb9d0f438552\") " pod="openstack/cinder-db-create-gkcrs" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.327239 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46zjc\" (UniqueName: \"kubernetes.io/projected/0c17426b-2e12-4e49-aaee-cb9d0f438552-kube-api-access-46zjc\") pod \"cinder-db-create-gkcrs\" (UID: \"0c17426b-2e12-4e49-aaee-cb9d0f438552\") " pod="openstack/cinder-db-create-gkcrs" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.341776 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-647b-account-create-update-d6jph"] Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.343825 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-647b-account-create-update-d6jph" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.349447 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.350786 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-647b-account-create-update-d6jph"] Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.428636 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c17426b-2e12-4e49-aaee-cb9d0f438552-operator-scripts\") pod \"cinder-db-create-gkcrs\" (UID: \"0c17426b-2e12-4e49-aaee-cb9d0f438552\") " pod="openstack/cinder-db-create-gkcrs" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.428678 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46zjc\" (UniqueName: \"kubernetes.io/projected/0c17426b-2e12-4e49-aaee-cb9d0f438552-kube-api-access-46zjc\") pod \"cinder-db-create-gkcrs\" (UID: \"0c17426b-2e12-4e49-aaee-cb9d0f438552\") " pod="openstack/cinder-db-create-gkcrs" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.428722 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72df1add-3fec-4e3b-baad-90ba808298f7-operator-scripts\") pod \"cinder-647b-account-create-update-d6jph\" (UID: \"72df1add-3fec-4e3b-baad-90ba808298f7\") " pod="openstack/cinder-647b-account-create-update-d6jph" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.428766 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm8pn\" (UniqueName: \"kubernetes.io/projected/72df1add-3fec-4e3b-baad-90ba808298f7-kube-api-access-rm8pn\") pod \"cinder-647b-account-create-update-d6jph\" (UID: 
\"72df1add-3fec-4e3b-baad-90ba808298f7\") " pod="openstack/cinder-647b-account-create-update-d6jph" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.429617 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c17426b-2e12-4e49-aaee-cb9d0f438552-operator-scripts\") pod \"cinder-db-create-gkcrs\" (UID: \"0c17426b-2e12-4e49-aaee-cb9d0f438552\") " pod="openstack/cinder-db-create-gkcrs" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.463570 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46zjc\" (UniqueName: \"kubernetes.io/projected/0c17426b-2e12-4e49-aaee-cb9d0f438552-kube-api-access-46zjc\") pod \"cinder-db-create-gkcrs\" (UID: \"0c17426b-2e12-4e49-aaee-cb9d0f438552\") " pod="openstack/cinder-db-create-gkcrs" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.528436 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-vp6r7"] Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.529946 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-vp6r7" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.530608 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm8pn\" (UniqueName: \"kubernetes.io/projected/72df1add-3fec-4e3b-baad-90ba808298f7-kube-api-access-rm8pn\") pod \"cinder-647b-account-create-update-d6jph\" (UID: \"72df1add-3fec-4e3b-baad-90ba808298f7\") " pod="openstack/cinder-647b-account-create-update-d6jph" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.530760 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72df1add-3fec-4e3b-baad-90ba808298f7-operator-scripts\") pod \"cinder-647b-account-create-update-d6jph\" (UID: \"72df1add-3fec-4e3b-baad-90ba808298f7\") " pod="openstack/cinder-647b-account-create-update-d6jph" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.531563 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72df1add-3fec-4e3b-baad-90ba808298f7-operator-scripts\") pod \"cinder-647b-account-create-update-d6jph\" (UID: \"72df1add-3fec-4e3b-baad-90ba808298f7\") " pod="openstack/cinder-647b-account-create-update-d6jph" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.543199 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-d456-account-create-update-htdrw"] Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.543740 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gkcrs" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.544159 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-d456-account-create-update-htdrw" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.546678 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.557563 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-vp6r7"] Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.591961 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d456-account-create-update-htdrw"] Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.594140 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm8pn\" (UniqueName: \"kubernetes.io/projected/72df1add-3fec-4e3b-baad-90ba808298f7-kube-api-access-rm8pn\") pod \"cinder-647b-account-create-update-d6jph\" (UID: \"72df1add-3fec-4e3b-baad-90ba808298f7\") " pod="openstack/cinder-647b-account-create-update-d6jph" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.634956 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fn5b\" (UniqueName: \"kubernetes.io/projected/91116f39-0e09-4921-8ee3-aaff9a89b610-kube-api-access-6fn5b\") pod \"barbican-d456-account-create-update-htdrw\" (UID: \"91116f39-0e09-4921-8ee3-aaff9a89b610\") " pod="openstack/barbican-d456-account-create-update-htdrw" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.635040 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac2a33aa-f2b3-4da7-9265-cc73a33649d3-operator-scripts\") pod \"barbican-db-create-vp6r7\" (UID: \"ac2a33aa-f2b3-4da7-9265-cc73a33649d3\") " pod="openstack/barbican-db-create-vp6r7" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.635091 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-dxfpf\" (UniqueName: \"kubernetes.io/projected/ac2a33aa-f2b3-4da7-9265-cc73a33649d3-kube-api-access-dxfpf\") pod \"barbican-db-create-vp6r7\" (UID: \"ac2a33aa-f2b3-4da7-9265-cc73a33649d3\") " pod="openstack/barbican-db-create-vp6r7" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.635122 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91116f39-0e09-4921-8ee3-aaff9a89b610-operator-scripts\") pod \"barbican-d456-account-create-update-htdrw\" (UID: \"91116f39-0e09-4921-8ee3-aaff9a89b610\") " pod="openstack/barbican-d456-account-create-update-htdrw" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.671317 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-647b-account-create-update-d6jph" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.695522 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-gjb7f"] Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.697735 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-gjb7f" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.717937 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-8mhpn"] Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.719185 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-8mhpn" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.723348 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.723564 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-d6c9r" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.723843 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.724087 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.739473 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fn5b\" (UniqueName: \"kubernetes.io/projected/91116f39-0e09-4921-8ee3-aaff9a89b610-kube-api-access-6fn5b\") pod \"barbican-d456-account-create-update-htdrw\" (UID: \"91116f39-0e09-4921-8ee3-aaff9a89b610\") " pod="openstack/barbican-d456-account-create-update-htdrw" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.739837 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac2a33aa-f2b3-4da7-9265-cc73a33649d3-operator-scripts\") pod \"barbican-db-create-vp6r7\" (UID: \"ac2a33aa-f2b3-4da7-9265-cc73a33649d3\") " pod="openstack/barbican-db-create-vp6r7" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.740344 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxfpf\" (UniqueName: \"kubernetes.io/projected/ac2a33aa-f2b3-4da7-9265-cc73a33649d3-kube-api-access-dxfpf\") pod \"barbican-db-create-vp6r7\" (UID: \"ac2a33aa-f2b3-4da7-9265-cc73a33649d3\") " pod="openstack/barbican-db-create-vp6r7" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.740676 4975 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91116f39-0e09-4921-8ee3-aaff9a89b610-operator-scripts\") pod \"barbican-d456-account-create-update-htdrw\" (UID: \"91116f39-0e09-4921-8ee3-aaff9a89b610\") " pod="openstack/barbican-d456-account-create-update-htdrw" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.741673 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91116f39-0e09-4921-8ee3-aaff9a89b610-operator-scripts\") pod \"barbican-d456-account-create-update-htdrw\" (UID: \"91116f39-0e09-4921-8ee3-aaff9a89b610\") " pod="openstack/barbican-d456-account-create-update-htdrw" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.745150 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac2a33aa-f2b3-4da7-9265-cc73a33649d3-operator-scripts\") pod \"barbican-db-create-vp6r7\" (UID: \"ac2a33aa-f2b3-4da7-9265-cc73a33649d3\") " pod="openstack/barbican-db-create-vp6r7" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.749163 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-8mhpn"] Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.760491 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fn5b\" (UniqueName: \"kubernetes.io/projected/91116f39-0e09-4921-8ee3-aaff9a89b610-kube-api-access-6fn5b\") pod \"barbican-d456-account-create-update-htdrw\" (UID: \"91116f39-0e09-4921-8ee3-aaff9a89b610\") " pod="openstack/barbican-d456-account-create-update-htdrw" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.763039 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-gjb7f"] Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.769061 4975 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-dxfpf\" (UniqueName: \"kubernetes.io/projected/ac2a33aa-f2b3-4da7-9265-cc73a33649d3-kube-api-access-dxfpf\") pod \"barbican-db-create-vp6r7\" (UID: \"ac2a33aa-f2b3-4da7-9265-cc73a33649d3\") " pod="openstack/barbican-db-create-vp6r7" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.777401 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-2d70-account-create-update-mt7tw"] Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.778601 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2d70-account-create-update-mt7tw" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.782981 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.795878 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-2d70-account-create-update-mt7tw"] Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.843220 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tcsd\" (UniqueName: \"kubernetes.io/projected/a00825d4-1eac-45e1-9eec-bdc12eb450a2-kube-api-access-9tcsd\") pod \"neutron-db-create-gjb7f\" (UID: \"a00825d4-1eac-45e1-9eec-bdc12eb450a2\") " pod="openstack/neutron-db-create-gjb7f" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.843270 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b50774ab-6273-44c8-ae0c-d6c77c3996fc-config-data\") pod \"keystone-db-sync-8mhpn\" (UID: \"b50774ab-6273-44c8-ae0c-d6c77c3996fc\") " pod="openstack/keystone-db-sync-8mhpn" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.843358 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x296n\" (UniqueName: 
\"kubernetes.io/projected/6bc880e4-546b-4dce-a6eb-39aef7dd9954-kube-api-access-x296n\") pod \"neutron-2d70-account-create-update-mt7tw\" (UID: \"6bc880e4-546b-4dce-a6eb-39aef7dd9954\") " pod="openstack/neutron-2d70-account-create-update-mt7tw" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.843400 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bc880e4-546b-4dce-a6eb-39aef7dd9954-operator-scripts\") pod \"neutron-2d70-account-create-update-mt7tw\" (UID: \"6bc880e4-546b-4dce-a6eb-39aef7dd9954\") " pod="openstack/neutron-2d70-account-create-update-mt7tw" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.843489 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlrxt\" (UniqueName: \"kubernetes.io/projected/b50774ab-6273-44c8-ae0c-d6c77c3996fc-kube-api-access-dlrxt\") pod \"keystone-db-sync-8mhpn\" (UID: \"b50774ab-6273-44c8-ae0c-d6c77c3996fc\") " pod="openstack/keystone-db-sync-8mhpn" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.843534 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b50774ab-6273-44c8-ae0c-d6c77c3996fc-combined-ca-bundle\") pod \"keystone-db-sync-8mhpn\" (UID: \"b50774ab-6273-44c8-ae0c-d6c77c3996fc\") " pod="openstack/keystone-db-sync-8mhpn" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.843580 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a00825d4-1eac-45e1-9eec-bdc12eb450a2-operator-scripts\") pod \"neutron-db-create-gjb7f\" (UID: \"a00825d4-1eac-45e1-9eec-bdc12eb450a2\") " pod="openstack/neutron-db-create-gjb7f" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.944662 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b50774ab-6273-44c8-ae0c-d6c77c3996fc-combined-ca-bundle\") pod \"keystone-db-sync-8mhpn\" (UID: \"b50774ab-6273-44c8-ae0c-d6c77c3996fc\") " pod="openstack/keystone-db-sync-8mhpn" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.944755 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a00825d4-1eac-45e1-9eec-bdc12eb450a2-operator-scripts\") pod \"neutron-db-create-gjb7f\" (UID: \"a00825d4-1eac-45e1-9eec-bdc12eb450a2\") " pod="openstack/neutron-db-create-gjb7f" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.944805 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tcsd\" (UniqueName: \"kubernetes.io/projected/a00825d4-1eac-45e1-9eec-bdc12eb450a2-kube-api-access-9tcsd\") pod \"neutron-db-create-gjb7f\" (UID: \"a00825d4-1eac-45e1-9eec-bdc12eb450a2\") " pod="openstack/neutron-db-create-gjb7f" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.944835 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b50774ab-6273-44c8-ae0c-d6c77c3996fc-config-data\") pod \"keystone-db-sync-8mhpn\" (UID: \"b50774ab-6273-44c8-ae0c-d6c77c3996fc\") " pod="openstack/keystone-db-sync-8mhpn" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.944906 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x296n\" (UniqueName: \"kubernetes.io/projected/6bc880e4-546b-4dce-a6eb-39aef7dd9954-kube-api-access-x296n\") pod \"neutron-2d70-account-create-update-mt7tw\" (UID: \"6bc880e4-546b-4dce-a6eb-39aef7dd9954\") " pod="openstack/neutron-2d70-account-create-update-mt7tw" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.944948 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bc880e4-546b-4dce-a6eb-39aef7dd9954-operator-scripts\") pod \"neutron-2d70-account-create-update-mt7tw\" (UID: \"6bc880e4-546b-4dce-a6eb-39aef7dd9954\") " pod="openstack/neutron-2d70-account-create-update-mt7tw" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.945000 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlrxt\" (UniqueName: \"kubernetes.io/projected/b50774ab-6273-44c8-ae0c-d6c77c3996fc-kube-api-access-dlrxt\") pod \"keystone-db-sync-8mhpn\" (UID: \"b50774ab-6273-44c8-ae0c-d6c77c3996fc\") " pod="openstack/keystone-db-sync-8mhpn" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.946896 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bc880e4-546b-4dce-a6eb-39aef7dd9954-operator-scripts\") pod \"neutron-2d70-account-create-update-mt7tw\" (UID: \"6bc880e4-546b-4dce-a6eb-39aef7dd9954\") " pod="openstack/neutron-2d70-account-create-update-mt7tw" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.947471 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a00825d4-1eac-45e1-9eec-bdc12eb450a2-operator-scripts\") pod \"neutron-db-create-gjb7f\" (UID: \"a00825d4-1eac-45e1-9eec-bdc12eb450a2\") " pod="openstack/neutron-db-create-gjb7f" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.950587 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b50774ab-6273-44c8-ae0c-d6c77c3996fc-config-data\") pod \"keystone-db-sync-8mhpn\" (UID: \"b50774ab-6273-44c8-ae0c-d6c77c3996fc\") " pod="openstack/keystone-db-sync-8mhpn" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.951151 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b50774ab-6273-44c8-ae0c-d6c77c3996fc-combined-ca-bundle\") pod \"keystone-db-sync-8mhpn\" (UID: \"b50774ab-6273-44c8-ae0c-d6c77c3996fc\") " pod="openstack/keystone-db-sync-8mhpn" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.968261 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x296n\" (UniqueName: \"kubernetes.io/projected/6bc880e4-546b-4dce-a6eb-39aef7dd9954-kube-api-access-x296n\") pod \"neutron-2d70-account-create-update-mt7tw\" (UID: \"6bc880e4-546b-4dce-a6eb-39aef7dd9954\") " pod="openstack/neutron-2d70-account-create-update-mt7tw" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.969053 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tcsd\" (UniqueName: \"kubernetes.io/projected/a00825d4-1eac-45e1-9eec-bdc12eb450a2-kube-api-access-9tcsd\") pod \"neutron-db-create-gjb7f\" (UID: \"a00825d4-1eac-45e1-9eec-bdc12eb450a2\") " pod="openstack/neutron-db-create-gjb7f" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.970649 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlrxt\" (UniqueName: \"kubernetes.io/projected/b50774ab-6273-44c8-ae0c-d6c77c3996fc-kube-api-access-dlrxt\") pod \"keystone-db-sync-8mhpn\" (UID: \"b50774ab-6273-44c8-ae0c-d6c77c3996fc\") " pod="openstack/keystone-db-sync-8mhpn" Mar 18 12:33:22 crc kubenswrapper[4975]: I0318 12:33:22.992973 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vp6r7" Mar 18 12:33:23 crc kubenswrapper[4975]: I0318 12:33:23.000928 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-d456-account-create-update-htdrw" Mar 18 12:33:23 crc kubenswrapper[4975]: I0318 12:33:23.027854 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9b62c54-4223-4370-bd91-e5a23eb874cf" path="/var/lib/kubelet/pods/f9b62c54-4223-4370-bd91-e5a23eb874cf/volumes" Mar 18 12:33:23 crc kubenswrapper[4975]: I0318 12:33:23.136374 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-gjb7f" Mar 18 12:33:23 crc kubenswrapper[4975]: I0318 12:33:23.149351 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8mhpn" Mar 18 12:33:23 crc kubenswrapper[4975]: I0318 12:33:23.158476 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-gkcrs"] Mar 18 12:33:23 crc kubenswrapper[4975]: I0318 12:33:23.162144 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2d70-account-create-update-mt7tw" Mar 18 12:33:23 crc kubenswrapper[4975]: I0318 12:33:23.232151 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-647b-account-create-update-d6jph"] Mar 18 12:33:23 crc kubenswrapper[4975]: W0318 12:33:23.260376 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72df1add_3fec_4e3b_baad_90ba808298f7.slice/crio-34a25cef535a51ce3c6d229a0eaaeeec731097407030f7b6daf64a1c1028dadb WatchSource:0}: Error finding container 34a25cef535a51ce3c6d229a0eaaeeec731097407030f7b6daf64a1c1028dadb: Status 404 returned error can't find the container with id 34a25cef535a51ce3c6d229a0eaaeeec731097407030f7b6daf64a1c1028dadb Mar 18 12:33:23 crc kubenswrapper[4975]: I0318 12:33:23.473230 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-vp6r7"] Mar 18 12:33:23 crc kubenswrapper[4975]: I0318 12:33:23.599200 4975 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d456-account-create-update-htdrw"] Mar 18 12:33:23 crc kubenswrapper[4975]: I0318 12:33:23.754983 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-2d70-account-create-update-mt7tw"] Mar 18 12:33:23 crc kubenswrapper[4975]: I0318 12:33:23.781288 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-8mhpn"] Mar 18 12:33:23 crc kubenswrapper[4975]: I0318 12:33:23.818478 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-gjb7f"] Mar 18 12:33:23 crc kubenswrapper[4975]: I0318 12:33:23.911597 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vp6r7" event={"ID":"ac2a33aa-f2b3-4da7-9265-cc73a33649d3","Type":"ContainerStarted","Data":"b8333029dfa49866152b1dab95a4570851861422cd3d68a638458e3ac167d482"} Mar 18 12:33:23 crc kubenswrapper[4975]: I0318 12:33:23.918850 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8mhpn" event={"ID":"b50774ab-6273-44c8-ae0c-d6c77c3996fc","Type":"ContainerStarted","Data":"5f5df7397ce755f757e5ebac041983f88532d48dd00ef0de2bf9f62ffa521524"} Mar 18 12:33:23 crc kubenswrapper[4975]: I0318 12:33:23.920557 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-647b-account-create-update-d6jph" event={"ID":"72df1add-3fec-4e3b-baad-90ba808298f7","Type":"ContainerStarted","Data":"72d88262648a062e2023179524c3ef465e7bbbc64312d1ad67b60987e18b1e52"} Mar 18 12:33:23 crc kubenswrapper[4975]: I0318 12:33:23.920614 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-647b-account-create-update-d6jph" event={"ID":"72df1add-3fec-4e3b-baad-90ba808298f7","Type":"ContainerStarted","Data":"34a25cef535a51ce3c6d229a0eaaeeec731097407030f7b6daf64a1c1028dadb"} Mar 18 12:33:23 crc kubenswrapper[4975]: I0318 12:33:23.922076 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-d456-account-create-update-htdrw" event={"ID":"91116f39-0e09-4921-8ee3-aaff9a89b610","Type":"ContainerStarted","Data":"02c0571f4c14ebe2814c852d0fc798117164060560d98537641c6950497a0d49"} Mar 18 12:33:23 crc kubenswrapper[4975]: I0318 12:33:23.930208 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gjb7f" event={"ID":"a00825d4-1eac-45e1-9eec-bdc12eb450a2","Type":"ContainerStarted","Data":"d5e3611aee575ccdab2d276fbe7eaf359d88ba3a93cf16275943255ca827ca08"} Mar 18 12:33:23 crc kubenswrapper[4975]: I0318 12:33:23.932770 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2d70-account-create-update-mt7tw" event={"ID":"6bc880e4-546b-4dce-a6eb-39aef7dd9954","Type":"ContainerStarted","Data":"53270af6e5062f7f5f1d6f9370b3bad142b944558913b8ec60ce4d01d6c99ecd"} Mar 18 12:33:23 crc kubenswrapper[4975]: I0318 12:33:23.936094 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gkcrs" event={"ID":"0c17426b-2e12-4e49-aaee-cb9d0f438552","Type":"ContainerStarted","Data":"92b732168dd7402750871af12b4e19521bfdb46733c09d8f9d4d650652d6cf20"} Mar 18 12:33:23 crc kubenswrapper[4975]: I0318 12:33:23.936125 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gkcrs" event={"ID":"0c17426b-2e12-4e49-aaee-cb9d0f438552","Type":"ContainerStarted","Data":"43b4870f2c81de2c10cb654477546e6cb9a13125dad9cfa8723ac7a595a69fe1"} Mar 18 12:33:23 crc kubenswrapper[4975]: I0318 12:33:23.953853 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-647b-account-create-update-d6jph" podStartSLOduration=1.9538313120000002 podStartE2EDuration="1.953831312s" podCreationTimestamp="2026-03-18 12:33:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:23.943005644 +0000 UTC m=+1389.657406213" 
watchObservedRunningTime="2026-03-18 12:33:23.953831312 +0000 UTC m=+1389.668231891" Mar 18 12:33:23 crc kubenswrapper[4975]: I0318 12:33:23.967225 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-gkcrs" podStartSLOduration=1.9672070499999998 podStartE2EDuration="1.96720705s" podCreationTimestamp="2026-03-18 12:33:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:23.959912339 +0000 UTC m=+1389.674312918" watchObservedRunningTime="2026-03-18 12:33:23.96720705 +0000 UTC m=+1389.681607629" Mar 18 12:33:24 crc kubenswrapper[4975]: I0318 12:33:24.947133 4975 generic.go:334] "Generic (PLEG): container finished" podID="ac2a33aa-f2b3-4da7-9265-cc73a33649d3" containerID="63d25e271b98d1b2df4b559ffb40d90f136c66fc33c380f3b9ab175917efa285" exitCode=0 Mar 18 12:33:24 crc kubenswrapper[4975]: I0318 12:33:24.948194 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vp6r7" event={"ID":"ac2a33aa-f2b3-4da7-9265-cc73a33649d3","Type":"ContainerDied","Data":"63d25e271b98d1b2df4b559ffb40d90f136c66fc33c380f3b9ab175917efa285"} Mar 18 12:33:24 crc kubenswrapper[4975]: I0318 12:33:24.949906 4975 generic.go:334] "Generic (PLEG): container finished" podID="72df1add-3fec-4e3b-baad-90ba808298f7" containerID="72d88262648a062e2023179524c3ef465e7bbbc64312d1ad67b60987e18b1e52" exitCode=0 Mar 18 12:33:24 crc kubenswrapper[4975]: I0318 12:33:24.949985 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-647b-account-create-update-d6jph" event={"ID":"72df1add-3fec-4e3b-baad-90ba808298f7","Type":"ContainerDied","Data":"72d88262648a062e2023179524c3ef465e7bbbc64312d1ad67b60987e18b1e52"} Mar 18 12:33:24 crc kubenswrapper[4975]: I0318 12:33:24.952742 4975 generic.go:334] "Generic (PLEG): container finished" podID="91116f39-0e09-4921-8ee3-aaff9a89b610" 
containerID="826b860e46717b42db2c81c919f5720f2d1e3c5faa1dc267d11352c74a9fb51c" exitCode=0 Mar 18 12:33:24 crc kubenswrapper[4975]: I0318 12:33:24.952807 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d456-account-create-update-htdrw" event={"ID":"91116f39-0e09-4921-8ee3-aaff9a89b610","Type":"ContainerDied","Data":"826b860e46717b42db2c81c919f5720f2d1e3c5faa1dc267d11352c74a9fb51c"} Mar 18 12:33:24 crc kubenswrapper[4975]: I0318 12:33:24.955279 4975 generic.go:334] "Generic (PLEG): container finished" podID="a00825d4-1eac-45e1-9eec-bdc12eb450a2" containerID="134404831f5c1ede754e310da20064f28e8964e6c417b020d3e2cdd14a668ceb" exitCode=0 Mar 18 12:33:24 crc kubenswrapper[4975]: I0318 12:33:24.955323 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gjb7f" event={"ID":"a00825d4-1eac-45e1-9eec-bdc12eb450a2","Type":"ContainerDied","Data":"134404831f5c1ede754e310da20064f28e8964e6c417b020d3e2cdd14a668ceb"} Mar 18 12:33:24 crc kubenswrapper[4975]: I0318 12:33:24.956707 4975 generic.go:334] "Generic (PLEG): container finished" podID="6bc880e4-546b-4dce-a6eb-39aef7dd9954" containerID="fc0f34fee1f1e0cfe92586a01c2acab55226856fa34a477aff135d906daabf98" exitCode=0 Mar 18 12:33:24 crc kubenswrapper[4975]: I0318 12:33:24.956756 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2d70-account-create-update-mt7tw" event={"ID":"6bc880e4-546b-4dce-a6eb-39aef7dd9954","Type":"ContainerDied","Data":"fc0f34fee1f1e0cfe92586a01c2acab55226856fa34a477aff135d906daabf98"} Mar 18 12:33:24 crc kubenswrapper[4975]: I0318 12:33:24.958127 4975 generic.go:334] "Generic (PLEG): container finished" podID="0c17426b-2e12-4e49-aaee-cb9d0f438552" containerID="92b732168dd7402750871af12b4e19521bfdb46733c09d8f9d4d650652d6cf20" exitCode=0 Mar 18 12:33:24 crc kubenswrapper[4975]: I0318 12:33:24.958157 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gkcrs" 
event={"ID":"0c17426b-2e12-4e49-aaee-cb9d0f438552","Type":"ContainerDied","Data":"92b732168dd7402750871af12b4e19521bfdb46733c09d8f9d4d650652d6cf20"} Mar 18 12:33:25 crc kubenswrapper[4975]: I0318 12:33:25.538475 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:33:25 crc kubenswrapper[4975]: I0318 12:33:25.538885 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:33:25 crc kubenswrapper[4975]: I0318 12:33:25.538949 4975 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 12:33:25 crc kubenswrapper[4975]: I0318 12:33:25.539765 4975 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d846cb3e61bc67fa3212660cebeebcacd3a57cbf2e5bcba7bd344d98d42cef45"} pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:33:25 crc kubenswrapper[4975]: I0318 12:33:25.539892 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" containerID="cri-o://d846cb3e61bc67fa3212660cebeebcacd3a57cbf2e5bcba7bd344d98d42cef45" gracePeriod=600 Mar 18 12:33:25 crc kubenswrapper[4975]: I0318 12:33:25.968319 4975 
generic.go:334] "Generic (PLEG): container finished" podID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerID="d846cb3e61bc67fa3212660cebeebcacd3a57cbf2e5bcba7bd344d98d42cef45" exitCode=0 Mar 18 12:33:25 crc kubenswrapper[4975]: I0318 12:33:25.968442 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerDied","Data":"d846cb3e61bc67fa3212660cebeebcacd3a57cbf2e5bcba7bd344d98d42cef45"} Mar 18 12:33:25 crc kubenswrapper[4975]: I0318 12:33:25.968499 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerStarted","Data":"27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff"} Mar 18 12:33:25 crc kubenswrapper[4975]: I0318 12:33:25.968519 4975 scope.go:117] "RemoveContainer" containerID="b3ad49f3300a39909733b143700abc28ad83ea2ad2f5fc6a9b69e95819adb98f" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.378842 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-gjb7f" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.498343 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-647b-account-create-update-d6jph" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.505248 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-d456-account-create-update-htdrw" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.505452 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a00825d4-1eac-45e1-9eec-bdc12eb450a2-operator-scripts\") pod \"a00825d4-1eac-45e1-9eec-bdc12eb450a2\" (UID: \"a00825d4-1eac-45e1-9eec-bdc12eb450a2\") " Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.505624 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tcsd\" (UniqueName: \"kubernetes.io/projected/a00825d4-1eac-45e1-9eec-bdc12eb450a2-kube-api-access-9tcsd\") pod \"a00825d4-1eac-45e1-9eec-bdc12eb450a2\" (UID: \"a00825d4-1eac-45e1-9eec-bdc12eb450a2\") " Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.506441 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a00825d4-1eac-45e1-9eec-bdc12eb450a2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a00825d4-1eac-45e1-9eec-bdc12eb450a2" (UID: "a00825d4-1eac-45e1-9eec-bdc12eb450a2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.513351 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a00825d4-1eac-45e1-9eec-bdc12eb450a2-kube-api-access-9tcsd" (OuterVolumeSpecName: "kube-api-access-9tcsd") pod "a00825d4-1eac-45e1-9eec-bdc12eb450a2" (UID: "a00825d4-1eac-45e1-9eec-bdc12eb450a2"). InnerVolumeSpecName "kube-api-access-9tcsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.515055 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-2d70-account-create-update-mt7tw" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.525560 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gkcrs" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.534154 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vp6r7" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.607236 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm8pn\" (UniqueName: \"kubernetes.io/projected/72df1add-3fec-4e3b-baad-90ba808298f7-kube-api-access-rm8pn\") pod \"72df1add-3fec-4e3b-baad-90ba808298f7\" (UID: \"72df1add-3fec-4e3b-baad-90ba808298f7\") " Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.607287 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxfpf\" (UniqueName: \"kubernetes.io/projected/ac2a33aa-f2b3-4da7-9265-cc73a33649d3-kube-api-access-dxfpf\") pod \"ac2a33aa-f2b3-4da7-9265-cc73a33649d3\" (UID: \"ac2a33aa-f2b3-4da7-9265-cc73a33649d3\") " Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.607317 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72df1add-3fec-4e3b-baad-90ba808298f7-operator-scripts\") pod \"72df1add-3fec-4e3b-baad-90ba808298f7\" (UID: \"72df1add-3fec-4e3b-baad-90ba808298f7\") " Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.607336 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fn5b\" (UniqueName: \"kubernetes.io/projected/91116f39-0e09-4921-8ee3-aaff9a89b610-kube-api-access-6fn5b\") pod \"91116f39-0e09-4921-8ee3-aaff9a89b610\" (UID: \"91116f39-0e09-4921-8ee3-aaff9a89b610\") " Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.607370 4975 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bc880e4-546b-4dce-a6eb-39aef7dd9954-operator-scripts\") pod \"6bc880e4-546b-4dce-a6eb-39aef7dd9954\" (UID: \"6bc880e4-546b-4dce-a6eb-39aef7dd9954\") " Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.607461 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x296n\" (UniqueName: \"kubernetes.io/projected/6bc880e4-546b-4dce-a6eb-39aef7dd9954-kube-api-access-x296n\") pod \"6bc880e4-546b-4dce-a6eb-39aef7dd9954\" (UID: \"6bc880e4-546b-4dce-a6eb-39aef7dd9954\") " Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.607483 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c17426b-2e12-4e49-aaee-cb9d0f438552-operator-scripts\") pod \"0c17426b-2e12-4e49-aaee-cb9d0f438552\" (UID: \"0c17426b-2e12-4e49-aaee-cb9d0f438552\") " Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.607626 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46zjc\" (UniqueName: \"kubernetes.io/projected/0c17426b-2e12-4e49-aaee-cb9d0f438552-kube-api-access-46zjc\") pod \"0c17426b-2e12-4e49-aaee-cb9d0f438552\" (UID: \"0c17426b-2e12-4e49-aaee-cb9d0f438552\") " Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.607659 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91116f39-0e09-4921-8ee3-aaff9a89b610-operator-scripts\") pod \"91116f39-0e09-4921-8ee3-aaff9a89b610\" (UID: \"91116f39-0e09-4921-8ee3-aaff9a89b610\") " Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.607722 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ac2a33aa-f2b3-4da7-9265-cc73a33649d3-operator-scripts\") pod \"ac2a33aa-f2b3-4da7-9265-cc73a33649d3\" (UID: \"ac2a33aa-f2b3-4da7-9265-cc73a33649d3\") " Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.608022 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tcsd\" (UniqueName: \"kubernetes.io/projected/a00825d4-1eac-45e1-9eec-bdc12eb450a2-kube-api-access-9tcsd\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.608033 4975 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a00825d4-1eac-45e1-9eec-bdc12eb450a2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.608453 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bc880e4-546b-4dce-a6eb-39aef7dd9954-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6bc880e4-546b-4dce-a6eb-39aef7dd9954" (UID: "6bc880e4-546b-4dce-a6eb-39aef7dd9954"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.608466 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c17426b-2e12-4e49-aaee-cb9d0f438552-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0c17426b-2e12-4e49-aaee-cb9d0f438552" (UID: "0c17426b-2e12-4e49-aaee-cb9d0f438552"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.608509 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91116f39-0e09-4921-8ee3-aaff9a89b610-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "91116f39-0e09-4921-8ee3-aaff9a89b610" (UID: "91116f39-0e09-4921-8ee3-aaff9a89b610"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.608771 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac2a33aa-f2b3-4da7-9265-cc73a33649d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ac2a33aa-f2b3-4da7-9265-cc73a33649d3" (UID: "ac2a33aa-f2b3-4da7-9265-cc73a33649d3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.608987 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72df1add-3fec-4e3b-baad-90ba808298f7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "72df1add-3fec-4e3b-baad-90ba808298f7" (UID: "72df1add-3fec-4e3b-baad-90ba808298f7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.611442 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91116f39-0e09-4921-8ee3-aaff9a89b610-kube-api-access-6fn5b" (OuterVolumeSpecName: "kube-api-access-6fn5b") pod "91116f39-0e09-4921-8ee3-aaff9a89b610" (UID: "91116f39-0e09-4921-8ee3-aaff9a89b610"). InnerVolumeSpecName "kube-api-access-6fn5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.611493 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac2a33aa-f2b3-4da7-9265-cc73a33649d3-kube-api-access-dxfpf" (OuterVolumeSpecName: "kube-api-access-dxfpf") pod "ac2a33aa-f2b3-4da7-9265-cc73a33649d3" (UID: "ac2a33aa-f2b3-4da7-9265-cc73a33649d3"). InnerVolumeSpecName "kube-api-access-dxfpf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.612064 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bc880e4-546b-4dce-a6eb-39aef7dd9954-kube-api-access-x296n" (OuterVolumeSpecName: "kube-api-access-x296n") pod "6bc880e4-546b-4dce-a6eb-39aef7dd9954" (UID: "6bc880e4-546b-4dce-a6eb-39aef7dd9954"). InnerVolumeSpecName "kube-api-access-x296n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.612817 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72df1add-3fec-4e3b-baad-90ba808298f7-kube-api-access-rm8pn" (OuterVolumeSpecName: "kube-api-access-rm8pn") pod "72df1add-3fec-4e3b-baad-90ba808298f7" (UID: "72df1add-3fec-4e3b-baad-90ba808298f7"). InnerVolumeSpecName "kube-api-access-rm8pn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.615404 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c17426b-2e12-4e49-aaee-cb9d0f438552-kube-api-access-46zjc" (OuterVolumeSpecName: "kube-api-access-46zjc") pod "0c17426b-2e12-4e49-aaee-cb9d0f438552" (UID: "0c17426b-2e12-4e49-aaee-cb9d0f438552"). InnerVolumeSpecName "kube-api-access-46zjc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.709573 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x296n\" (UniqueName: \"kubernetes.io/projected/6bc880e4-546b-4dce-a6eb-39aef7dd9954-kube-api-access-x296n\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.709610 4975 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c17426b-2e12-4e49-aaee-cb9d0f438552-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.709622 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46zjc\" (UniqueName: \"kubernetes.io/projected/0c17426b-2e12-4e49-aaee-cb9d0f438552-kube-api-access-46zjc\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.709634 4975 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91116f39-0e09-4921-8ee3-aaff9a89b610-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.709645 4975 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac2a33aa-f2b3-4da7-9265-cc73a33649d3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.709658 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm8pn\" (UniqueName: \"kubernetes.io/projected/72df1add-3fec-4e3b-baad-90ba808298f7-kube-api-access-rm8pn\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.709668 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxfpf\" (UniqueName: \"kubernetes.io/projected/ac2a33aa-f2b3-4da7-9265-cc73a33649d3-kube-api-access-dxfpf\") on node \"crc\" DevicePath \"\"" 
Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.709680 4975 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72df1add-3fec-4e3b-baad-90ba808298f7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.709690 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fn5b\" (UniqueName: \"kubernetes.io/projected/91116f39-0e09-4921-8ee3-aaff9a89b610-kube-api-access-6fn5b\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.709701 4975 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bc880e4-546b-4dce-a6eb-39aef7dd9954-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.979346 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gjb7f" event={"ID":"a00825d4-1eac-45e1-9eec-bdc12eb450a2","Type":"ContainerDied","Data":"d5e3611aee575ccdab2d276fbe7eaf359d88ba3a93cf16275943255ca827ca08"} Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.979394 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5e3611aee575ccdab2d276fbe7eaf359d88ba3a93cf16275943255ca827ca08" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.979470 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-gjb7f" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.984591 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2d70-account-create-update-mt7tw" event={"ID":"6bc880e4-546b-4dce-a6eb-39aef7dd9954","Type":"ContainerDied","Data":"53270af6e5062f7f5f1d6f9370b3bad142b944558913b8ec60ce4d01d6c99ecd"} Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.984832 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53270af6e5062f7f5f1d6f9370b3bad142b944558913b8ec60ce4d01d6c99ecd" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.985055 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2d70-account-create-update-mt7tw" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.986543 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gkcrs" event={"ID":"0c17426b-2e12-4e49-aaee-cb9d0f438552","Type":"ContainerDied","Data":"43b4870f2c81de2c10cb654477546e6cb9a13125dad9cfa8723ac7a595a69fe1"} Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.986591 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43b4870f2c81de2c10cb654477546e6cb9a13125dad9cfa8723ac7a595a69fe1" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.986682 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-gkcrs" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.988396 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vp6r7" event={"ID":"ac2a33aa-f2b3-4da7-9265-cc73a33649d3","Type":"ContainerDied","Data":"b8333029dfa49866152b1dab95a4570851861422cd3d68a638458e3ac167d482"} Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.988447 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8333029dfa49866152b1dab95a4570851861422cd3d68a638458e3ac167d482" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.988561 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vp6r7" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.994314 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-647b-account-create-update-d6jph" event={"ID":"72df1add-3fec-4e3b-baad-90ba808298f7","Type":"ContainerDied","Data":"34a25cef535a51ce3c6d229a0eaaeeec731097407030f7b6daf64a1c1028dadb"} Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.994367 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34a25cef535a51ce3c6d229a0eaaeeec731097407030f7b6daf64a1c1028dadb" Mar 18 12:33:26 crc kubenswrapper[4975]: I0318 12:33:26.996626 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-647b-account-create-update-d6jph" Mar 18 12:33:27 crc kubenswrapper[4975]: I0318 12:33:27.001335 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d456-account-create-update-htdrw" event={"ID":"91116f39-0e09-4921-8ee3-aaff9a89b610","Type":"ContainerDied","Data":"02c0571f4c14ebe2814c852d0fc798117164060560d98537641c6950497a0d49"} Mar 18 12:33:27 crc kubenswrapper[4975]: I0318 12:33:27.001428 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02c0571f4c14ebe2814c852d0fc798117164060560d98537641c6950497a0d49" Mar 18 12:33:27 crc kubenswrapper[4975]: I0318 12:33:27.001425 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d456-account-create-update-htdrw" Mar 18 12:33:29 crc kubenswrapper[4975]: I0318 12:33:29.605041 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" Mar 18 12:33:29 crc kubenswrapper[4975]: I0318 12:33:29.669486 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-c56q4"] Mar 18 12:33:29 crc kubenswrapper[4975]: I0318 12:33:29.669833 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-c56q4" podUID="337ec784-956c-49f3-a71b-93bdd815f447" containerName="dnsmasq-dns" containerID="cri-o://2dbd35310d405c295867572cf6ccec20f0019c8163a95de9a70ccdaebb7ab74f" gracePeriod=10 Mar 18 12:33:30 crc kubenswrapper[4975]: I0318 12:33:30.024765 4975 generic.go:334] "Generic (PLEG): container finished" podID="337ec784-956c-49f3-a71b-93bdd815f447" containerID="2dbd35310d405c295867572cf6ccec20f0019c8163a95de9a70ccdaebb7ab74f" exitCode=0 Mar 18 12:33:30 crc kubenswrapper[4975]: I0318 12:33:30.024817 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-c56q4" 
event={"ID":"337ec784-956c-49f3-a71b-93bdd815f447","Type":"ContainerDied","Data":"2dbd35310d405c295867572cf6ccec20f0019c8163a95de9a70ccdaebb7ab74f"} Mar 18 12:33:31 crc kubenswrapper[4975]: I0318 12:33:31.328611 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-c56q4" podUID="337ec784-956c-49f3-a71b-93bdd815f447" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Mar 18 12:33:32 crc kubenswrapper[4975]: I0318 12:33:32.139513 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-c56q4" Mar 18 12:33:32 crc kubenswrapper[4975]: I0318 12:33:32.197033 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/337ec784-956c-49f3-a71b-93bdd815f447-ovsdbserver-sb\") pod \"337ec784-956c-49f3-a71b-93bdd815f447\" (UID: \"337ec784-956c-49f3-a71b-93bdd815f447\") " Mar 18 12:33:32 crc kubenswrapper[4975]: I0318 12:33:32.197163 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvspw\" (UniqueName: \"kubernetes.io/projected/337ec784-956c-49f3-a71b-93bdd815f447-kube-api-access-dvspw\") pod \"337ec784-956c-49f3-a71b-93bdd815f447\" (UID: \"337ec784-956c-49f3-a71b-93bdd815f447\") " Mar 18 12:33:32 crc kubenswrapper[4975]: I0318 12:33:32.198090 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/337ec784-956c-49f3-a71b-93bdd815f447-ovsdbserver-nb\") pod \"337ec784-956c-49f3-a71b-93bdd815f447\" (UID: \"337ec784-956c-49f3-a71b-93bdd815f447\") " Mar 18 12:33:32 crc kubenswrapper[4975]: I0318 12:33:32.198190 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/337ec784-956c-49f3-a71b-93bdd815f447-config\") pod 
\"337ec784-956c-49f3-a71b-93bdd815f447\" (UID: \"337ec784-956c-49f3-a71b-93bdd815f447\") " Mar 18 12:33:32 crc kubenswrapper[4975]: I0318 12:33:32.198295 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/337ec784-956c-49f3-a71b-93bdd815f447-dns-svc\") pod \"337ec784-956c-49f3-a71b-93bdd815f447\" (UID: \"337ec784-956c-49f3-a71b-93bdd815f447\") " Mar 18 12:33:32 crc kubenswrapper[4975]: I0318 12:33:32.202536 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/337ec784-956c-49f3-a71b-93bdd815f447-kube-api-access-dvspw" (OuterVolumeSpecName: "kube-api-access-dvspw") pod "337ec784-956c-49f3-a71b-93bdd815f447" (UID: "337ec784-956c-49f3-a71b-93bdd815f447"). InnerVolumeSpecName "kube-api-access-dvspw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:32 crc kubenswrapper[4975]: I0318 12:33:32.245468 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/337ec784-956c-49f3-a71b-93bdd815f447-config" (OuterVolumeSpecName: "config") pod "337ec784-956c-49f3-a71b-93bdd815f447" (UID: "337ec784-956c-49f3-a71b-93bdd815f447"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:32 crc kubenswrapper[4975]: I0318 12:33:32.247578 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/337ec784-956c-49f3-a71b-93bdd815f447-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "337ec784-956c-49f3-a71b-93bdd815f447" (UID: "337ec784-956c-49f3-a71b-93bdd815f447"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:32 crc kubenswrapper[4975]: I0318 12:33:32.248373 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/337ec784-956c-49f3-a71b-93bdd815f447-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "337ec784-956c-49f3-a71b-93bdd815f447" (UID: "337ec784-956c-49f3-a71b-93bdd815f447"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:32 crc kubenswrapper[4975]: I0318 12:33:32.254636 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/337ec784-956c-49f3-a71b-93bdd815f447-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "337ec784-956c-49f3-a71b-93bdd815f447" (UID: "337ec784-956c-49f3-a71b-93bdd815f447"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:32 crc kubenswrapper[4975]: I0318 12:33:32.300109 4975 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/337ec784-956c-49f3-a71b-93bdd815f447-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:32 crc kubenswrapper[4975]: I0318 12:33:32.300288 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvspw\" (UniqueName: \"kubernetes.io/projected/337ec784-956c-49f3-a71b-93bdd815f447-kube-api-access-dvspw\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:32 crc kubenswrapper[4975]: I0318 12:33:32.300300 4975 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/337ec784-956c-49f3-a71b-93bdd815f447-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:32 crc kubenswrapper[4975]: I0318 12:33:32.300310 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/337ec784-956c-49f3-a71b-93bdd815f447-config\") on node \"crc\" DevicePath \"\"" Mar 18 
12:33:32 crc kubenswrapper[4975]: I0318 12:33:32.300319 4975 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/337ec784-956c-49f3-a71b-93bdd815f447-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:33 crc kubenswrapper[4975]: I0318 12:33:33.049539 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8mhpn" event={"ID":"b50774ab-6273-44c8-ae0c-d6c77c3996fc","Type":"ContainerStarted","Data":"afaf64061f40eb306f05a4ce6035eb6909d7d8ac9907525794e29641facdff88"} Mar 18 12:33:33 crc kubenswrapper[4975]: I0318 12:33:33.052695 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-c56q4" event={"ID":"337ec784-956c-49f3-a71b-93bdd815f447","Type":"ContainerDied","Data":"7e1bfd11542b73478044fdc17964d40bbadd30900e3c372c7a520e8168853c8e"} Mar 18 12:33:33 crc kubenswrapper[4975]: I0318 12:33:33.052739 4975 scope.go:117] "RemoveContainer" containerID="2dbd35310d405c295867572cf6ccec20f0019c8163a95de9a70ccdaebb7ab74f" Mar 18 12:33:33 crc kubenswrapper[4975]: I0318 12:33:33.052844 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-c56q4" Mar 18 12:33:33 crc kubenswrapper[4975]: I0318 12:33:33.072334 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-8mhpn" podStartSLOduration=3.002285188 podStartE2EDuration="11.072315423s" podCreationTimestamp="2026-03-18 12:33:22 +0000 UTC" firstStartedPulling="2026-03-18 12:33:23.818623278 +0000 UTC m=+1389.533023857" lastFinishedPulling="2026-03-18 12:33:31.888653523 +0000 UTC m=+1397.603054092" observedRunningTime="2026-03-18 12:33:33.068363934 +0000 UTC m=+1398.782764513" watchObservedRunningTime="2026-03-18 12:33:33.072315423 +0000 UTC m=+1398.786716002" Mar 18 12:33:33 crc kubenswrapper[4975]: I0318 12:33:33.078065 4975 scope.go:117] "RemoveContainer" containerID="1717431bdef2e1ed9d2e923c72aac56021c20bbab74d29fb46a0a02ed944a7ca" Mar 18 12:33:33 crc kubenswrapper[4975]: I0318 12:33:33.098785 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-c56q4"] Mar 18 12:33:33 crc kubenswrapper[4975]: I0318 12:33:33.105766 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-c56q4"] Mar 18 12:33:35 crc kubenswrapper[4975]: I0318 12:33:35.026018 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="337ec784-956c-49f3-a71b-93bdd815f447" path="/var/lib/kubelet/pods/337ec784-956c-49f3-a71b-93bdd815f447/volumes" Mar 18 12:33:35 crc kubenswrapper[4975]: I0318 12:33:35.070720 4975 generic.go:334] "Generic (PLEG): container finished" podID="b50774ab-6273-44c8-ae0c-d6c77c3996fc" containerID="afaf64061f40eb306f05a4ce6035eb6909d7d8ac9907525794e29641facdff88" exitCode=0 Mar 18 12:33:35 crc kubenswrapper[4975]: I0318 12:33:35.070776 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8mhpn" 
event={"ID":"b50774ab-6273-44c8-ae0c-d6c77c3996fc","Type":"ContainerDied","Data":"afaf64061f40eb306f05a4ce6035eb6909d7d8ac9907525794e29641facdff88"} Mar 18 12:33:36 crc kubenswrapper[4975]: I0318 12:33:36.431184 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8mhpn" Mar 18 12:33:36 crc kubenswrapper[4975]: I0318 12:33:36.470663 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b50774ab-6273-44c8-ae0c-d6c77c3996fc-config-data\") pod \"b50774ab-6273-44c8-ae0c-d6c77c3996fc\" (UID: \"b50774ab-6273-44c8-ae0c-d6c77c3996fc\") " Mar 18 12:33:36 crc kubenswrapper[4975]: I0318 12:33:36.470711 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b50774ab-6273-44c8-ae0c-d6c77c3996fc-combined-ca-bundle\") pod \"b50774ab-6273-44c8-ae0c-d6c77c3996fc\" (UID: \"b50774ab-6273-44c8-ae0c-d6c77c3996fc\") " Mar 18 12:33:36 crc kubenswrapper[4975]: I0318 12:33:36.470758 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlrxt\" (UniqueName: \"kubernetes.io/projected/b50774ab-6273-44c8-ae0c-d6c77c3996fc-kube-api-access-dlrxt\") pod \"b50774ab-6273-44c8-ae0c-d6c77c3996fc\" (UID: \"b50774ab-6273-44c8-ae0c-d6c77c3996fc\") " Mar 18 12:33:36 crc kubenswrapper[4975]: I0318 12:33:36.478311 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b50774ab-6273-44c8-ae0c-d6c77c3996fc-kube-api-access-dlrxt" (OuterVolumeSpecName: "kube-api-access-dlrxt") pod "b50774ab-6273-44c8-ae0c-d6c77c3996fc" (UID: "b50774ab-6273-44c8-ae0c-d6c77c3996fc"). InnerVolumeSpecName "kube-api-access-dlrxt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:36 crc kubenswrapper[4975]: I0318 12:33:36.498813 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b50774ab-6273-44c8-ae0c-d6c77c3996fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b50774ab-6273-44c8-ae0c-d6c77c3996fc" (UID: "b50774ab-6273-44c8-ae0c-d6c77c3996fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:36 crc kubenswrapper[4975]: I0318 12:33:36.516197 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b50774ab-6273-44c8-ae0c-d6c77c3996fc-config-data" (OuterVolumeSpecName: "config-data") pod "b50774ab-6273-44c8-ae0c-d6c77c3996fc" (UID: "b50774ab-6273-44c8-ae0c-d6c77c3996fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:36 crc kubenswrapper[4975]: I0318 12:33:36.573630 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b50774ab-6273-44c8-ae0c-d6c77c3996fc-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:36 crc kubenswrapper[4975]: I0318 12:33:36.573678 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b50774ab-6273-44c8-ae0c-d6c77c3996fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:36 crc kubenswrapper[4975]: I0318 12:33:36.573690 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlrxt\" (UniqueName: \"kubernetes.io/projected/b50774ab-6273-44c8-ae0c-d6c77c3996fc-kube-api-access-dlrxt\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.092088 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8mhpn" 
event={"ID":"b50774ab-6273-44c8-ae0c-d6c77c3996fc","Type":"ContainerDied","Data":"5f5df7397ce755f757e5ebac041983f88532d48dd00ef0de2bf9f62ffa521524"} Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.092129 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f5df7397ce755f757e5ebac041983f88532d48dd00ef0de2bf9f62ffa521524" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.092196 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8mhpn" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.359398 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-r4j6c"] Mar 18 12:33:37 crc kubenswrapper[4975]: E0318 12:33:37.359754 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72df1add-3fec-4e3b-baad-90ba808298f7" containerName="mariadb-account-create-update" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.359777 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="72df1add-3fec-4e3b-baad-90ba808298f7" containerName="mariadb-account-create-update" Mar 18 12:33:37 crc kubenswrapper[4975]: E0318 12:33:37.359787 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc880e4-546b-4dce-a6eb-39aef7dd9954" containerName="mariadb-account-create-update" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.359795 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc880e4-546b-4dce-a6eb-39aef7dd9954" containerName="mariadb-account-create-update" Mar 18 12:33:37 crc kubenswrapper[4975]: E0318 12:33:37.359808 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b50774ab-6273-44c8-ae0c-d6c77c3996fc" containerName="keystone-db-sync" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.359818 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="b50774ab-6273-44c8-ae0c-d6c77c3996fc" containerName="keystone-db-sync" Mar 18 12:33:37 crc 
kubenswrapper[4975]: E0318 12:33:37.359834 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac2a33aa-f2b3-4da7-9265-cc73a33649d3" containerName="mariadb-database-create" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.359841 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac2a33aa-f2b3-4da7-9265-cc73a33649d3" containerName="mariadb-database-create" Mar 18 12:33:37 crc kubenswrapper[4975]: E0318 12:33:37.359859 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="337ec784-956c-49f3-a71b-93bdd815f447" containerName="init" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.359886 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="337ec784-956c-49f3-a71b-93bdd815f447" containerName="init" Mar 18 12:33:37 crc kubenswrapper[4975]: E0318 12:33:37.359903 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91116f39-0e09-4921-8ee3-aaff9a89b610" containerName="mariadb-account-create-update" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.359912 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="91116f39-0e09-4921-8ee3-aaff9a89b610" containerName="mariadb-account-create-update" Mar 18 12:33:37 crc kubenswrapper[4975]: E0318 12:33:37.359921 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="337ec784-956c-49f3-a71b-93bdd815f447" containerName="dnsmasq-dns" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.359927 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="337ec784-956c-49f3-a71b-93bdd815f447" containerName="dnsmasq-dns" Mar 18 12:33:37 crc kubenswrapper[4975]: E0318 12:33:37.359942 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c17426b-2e12-4e49-aaee-cb9d0f438552" containerName="mariadb-database-create" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.359947 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c17426b-2e12-4e49-aaee-cb9d0f438552" containerName="mariadb-database-create" Mar 
18 12:33:37 crc kubenswrapper[4975]: E0318 12:33:37.359967 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a00825d4-1eac-45e1-9eec-bdc12eb450a2" containerName="mariadb-database-create" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.359973 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="a00825d4-1eac-45e1-9eec-bdc12eb450a2" containerName="mariadb-database-create" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.360120 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc880e4-546b-4dce-a6eb-39aef7dd9954" containerName="mariadb-account-create-update" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.360133 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac2a33aa-f2b3-4da7-9265-cc73a33649d3" containerName="mariadb-database-create" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.360145 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="a00825d4-1eac-45e1-9eec-bdc12eb450a2" containerName="mariadb-database-create" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.360151 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="b50774ab-6273-44c8-ae0c-d6c77c3996fc" containerName="keystone-db-sync" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.360161 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="337ec784-956c-49f3-a71b-93bdd815f447" containerName="dnsmasq-dns" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.360173 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="91116f39-0e09-4921-8ee3-aaff9a89b610" containerName="mariadb-account-create-update" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.360182 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="72df1add-3fec-4e3b-baad-90ba808298f7" containerName="mariadb-account-create-update" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.360190 4975 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="0c17426b-2e12-4e49-aaee-cb9d0f438552" containerName="mariadb-database-create" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.363233 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-r4j6c" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.368656 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-n9cbk"] Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.369714 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-n9cbk" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.381931 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.382110 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.382214 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-d6c9r" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.382325 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.382525 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.402735 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-r4j6c"] Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.409343 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-n9cbk"] Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.488426 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-fernet-keys\") pod \"keystone-bootstrap-n9cbk\" (UID: \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\") " pod="openstack/keystone-bootstrap-n9cbk" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.488472 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-credential-keys\") pod \"keystone-bootstrap-n9cbk\" (UID: \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\") " pod="openstack/keystone-bootstrap-n9cbk" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.488507 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ws5x\" (UniqueName: \"kubernetes.io/projected/3b31a5d0-e54f-48c1-90a1-c83e32592b42-kube-api-access-6ws5x\") pod \"dnsmasq-dns-847c4cc679-r4j6c\" (UID: \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\") " pod="openstack/dnsmasq-dns-847c4cc679-r4j6c" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.488552 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-r4j6c\" (UID: \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\") " pod="openstack/dnsmasq-dns-847c4cc679-r4j6c" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.488663 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-config-data\") pod \"keystone-bootstrap-n9cbk\" (UID: \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\") " pod="openstack/keystone-bootstrap-n9cbk" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.488701 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-r4j6c\" (UID: \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\") " pod="openstack/dnsmasq-dns-847c4cc679-r4j6c" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.488797 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-r4j6c\" (UID: \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\") " pod="openstack/dnsmasq-dns-847c4cc679-r4j6c" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.488849 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-dns-svc\") pod \"dnsmasq-dns-847c4cc679-r4j6c\" (UID: \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\") " pod="openstack/dnsmasq-dns-847c4cc679-r4j6c" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.488915 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-config\") pod \"dnsmasq-dns-847c4cc679-r4j6c\" (UID: \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\") " pod="openstack/dnsmasq-dns-847c4cc679-r4j6c" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.488998 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9qt5\" (UniqueName: \"kubernetes.io/projected/142bd2ba-bbcb-4e91-9365-670839ea9f5a-kube-api-access-c9qt5\") pod \"keystone-bootstrap-n9cbk\" (UID: \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\") " pod="openstack/keystone-bootstrap-n9cbk" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.489044 4975 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-scripts\") pod \"keystone-bootstrap-n9cbk\" (UID: \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\") " pod="openstack/keystone-bootstrap-n9cbk" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.489071 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-combined-ca-bundle\") pod \"keystone-bootstrap-n9cbk\" (UID: \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\") " pod="openstack/keystone-bootstrap-n9cbk" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.580955 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8f5b85cf7-qxcrm"] Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.582707 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8f5b85cf7-qxcrm" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.583691 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-m6lmd"] Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.584777 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-m6lmd" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.585385 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.585729 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.596434 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-config-data\") pod \"keystone-bootstrap-n9cbk\" (UID: \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\") " pod="openstack/keystone-bootstrap-n9cbk" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.596483 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-r4j6c\" (UID: \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\") " pod="openstack/dnsmasq-dns-847c4cc679-r4j6c" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.596518 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-r4j6c\" (UID: \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\") " pod="openstack/dnsmasq-dns-847c4cc679-r4j6c" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.596537 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-dns-svc\") pod \"dnsmasq-dns-847c4cc679-r4j6c\" (UID: \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\") " pod="openstack/dnsmasq-dns-847c4cc679-r4j6c" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.596559 4975 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-config\") pod \"dnsmasq-dns-847c4cc679-r4j6c\" (UID: \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\") " pod="openstack/dnsmasq-dns-847c4cc679-r4j6c" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.596588 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9qt5\" (UniqueName: \"kubernetes.io/projected/142bd2ba-bbcb-4e91-9365-670839ea9f5a-kube-api-access-c9qt5\") pod \"keystone-bootstrap-n9cbk\" (UID: \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\") " pod="openstack/keystone-bootstrap-n9cbk" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.596609 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-scripts\") pod \"keystone-bootstrap-n9cbk\" (UID: \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\") " pod="openstack/keystone-bootstrap-n9cbk" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.596627 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-combined-ca-bundle\") pod \"keystone-bootstrap-n9cbk\" (UID: \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\") " pod="openstack/keystone-bootstrap-n9cbk" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.596657 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-fernet-keys\") pod \"keystone-bootstrap-n9cbk\" (UID: \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\") " pod="openstack/keystone-bootstrap-n9cbk" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.596679 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-credential-keys\") pod \"keystone-bootstrap-n9cbk\" (UID: \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\") " pod="openstack/keystone-bootstrap-n9cbk" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.596704 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ws5x\" (UniqueName: \"kubernetes.io/projected/3b31a5d0-e54f-48c1-90a1-c83e32592b42-kube-api-access-6ws5x\") pod \"dnsmasq-dns-847c4cc679-r4j6c\" (UID: \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\") " pod="openstack/dnsmasq-dns-847c4cc679-r4j6c" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.596745 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-r4j6c\" (UID: \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\") " pod="openstack/dnsmasq-dns-847c4cc679-r4j6c" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.597604 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-r4j6c\" (UID: \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\") " pod="openstack/dnsmasq-dns-847c4cc679-r4j6c" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.598753 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.598925 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-n25s7" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.599141 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.599475 4975 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cinder-scripts" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.599760 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-hwdsz" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.600321 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-r4j6c\" (UID: \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\") " pod="openstack/dnsmasq-dns-847c4cc679-r4j6c" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.603528 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-fernet-keys\") pod \"keystone-bootstrap-n9cbk\" (UID: \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\") " pod="openstack/keystone-bootstrap-n9cbk" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.611944 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-config\") pod \"dnsmasq-dns-847c4cc679-r4j6c\" (UID: \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\") " pod="openstack/dnsmasq-dns-847c4cc679-r4j6c" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.612389 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-credential-keys\") pod \"keystone-bootstrap-n9cbk\" (UID: \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\") " pod="openstack/keystone-bootstrap-n9cbk" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.612529 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-r4j6c\" (UID: 
\"3b31a5d0-e54f-48c1-90a1-c83e32592b42\") " pod="openstack/dnsmasq-dns-847c4cc679-r4j6c" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.624635 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-scripts\") pod \"keystone-bootstrap-n9cbk\" (UID: \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\") " pod="openstack/keystone-bootstrap-n9cbk" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.624765 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-dns-svc\") pod \"dnsmasq-dns-847c4cc679-r4j6c\" (UID: \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\") " pod="openstack/dnsmasq-dns-847c4cc679-r4j6c" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.633844 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-combined-ca-bundle\") pod \"keystone-bootstrap-n9cbk\" (UID: \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\") " pod="openstack/keystone-bootstrap-n9cbk" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.653528 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-config-data\") pod \"keystone-bootstrap-n9cbk\" (UID: \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\") " pod="openstack/keystone-bootstrap-n9cbk" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.665824 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ws5x\" (UniqueName: \"kubernetes.io/projected/3b31a5d0-e54f-48c1-90a1-c83e32592b42-kube-api-access-6ws5x\") pod \"dnsmasq-dns-847c4cc679-r4j6c\" (UID: \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\") " pod="openstack/dnsmasq-dns-847c4cc679-r4j6c" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 
12:33:37.673340 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9qt5\" (UniqueName: \"kubernetes.io/projected/142bd2ba-bbcb-4e91-9365-670839ea9f5a-kube-api-access-c9qt5\") pod \"keystone-bootstrap-n9cbk\" (UID: \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\") " pod="openstack/keystone-bootstrap-n9cbk" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.681938 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8f5b85cf7-qxcrm"] Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.698209 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-scripts\") pod \"cinder-db-sync-m6lmd\" (UID: \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\") " pod="openstack/cinder-db-sync-m6lmd" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.698274 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-db-sync-config-data\") pod \"cinder-db-sync-m6lmd\" (UID: \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\") " pod="openstack/cinder-db-sync-m6lmd" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.698321 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-scripts\") pod \"horizon-8f5b85cf7-qxcrm\" (UID: \"0e23cd99-78eb-4ab1-bfae-f436a8db17a6\") " pod="openstack/horizon-8f5b85cf7-qxcrm" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.698341 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-config-data\") pod \"horizon-8f5b85cf7-qxcrm\" (UID: 
\"0e23cd99-78eb-4ab1-bfae-f436a8db17a6\") " pod="openstack/horizon-8f5b85cf7-qxcrm" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.698359 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-config-data\") pod \"cinder-db-sync-m6lmd\" (UID: \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\") " pod="openstack/cinder-db-sync-m6lmd" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.698383 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-etc-machine-id\") pod \"cinder-db-sync-m6lmd\" (UID: \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\") " pod="openstack/cinder-db-sync-m6lmd" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.698397 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64g7s\" (UniqueName: \"kubernetes.io/projected/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-kube-api-access-64g7s\") pod \"cinder-db-sync-m6lmd\" (UID: \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\") " pod="openstack/cinder-db-sync-m6lmd" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.698414 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-logs\") pod \"horizon-8f5b85cf7-qxcrm\" (UID: \"0e23cd99-78eb-4ab1-bfae-f436a8db17a6\") " pod="openstack/horizon-8f5b85cf7-qxcrm" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.698459 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-combined-ca-bundle\") pod \"cinder-db-sync-m6lmd\" (UID: 
\"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\") " pod="openstack/cinder-db-sync-m6lmd" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.698486 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdckf\" (UniqueName: \"kubernetes.io/projected/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-kube-api-access-qdckf\") pod \"horizon-8f5b85cf7-qxcrm\" (UID: \"0e23cd99-78eb-4ab1-bfae-f436a8db17a6\") " pod="openstack/horizon-8f5b85cf7-qxcrm" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.698504 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-horizon-secret-key\") pod \"horizon-8f5b85cf7-qxcrm\" (UID: \"0e23cd99-78eb-4ab1-bfae-f436a8db17a6\") " pod="openstack/horizon-8f5b85cf7-qxcrm" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.698718 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-r4j6c" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.721663 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-n9cbk" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.725237 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-m6lmd"] Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.799719 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-combined-ca-bundle\") pod \"cinder-db-sync-m6lmd\" (UID: \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\") " pod="openstack/cinder-db-sync-m6lmd" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.799784 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdckf\" (UniqueName: \"kubernetes.io/projected/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-kube-api-access-qdckf\") pod \"horizon-8f5b85cf7-qxcrm\" (UID: \"0e23cd99-78eb-4ab1-bfae-f436a8db17a6\") " pod="openstack/horizon-8f5b85cf7-qxcrm" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.799804 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-horizon-secret-key\") pod \"horizon-8f5b85cf7-qxcrm\" (UID: \"0e23cd99-78eb-4ab1-bfae-f436a8db17a6\") " pod="openstack/horizon-8f5b85cf7-qxcrm" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.799853 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-scripts\") pod \"cinder-db-sync-m6lmd\" (UID: \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\") " pod="openstack/cinder-db-sync-m6lmd" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.799893 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-db-sync-config-data\") 
pod \"cinder-db-sync-m6lmd\" (UID: \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\") " pod="openstack/cinder-db-sync-m6lmd" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.799929 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-scripts\") pod \"horizon-8f5b85cf7-qxcrm\" (UID: \"0e23cd99-78eb-4ab1-bfae-f436a8db17a6\") " pod="openstack/horizon-8f5b85cf7-qxcrm" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.799945 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-config-data\") pod \"horizon-8f5b85cf7-qxcrm\" (UID: \"0e23cd99-78eb-4ab1-bfae-f436a8db17a6\") " pod="openstack/horizon-8f5b85cf7-qxcrm" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.799967 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-config-data\") pod \"cinder-db-sync-m6lmd\" (UID: \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\") " pod="openstack/cinder-db-sync-m6lmd" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.799999 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-etc-machine-id\") pod \"cinder-db-sync-m6lmd\" (UID: \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\") " pod="openstack/cinder-db-sync-m6lmd" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.800017 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64g7s\" (UniqueName: \"kubernetes.io/projected/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-kube-api-access-64g7s\") pod \"cinder-db-sync-m6lmd\" (UID: \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\") " pod="openstack/cinder-db-sync-m6lmd" Mar 18 12:33:37 crc 
kubenswrapper[4975]: I0318 12:33:37.800032 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-logs\") pod \"horizon-8f5b85cf7-qxcrm\" (UID: \"0e23cd99-78eb-4ab1-bfae-f436a8db17a6\") " pod="openstack/horizon-8f5b85cf7-qxcrm" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.800425 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-logs\") pod \"horizon-8f5b85cf7-qxcrm\" (UID: \"0e23cd99-78eb-4ab1-bfae-f436a8db17a6\") " pod="openstack/horizon-8f5b85cf7-qxcrm" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.805086 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-combined-ca-bundle\") pod \"cinder-db-sync-m6lmd\" (UID: \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\") " pod="openstack/cinder-db-sync-m6lmd" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.805649 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-scripts\") pod \"horizon-8f5b85cf7-qxcrm\" (UID: \"0e23cd99-78eb-4ab1-bfae-f436a8db17a6\") " pod="openstack/horizon-8f5b85cf7-qxcrm" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.806228 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-config-data\") pod \"horizon-8f5b85cf7-qxcrm\" (UID: \"0e23cd99-78eb-4ab1-bfae-f436a8db17a6\") " pod="openstack/horizon-8f5b85cf7-qxcrm" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.810039 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.812613 4975 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-etc-machine-id\") pod \"cinder-db-sync-m6lmd\" (UID: \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\") " pod="openstack/cinder-db-sync-m6lmd" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.812627 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.815489 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.816987 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.817567 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-db-sync-config-data\") pod \"cinder-db-sync-m6lmd\" (UID: \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\") " pod="openstack/cinder-db-sync-m6lmd" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.818923 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-config-data\") pod \"cinder-db-sync-m6lmd\" (UID: \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\") " pod="openstack/cinder-db-sync-m6lmd" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.830062 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-horizon-secret-key\") pod \"horizon-8f5b85cf7-qxcrm\" (UID: \"0e23cd99-78eb-4ab1-bfae-f436a8db17a6\") " pod="openstack/horizon-8f5b85cf7-qxcrm" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.840811 4975 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qdckf\" (UniqueName: \"kubernetes.io/projected/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-kube-api-access-qdckf\") pod \"horizon-8f5b85cf7-qxcrm\" (UID: \"0e23cd99-78eb-4ab1-bfae-f436a8db17a6\") " pod="openstack/horizon-8f5b85cf7-qxcrm" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.848450 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-scripts\") pod \"cinder-db-sync-m6lmd\" (UID: \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\") " pod="openstack/cinder-db-sync-m6lmd" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.867613 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64g7s\" (UniqueName: \"kubernetes.io/projected/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-kube-api-access-64g7s\") pod \"cinder-db-sync-m6lmd\" (UID: \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\") " pod="openstack/cinder-db-sync-m6lmd" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.890667 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-9bbxb"] Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.895718 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-9bbxb" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.902886 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-run-httpd\") pod \"ceilometer-0\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " pod="openstack/ceilometer-0" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.903234 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-log-httpd\") pod \"ceilometer-0\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " pod="openstack/ceilometer-0" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.903374 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhb2m\" (UniqueName: \"kubernetes.io/projected/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-kube-api-access-nhb2m\") pod \"ceilometer-0\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " pod="openstack/ceilometer-0" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.903480 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-scripts\") pod \"ceilometer-0\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " pod="openstack/ceilometer-0" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.903633 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " pod="openstack/ceilometer-0" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.903793 4975 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-config-data\") pod \"ceilometer-0\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " pod="openstack/ceilometer-0" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.903916 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " pod="openstack/ceilometer-0" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.907467 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.907696 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.907835 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-sstlq" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.937373 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.949971 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-9bbxb"] Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.977777 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-hwtrk"] Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.985486 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-hwtrk" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.990765 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.990960 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-rkhlv" Mar 18 12:33:37 crc kubenswrapper[4975]: I0318 12:33:37.991165 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.002219 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-hwtrk"] Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.009597 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-scripts\") pod \"ceilometer-0\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " pod="openstack/ceilometer-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.009650 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2614112-7379-4588-a6dd-1cec6e3d96b4-config-data\") pod \"placement-db-sync-hwtrk\" (UID: \"c2614112-7379-4588-a6dd-1cec6e3d96b4\") " pod="openstack/placement-db-sync-hwtrk" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.009720 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " pod="openstack/ceilometer-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.009798 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-config-data\") pod \"ceilometer-0\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " pod="openstack/ceilometer-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.009826 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgtlc\" (UniqueName: \"kubernetes.io/projected/f85a1328-170d-4be7-8115-36d400fd7645-kube-api-access-pgtlc\") pod \"neutron-db-sync-9bbxb\" (UID: \"f85a1328-170d-4be7-8115-36d400fd7645\") " pod="openstack/neutron-db-sync-9bbxb" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.009861 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " pod="openstack/ceilometer-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.009947 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2614112-7379-4588-a6dd-1cec6e3d96b4-combined-ca-bundle\") pod \"placement-db-sync-hwtrk\" (UID: \"c2614112-7379-4588-a6dd-1cec6e3d96b4\") " pod="openstack/placement-db-sync-hwtrk" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.009974 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzjdq\" (UniqueName: \"kubernetes.io/projected/c2614112-7379-4588-a6dd-1cec6e3d96b4-kube-api-access-qzjdq\") pod \"placement-db-sync-hwtrk\" (UID: \"c2614112-7379-4588-a6dd-1cec6e3d96b4\") " pod="openstack/placement-db-sync-hwtrk" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.009998 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-run-httpd\") pod \"ceilometer-0\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " pod="openstack/ceilometer-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.010022 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2614112-7379-4588-a6dd-1cec6e3d96b4-scripts\") pod \"placement-db-sync-hwtrk\" (UID: \"c2614112-7379-4588-a6dd-1cec6e3d96b4\") " pod="openstack/placement-db-sync-hwtrk" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.010050 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85a1328-170d-4be7-8115-36d400fd7645-combined-ca-bundle\") pod \"neutron-db-sync-9bbxb\" (UID: \"f85a1328-170d-4be7-8115-36d400fd7645\") " pod="openstack/neutron-db-sync-9bbxb" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.010113 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2614112-7379-4588-a6dd-1cec6e3d96b4-logs\") pod \"placement-db-sync-hwtrk\" (UID: \"c2614112-7379-4588-a6dd-1cec6e3d96b4\") " pod="openstack/placement-db-sync-hwtrk" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.010156 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-log-httpd\") pod \"ceilometer-0\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " pod="openstack/ceilometer-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.010189 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f85a1328-170d-4be7-8115-36d400fd7645-config\") pod \"neutron-db-sync-9bbxb\" (UID: 
\"f85a1328-170d-4be7-8115-36d400fd7645\") " pod="openstack/neutron-db-sync-9bbxb" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.010226 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhb2m\" (UniqueName: \"kubernetes.io/projected/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-kube-api-access-nhb2m\") pod \"ceilometer-0\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " pod="openstack/ceilometer-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.019180 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " pod="openstack/ceilometer-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.021811 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-scripts\") pod \"ceilometer-0\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " pod="openstack/ceilometer-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.022903 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-r4j6c"] Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.025983 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-log-httpd\") pod \"ceilometer-0\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " pod="openstack/ceilometer-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.028412 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-run-httpd\") pod \"ceilometer-0\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " pod="openstack/ceilometer-0" Mar 18 12:33:38 crc 
kubenswrapper[4975]: I0318 12:33:38.032757 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-config-data\") pod \"ceilometer-0\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " pod="openstack/ceilometer-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.034462 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhb2m\" (UniqueName: \"kubernetes.io/projected/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-kube-api-access-nhb2m\") pod \"ceilometer-0\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " pod="openstack/ceilometer-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.034522 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8f5b85cf7-qxcrm" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.035158 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " pod="openstack/ceilometer-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.038318 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-psw5k"] Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.039450 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-psw5k" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.056792 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-pjgrx" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.070494 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.094522 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-f78496f49-fklmw"] Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.096980 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f78496f49-fklmw" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.106082 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-m6lmd" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.112661 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgtlc\" (UniqueName: \"kubernetes.io/projected/f85a1328-170d-4be7-8115-36d400fd7645-kube-api-access-pgtlc\") pod \"neutron-db-sync-9bbxb\" (UID: \"f85a1328-170d-4be7-8115-36d400fd7645\") " pod="openstack/neutron-db-sync-9bbxb" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.112702 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f054ebc-151c-4e89-8242-3837c9bee6b2-combined-ca-bundle\") pod \"barbican-db-sync-psw5k\" (UID: \"4f054ebc-151c-4e89-8242-3837c9bee6b2\") " pod="openstack/barbican-db-sync-psw5k" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.112789 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2614112-7379-4588-a6dd-1cec6e3d96b4-combined-ca-bundle\") pod 
\"placement-db-sync-hwtrk\" (UID: \"c2614112-7379-4588-a6dd-1cec6e3d96b4\") " pod="openstack/placement-db-sync-hwtrk" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.112817 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzjdq\" (UniqueName: \"kubernetes.io/projected/c2614112-7379-4588-a6dd-1cec6e3d96b4-kube-api-access-qzjdq\") pod \"placement-db-sync-hwtrk\" (UID: \"c2614112-7379-4588-a6dd-1cec6e3d96b4\") " pod="openstack/placement-db-sync-hwtrk" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.112839 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2614112-7379-4588-a6dd-1cec6e3d96b4-scripts\") pod \"placement-db-sync-hwtrk\" (UID: \"c2614112-7379-4588-a6dd-1cec6e3d96b4\") " pod="openstack/placement-db-sync-hwtrk" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.112877 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85a1328-170d-4be7-8115-36d400fd7645-combined-ca-bundle\") pod \"neutron-db-sync-9bbxb\" (UID: \"f85a1328-170d-4be7-8115-36d400fd7645\") " pod="openstack/neutron-db-sync-9bbxb" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.112906 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kx7z\" (UniqueName: \"kubernetes.io/projected/06e8f21f-8218-4a64-a302-0f0b8193b9c8-kube-api-access-8kx7z\") pod \"horizon-f78496f49-fklmw\" (UID: \"06e8f21f-8218-4a64-a302-0f0b8193b9c8\") " pod="openstack/horizon-f78496f49-fklmw" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.112957 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2614112-7379-4588-a6dd-1cec6e3d96b4-logs\") pod \"placement-db-sync-hwtrk\" (UID: \"c2614112-7379-4588-a6dd-1cec6e3d96b4\") " 
pod="openstack/placement-db-sync-hwtrk" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.112986 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06e8f21f-8218-4a64-a302-0f0b8193b9c8-logs\") pod \"horizon-f78496f49-fklmw\" (UID: \"06e8f21f-8218-4a64-a302-0f0b8193b9c8\") " pod="openstack/horizon-f78496f49-fklmw" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.113005 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmps4\" (UniqueName: \"kubernetes.io/projected/4f054ebc-151c-4e89-8242-3837c9bee6b2-kube-api-access-cmps4\") pod \"barbican-db-sync-psw5k\" (UID: \"4f054ebc-151c-4e89-8242-3837c9bee6b2\") " pod="openstack/barbican-db-sync-psw5k" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.113039 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f85a1328-170d-4be7-8115-36d400fd7645-config\") pod \"neutron-db-sync-9bbxb\" (UID: \"f85a1328-170d-4be7-8115-36d400fd7645\") " pod="openstack/neutron-db-sync-9bbxb" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.113082 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06e8f21f-8218-4a64-a302-0f0b8193b9c8-scripts\") pod \"horizon-f78496f49-fklmw\" (UID: \"06e8f21f-8218-4a64-a302-0f0b8193b9c8\") " pod="openstack/horizon-f78496f49-fklmw" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.113108 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2614112-7379-4588-a6dd-1cec6e3d96b4-config-data\") pod \"placement-db-sync-hwtrk\" (UID: \"c2614112-7379-4588-a6dd-1cec6e3d96b4\") " pod="openstack/placement-db-sync-hwtrk" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 
12:33:38.113130 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4f054ebc-151c-4e89-8242-3837c9bee6b2-db-sync-config-data\") pod \"barbican-db-sync-psw5k\" (UID: \"4f054ebc-151c-4e89-8242-3837c9bee6b2\") " pod="openstack/barbican-db-sync-psw5k" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.113158 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/06e8f21f-8218-4a64-a302-0f0b8193b9c8-horizon-secret-key\") pod \"horizon-f78496f49-fklmw\" (UID: \"06e8f21f-8218-4a64-a302-0f0b8193b9c8\") " pod="openstack/horizon-f78496f49-fklmw" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.113209 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/06e8f21f-8218-4a64-a302-0f0b8193b9c8-config-data\") pod \"horizon-f78496f49-fklmw\" (UID: \"06e8f21f-8218-4a64-a302-0f0b8193b9c8\") " pod="openstack/horizon-f78496f49-fklmw" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.123046 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2614112-7379-4588-a6dd-1cec6e3d96b4-logs\") pod \"placement-db-sync-hwtrk\" (UID: \"c2614112-7379-4588-a6dd-1cec6e3d96b4\") " pod="openstack/placement-db-sync-hwtrk" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.124517 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2614112-7379-4588-a6dd-1cec6e3d96b4-scripts\") pod \"placement-db-sync-hwtrk\" (UID: \"c2614112-7379-4588-a6dd-1cec6e3d96b4\") " pod="openstack/placement-db-sync-hwtrk" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.126404 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2614112-7379-4588-a6dd-1cec6e3d96b4-combined-ca-bundle\") pod \"placement-db-sync-hwtrk\" (UID: \"c2614112-7379-4588-a6dd-1cec6e3d96b4\") " pod="openstack/placement-db-sync-hwtrk" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.128798 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85a1328-170d-4be7-8115-36d400fd7645-combined-ca-bundle\") pod \"neutron-db-sync-9bbxb\" (UID: \"f85a1328-170d-4be7-8115-36d400fd7645\") " pod="openstack/neutron-db-sync-9bbxb" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.130148 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2614112-7379-4588-a6dd-1cec6e3d96b4-config-data\") pod \"placement-db-sync-hwtrk\" (UID: \"c2614112-7379-4588-a6dd-1cec6e3d96b4\") " pod="openstack/placement-db-sync-hwtrk" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.142182 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-psw5k"] Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.143069 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgtlc\" (UniqueName: \"kubernetes.io/projected/f85a1328-170d-4be7-8115-36d400fd7645-kube-api-access-pgtlc\") pod \"neutron-db-sync-9bbxb\" (UID: \"f85a1328-170d-4be7-8115-36d400fd7645\") " pod="openstack/neutron-db-sync-9bbxb" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.144023 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzjdq\" (UniqueName: \"kubernetes.io/projected/c2614112-7379-4588-a6dd-1cec6e3d96b4-kube-api-access-qzjdq\") pod \"placement-db-sync-hwtrk\" (UID: \"c2614112-7379-4588-a6dd-1cec6e3d96b4\") " pod="openstack/placement-db-sync-hwtrk" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.152606 4975 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f85a1328-170d-4be7-8115-36d400fd7645-config\") pod \"neutron-db-sync-9bbxb\" (UID: \"f85a1328-170d-4be7-8115-36d400fd7645\") " pod="openstack/neutron-db-sync-9bbxb" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.166970 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f78496f49-fklmw"] Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.205752 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-p5nbn"] Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.207424 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-p5nbn"] Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.207533 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.216467 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kx7z\" (UniqueName: \"kubernetes.io/projected/06e8f21f-8218-4a64-a302-0f0b8193b9c8-kube-api-access-8kx7z\") pod \"horizon-f78496f49-fklmw\" (UID: \"06e8f21f-8218-4a64-a302-0f0b8193b9c8\") " pod="openstack/horizon-f78496f49-fklmw" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.216559 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-config\") pod \"dnsmasq-dns-785d8bcb8c-p5nbn\" (UID: \"effc30d8-1f26-461c-8555-86410a3f77ba\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.216583 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-dns-swift-storage-0\") pod 
\"dnsmasq-dns-785d8bcb8c-p5nbn\" (UID: \"effc30d8-1f26-461c-8555-86410a3f77ba\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.216654 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06e8f21f-8218-4a64-a302-0f0b8193b9c8-logs\") pod \"horizon-f78496f49-fklmw\" (UID: \"06e8f21f-8218-4a64-a302-0f0b8193b9c8\") " pod="openstack/horizon-f78496f49-fklmw" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.216672 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmps4\" (UniqueName: \"kubernetes.io/projected/4f054ebc-151c-4e89-8242-3837c9bee6b2-kube-api-access-cmps4\") pod \"barbican-db-sync-psw5k\" (UID: \"4f054ebc-151c-4e89-8242-3837c9bee6b2\") " pod="openstack/barbican-db-sync-psw5k" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.216737 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-p5nbn\" (UID: \"effc30d8-1f26-461c-8555-86410a3f77ba\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.217883 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06e8f21f-8218-4a64-a302-0f0b8193b9c8-scripts\") pod \"horizon-f78496f49-fklmw\" (UID: \"06e8f21f-8218-4a64-a302-0f0b8193b9c8\") " pod="openstack/horizon-f78496f49-fklmw" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.218647 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4f054ebc-151c-4e89-8242-3837c9bee6b2-db-sync-config-data\") pod \"barbican-db-sync-psw5k\" (UID: \"4f054ebc-151c-4e89-8242-3837c9bee6b2\") " 
pod="openstack/barbican-db-sync-psw5k" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.218689 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/06e8f21f-8218-4a64-a302-0f0b8193b9c8-horizon-secret-key\") pod \"horizon-f78496f49-fklmw\" (UID: \"06e8f21f-8218-4a64-a302-0f0b8193b9c8\") " pod="openstack/horizon-f78496f49-fklmw" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.218717 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-p5nbn\" (UID: \"effc30d8-1f26-461c-8555-86410a3f77ba\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.218782 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/06e8f21f-8218-4a64-a302-0f0b8193b9c8-config-data\") pod \"horizon-f78496f49-fklmw\" (UID: \"06e8f21f-8218-4a64-a302-0f0b8193b9c8\") " pod="openstack/horizon-f78496f49-fklmw" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.218809 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-p5nbn\" (UID: \"effc30d8-1f26-461c-8555-86410a3f77ba\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.218850 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7wxz\" (UniqueName: \"kubernetes.io/projected/effc30d8-1f26-461c-8555-86410a3f77ba-kube-api-access-v7wxz\") pod \"dnsmasq-dns-785d8bcb8c-p5nbn\" (UID: \"effc30d8-1f26-461c-8555-86410a3f77ba\") " 
pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.218934 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f054ebc-151c-4e89-8242-3837c9bee6b2-combined-ca-bundle\") pod \"barbican-db-sync-psw5k\" (UID: \"4f054ebc-151c-4e89-8242-3837c9bee6b2\") " pod="openstack/barbican-db-sync-psw5k" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.219811 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06e8f21f-8218-4a64-a302-0f0b8193b9c8-logs\") pod \"horizon-f78496f49-fklmw\" (UID: \"06e8f21f-8218-4a64-a302-0f0b8193b9c8\") " pod="openstack/horizon-f78496f49-fklmw" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.226244 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4f054ebc-151c-4e89-8242-3837c9bee6b2-db-sync-config-data\") pod \"barbican-db-sync-psw5k\" (UID: \"4f054ebc-151c-4e89-8242-3837c9bee6b2\") " pod="openstack/barbican-db-sync-psw5k" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.226750 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06e8f21f-8218-4a64-a302-0f0b8193b9c8-scripts\") pod \"horizon-f78496f49-fklmw\" (UID: \"06e8f21f-8218-4a64-a302-0f0b8193b9c8\") " pod="openstack/horizon-f78496f49-fklmw" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.226831 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/06e8f21f-8218-4a64-a302-0f0b8193b9c8-config-data\") pod \"horizon-f78496f49-fklmw\" (UID: \"06e8f21f-8218-4a64-a302-0f0b8193b9c8\") " pod="openstack/horizon-f78496f49-fklmw" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.227666 4975 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/06e8f21f-8218-4a64-a302-0f0b8193b9c8-horizon-secret-key\") pod \"horizon-f78496f49-fklmw\" (UID: \"06e8f21f-8218-4a64-a302-0f0b8193b9c8\") " pod="openstack/horizon-f78496f49-fklmw" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.229677 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f054ebc-151c-4e89-8242-3837c9bee6b2-combined-ca-bundle\") pod \"barbican-db-sync-psw5k\" (UID: \"4f054ebc-151c-4e89-8242-3837c9bee6b2\") " pod="openstack/barbican-db-sync-psw5k" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.242091 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kx7z\" (UniqueName: \"kubernetes.io/projected/06e8f21f-8218-4a64-a302-0f0b8193b9c8-kube-api-access-8kx7z\") pod \"horizon-f78496f49-fklmw\" (UID: \"06e8f21f-8218-4a64-a302-0f0b8193b9c8\") " pod="openstack/horizon-f78496f49-fklmw" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.248322 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmps4\" (UniqueName: \"kubernetes.io/projected/4f054ebc-151c-4e89-8242-3837c9bee6b2-kube-api-access-cmps4\") pod \"barbican-db-sync-psw5k\" (UID: \"4f054ebc-151c-4e89-8242-3837c9bee6b2\") " pod="openstack/barbican-db-sync-psw5k" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.253630 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.254491 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.256222 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.258781 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-dpz5h" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.259053 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.260063 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.260436 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.267351 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9bbxb" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.278637 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.327499 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-p5nbn\" (UID: \"effc30d8-1f26-461c-8555-86410a3f77ba\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.327549 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7wxz\" (UniqueName: \"kubernetes.io/projected/effc30d8-1f26-461c-8555-86410a3f77ba-kube-api-access-v7wxz\") pod \"dnsmasq-dns-785d8bcb8c-p5nbn\" (UID: \"effc30d8-1f26-461c-8555-86410a3f77ba\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.327576 4975 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3af7093-3a28-410f-b4e8-642fc96c0a41-scripts\") pod \"glance-default-external-api-0\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.327599 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3af7093-3a28-410f-b4e8-642fc96c0a41-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.327644 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3af7093-3a28-410f-b4e8-642fc96c0a41-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.327673 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3af7093-3a28-410f-b4e8-642fc96c0a41-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.327699 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3af7093-3a28-410f-b4e8-642fc96c0a41-logs\") pod \"glance-default-external-api-0\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.327723 4975 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3af7093-3a28-410f-b4e8-642fc96c0a41-config-data\") pod \"glance-default-external-api-0\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.327773 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-config\") pod \"dnsmasq-dns-785d8bcb8c-p5nbn\" (UID: \"effc30d8-1f26-461c-8555-86410a3f77ba\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.327796 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-p5nbn\" (UID: \"effc30d8-1f26-461c-8555-86410a3f77ba\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.327814 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l9bh\" (UniqueName: \"kubernetes.io/projected/f3af7093-3a28-410f-b4e8-642fc96c0a41-kube-api-access-9l9bh\") pod \"glance-default-external-api-0\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.327860 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-p5nbn\" (UID: \"effc30d8-1f26-461c-8555-86410a3f77ba\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.327898 4975 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.327931 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-p5nbn\" (UID: \"effc30d8-1f26-461c-8555-86410a3f77ba\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.328725 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-p5nbn\" (UID: \"effc30d8-1f26-461c-8555-86410a3f77ba\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.328849 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-p5nbn\" (UID: \"effc30d8-1f26-461c-8555-86410a3f77ba\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.328887 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-p5nbn\" (UID: \"effc30d8-1f26-461c-8555-86410a3f77ba\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.328965 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-config\") pod \"dnsmasq-dns-785d8bcb8c-p5nbn\" (UID: \"effc30d8-1f26-461c-8555-86410a3f77ba\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.329319 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-p5nbn\" (UID: \"effc30d8-1f26-461c-8555-86410a3f77ba\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.337500 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-hwtrk" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.353997 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7wxz\" (UniqueName: \"kubernetes.io/projected/effc30d8-1f26-461c-8555-86410a3f77ba-kube-api-access-v7wxz\") pod \"dnsmasq-dns-785d8bcb8c-p5nbn\" (UID: \"effc30d8-1f26-461c-8555-86410a3f77ba\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.388482 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-psw5k" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.427621 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-f78496f49-fklmw" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.429347 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.429455 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3af7093-3a28-410f-b4e8-642fc96c0a41-scripts\") pod \"glance-default-external-api-0\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.429483 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3af7093-3a28-410f-b4e8-642fc96c0a41-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.429541 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3af7093-3a28-410f-b4e8-642fc96c0a41-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.429562 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3af7093-3a28-410f-b4e8-642fc96c0a41-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc 
kubenswrapper[4975]: I0318 12:33:38.429589 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3af7093-3a28-410f-b4e8-642fc96c0a41-logs\") pod \"glance-default-external-api-0\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.429608 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3af7093-3a28-410f-b4e8-642fc96c0a41-config-data\") pod \"glance-default-external-api-0\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.429669 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l9bh\" (UniqueName: \"kubernetes.io/projected/f3af7093-3a28-410f-b4e8-642fc96c0a41-kube-api-access-9l9bh\") pod \"glance-default-external-api-0\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.429768 4975 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.430438 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3af7093-3a28-410f-b4e8-642fc96c0a41-logs\") pod \"glance-default-external-api-0\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.430679 4975 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3af7093-3a28-410f-b4e8-642fc96c0a41-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.434667 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3af7093-3a28-410f-b4e8-642fc96c0a41-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.448824 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3af7093-3a28-410f-b4e8-642fc96c0a41-config-data\") pod \"glance-default-external-api-0\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.451956 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3af7093-3a28-410f-b4e8-642fc96c0a41-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.453386 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3af7093-3a28-410f-b4e8-642fc96c0a41-scripts\") pod \"glance-default-external-api-0\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.460784 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l9bh\" (UniqueName: 
\"kubernetes.io/projected/f3af7093-3a28-410f-b4e8-642fc96c0a41-kube-api-access-9l9bh\") pod \"glance-default-external-api-0\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.483502 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.501370 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-r4j6c"] Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.534124 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" Mar 18 12:33:38 crc kubenswrapper[4975]: W0318 12:33:38.572163 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b31a5d0_e54f_48c1_90a1_c83e32592b42.slice/crio-e4ec596aaa2192c9a889fe823bea76dea5b8d564a0ce76130cc24e8402cb7d55 WatchSource:0}: Error finding container e4ec596aaa2192c9a889fe823bea76dea5b8d564a0ce76130cc24e8402cb7d55: Status 404 returned error can't find the container with id e4ec596aaa2192c9a889fe823bea76dea5b8d564a0ce76130cc24e8402cb7d55 Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.579225 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.607583 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-n9cbk"] Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.739120 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8f5b85cf7-qxcrm"] Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.762944 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.770798 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.775723 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.780577 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.806105 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.849786 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-m6lmd"] Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.854019 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.854063 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.854089 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.854128 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.854158 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpvwr\" (UniqueName: \"kubernetes.io/projected/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-kube-api-access-cpvwr\") pod \"glance-default-internal-api-0\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.854209 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.854225 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.854281 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-logs\") pod \"glance-default-internal-api-0\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.955505 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpvwr\" (UniqueName: \"kubernetes.io/projected/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-kube-api-access-cpvwr\") pod \"glance-default-internal-api-0\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.955627 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.955652 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.955712 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-logs\") pod \"glance-default-internal-api-0\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.955745 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.955766 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.955790 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.955814 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.956727 4975 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" 
(UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.957526 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-logs\") pod \"glance-default-internal-api-0\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.957842 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.964513 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.965879 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.965987 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " 
pod="openstack/glance-default-internal-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.978758 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpvwr\" (UniqueName: \"kubernetes.io/projected/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-kube-api-access-cpvwr\") pod \"glance-default-internal-api-0\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:38 crc kubenswrapper[4975]: I0318 12:33:38.980119 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.001842 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.032764 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.176906 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-m6lmd" event={"ID":"94f94b61-6738-4ab8-a65f-0d6cf4d86be1","Type":"ContainerStarted","Data":"9ac6e2a53ea68ab3170f3fd8b946e7df4179f5d76279d1aa19efc51ee1c7d350"} Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.180464 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8f5b85cf7-qxcrm" event={"ID":"0e23cd99-78eb-4ab1-bfae-f436a8db17a6","Type":"ContainerStarted","Data":"37a44b1a0bcbc3ddf9a35bb0d1e6ffaaaf040b1dacd72515d009cc7573a9814f"} Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.184558 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n9cbk" event={"ID":"142bd2ba-bbcb-4e91-9365-670839ea9f5a","Type":"ContainerStarted","Data":"fcbbcd1c6a09cf797bc86a46831b977b029b666d71afade3183bd4595afcd59b"} Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.193117 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-r4j6c" event={"ID":"3b31a5d0-e54f-48c1-90a1-c83e32592b42","Type":"ContainerStarted","Data":"e4ec596aaa2192c9a889fe823bea76dea5b8d564a0ce76130cc24e8402cb7d55"} Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.285044 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-hwtrk"] Mar 18 12:33:39 crc kubenswrapper[4975]: W0318 12:33:39.322130 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06e8f21f_8218_4a64_a302_0f0b8193b9c8.slice/crio-9f040c1062540180477332f0161a1cd77e0c073ae99c49ea94444d7fd81783c0 WatchSource:0}: Error finding container 9f040c1062540180477332f0161a1cd77e0c073ae99c49ea94444d7fd81783c0: Status 404 returned error can't find the container with id 
9f040c1062540180477332f0161a1cd77e0c073ae99c49ea94444d7fd81783c0 Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.330730 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.344996 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-9bbxb"] Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.353615 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f78496f49-fklmw"] Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.367292 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-psw5k"] Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.433697 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-p5nbn"] Mar 18 12:33:39 crc kubenswrapper[4975]: W0318 12:33:39.433763 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeffc30d8_1f26_461c_8555_86410a3f77ba.slice/crio-535b9c48eed1c02eee3277ce4f35ce988aabd14f0d25a80215c76be9ecc576bc WatchSource:0}: Error finding container 535b9c48eed1c02eee3277ce4f35ce988aabd14f0d25a80215c76be9ecc576bc: Status 404 returned error can't find the container with id 535b9c48eed1c02eee3277ce4f35ce988aabd14f0d25a80215c76be9ecc576bc Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.544064 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.603535 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.678399 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8f5b85cf7-qxcrm"] Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.727081 4975 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/horizon-75b95c5765-8zkk2"] Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.779085 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.779184 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75b95c5765-8zkk2" Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.788167 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75b95c5765-8zkk2"] Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.836822 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.864243 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.881882 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ad41b5d2-6357-41a5-b934-01e7feb35657-horizon-secret-key\") pod \"horizon-75b95c5765-8zkk2\" (UID: \"ad41b5d2-6357-41a5-b934-01e7feb35657\") " pod="openstack/horizon-75b95c5765-8zkk2" Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.881942 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad41b5d2-6357-41a5-b934-01e7feb35657-logs\") pod \"horizon-75b95c5765-8zkk2\" (UID: \"ad41b5d2-6357-41a5-b934-01e7feb35657\") " pod="openstack/horizon-75b95c5765-8zkk2" Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.881978 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad41b5d2-6357-41a5-b934-01e7feb35657-config-data\") pod \"horizon-75b95c5765-8zkk2\" (UID: \"ad41b5d2-6357-41a5-b934-01e7feb35657\") " 
pod="openstack/horizon-75b95c5765-8zkk2" Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.882089 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmctd\" (UniqueName: \"kubernetes.io/projected/ad41b5d2-6357-41a5-b934-01e7feb35657-kube-api-access-vmctd\") pod \"horizon-75b95c5765-8zkk2\" (UID: \"ad41b5d2-6357-41a5-b934-01e7feb35657\") " pod="openstack/horizon-75b95c5765-8zkk2" Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.882151 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad41b5d2-6357-41a5-b934-01e7feb35657-scripts\") pod \"horizon-75b95c5765-8zkk2\" (UID: \"ad41b5d2-6357-41a5-b934-01e7feb35657\") " pod="openstack/horizon-75b95c5765-8zkk2" Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.985827 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ad41b5d2-6357-41a5-b934-01e7feb35657-horizon-secret-key\") pod \"horizon-75b95c5765-8zkk2\" (UID: \"ad41b5d2-6357-41a5-b934-01e7feb35657\") " pod="openstack/horizon-75b95c5765-8zkk2" Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.985898 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad41b5d2-6357-41a5-b934-01e7feb35657-logs\") pod \"horizon-75b95c5765-8zkk2\" (UID: \"ad41b5d2-6357-41a5-b934-01e7feb35657\") " pod="openstack/horizon-75b95c5765-8zkk2" Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.985938 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad41b5d2-6357-41a5-b934-01e7feb35657-config-data\") pod \"horizon-75b95c5765-8zkk2\" (UID: \"ad41b5d2-6357-41a5-b934-01e7feb35657\") " pod="openstack/horizon-75b95c5765-8zkk2" Mar 18 12:33:39 crc 
kubenswrapper[4975]: I0318 12:33:39.986040 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmctd\" (UniqueName: \"kubernetes.io/projected/ad41b5d2-6357-41a5-b934-01e7feb35657-kube-api-access-vmctd\") pod \"horizon-75b95c5765-8zkk2\" (UID: \"ad41b5d2-6357-41a5-b934-01e7feb35657\") " pod="openstack/horizon-75b95c5765-8zkk2" Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.986092 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad41b5d2-6357-41a5-b934-01e7feb35657-scripts\") pod \"horizon-75b95c5765-8zkk2\" (UID: \"ad41b5d2-6357-41a5-b934-01e7feb35657\") " pod="openstack/horizon-75b95c5765-8zkk2" Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.986602 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad41b5d2-6357-41a5-b934-01e7feb35657-logs\") pod \"horizon-75b95c5765-8zkk2\" (UID: \"ad41b5d2-6357-41a5-b934-01e7feb35657\") " pod="openstack/horizon-75b95c5765-8zkk2" Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.986927 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad41b5d2-6357-41a5-b934-01e7feb35657-scripts\") pod \"horizon-75b95c5765-8zkk2\" (UID: \"ad41b5d2-6357-41a5-b934-01e7feb35657\") " pod="openstack/horizon-75b95c5765-8zkk2" Mar 18 12:33:39 crc kubenswrapper[4975]: I0318 12:33:39.987416 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad41b5d2-6357-41a5-b934-01e7feb35657-config-data\") pod \"horizon-75b95c5765-8zkk2\" (UID: \"ad41b5d2-6357-41a5-b934-01e7feb35657\") " pod="openstack/horizon-75b95c5765-8zkk2" Mar 18 12:33:40 crc kubenswrapper[4975]: I0318 12:33:40.012468 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmctd\" (UniqueName: 
\"kubernetes.io/projected/ad41b5d2-6357-41a5-b934-01e7feb35657-kube-api-access-vmctd\") pod \"horizon-75b95c5765-8zkk2\" (UID: \"ad41b5d2-6357-41a5-b934-01e7feb35657\") " pod="openstack/horizon-75b95c5765-8zkk2" Mar 18 12:33:40 crc kubenswrapper[4975]: I0318 12:33:40.023258 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ad41b5d2-6357-41a5-b934-01e7feb35657-horizon-secret-key\") pod \"horizon-75b95c5765-8zkk2\" (UID: \"ad41b5d2-6357-41a5-b934-01e7feb35657\") " pod="openstack/horizon-75b95c5765-8zkk2" Mar 18 12:33:40 crc kubenswrapper[4975]: I0318 12:33:40.136361 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75b95c5765-8zkk2" Mar 18 12:33:40 crc kubenswrapper[4975]: I0318 12:33:40.204901 4975 generic.go:334] "Generic (PLEG): container finished" podID="effc30d8-1f26-461c-8555-86410a3f77ba" containerID="659e46f93d799d8b7fc87a5866856463f234fc3f6280dbcd0b55c487cdc153e4" exitCode=0 Mar 18 12:33:40 crc kubenswrapper[4975]: I0318 12:33:40.205196 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" event={"ID":"effc30d8-1f26-461c-8555-86410a3f77ba","Type":"ContainerDied","Data":"659e46f93d799d8b7fc87a5866856463f234fc3f6280dbcd0b55c487cdc153e4"} Mar 18 12:33:40 crc kubenswrapper[4975]: I0318 12:33:40.205239 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" event={"ID":"effc30d8-1f26-461c-8555-86410a3f77ba","Type":"ContainerStarted","Data":"535b9c48eed1c02eee3277ce4f35ce988aabd14f0d25a80215c76be9ecc576bc"} Mar 18 12:33:40 crc kubenswrapper[4975]: I0318 12:33:40.215061 4975 generic.go:334] "Generic (PLEG): container finished" podID="3b31a5d0-e54f-48c1-90a1-c83e32592b42" containerID="308a061a773f9b4b27db6a6a53acca6b564dbd7683f8ee897ecb5df126abbf21" exitCode=0 Mar 18 12:33:40 crc kubenswrapper[4975]: I0318 12:33:40.215253 4975 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-r4j6c" event={"ID":"3b31a5d0-e54f-48c1-90a1-c83e32592b42","Type":"ContainerDied","Data":"308a061a773f9b4b27db6a6a53acca6b564dbd7683f8ee897ecb5df126abbf21"} Mar 18 12:33:40 crc kubenswrapper[4975]: I0318 12:33:40.217227 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d","Type":"ContainerStarted","Data":"2df49e2ece685ef95ed0fda6cc561facdcbc6aeed45b80ca461f4960209b3ebf"} Mar 18 12:33:40 crc kubenswrapper[4975]: I0318 12:33:40.218582 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hwtrk" event={"ID":"c2614112-7379-4588-a6dd-1cec6e3d96b4","Type":"ContainerStarted","Data":"afd46f28a4febcc48272cab1d92b756bc87de7856c345057be6239feeca0b484"} Mar 18 12:33:40 crc kubenswrapper[4975]: I0318 12:33:40.220779 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-psw5k" event={"ID":"4f054ebc-151c-4e89-8242-3837c9bee6b2","Type":"ContainerStarted","Data":"c345efd1145a32559e8c7bd484810752d4faa0548998ff6bb04f08c959f06316"} Mar 18 12:33:40 crc kubenswrapper[4975]: I0318 12:33:40.225137 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f78496f49-fklmw" event={"ID":"06e8f21f-8218-4a64-a302-0f0b8193b9c8","Type":"ContainerStarted","Data":"9f040c1062540180477332f0161a1cd77e0c073ae99c49ea94444d7fd81783c0"} Mar 18 12:33:40 crc kubenswrapper[4975]: I0318 12:33:40.237503 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n9cbk" event={"ID":"142bd2ba-bbcb-4e91-9365-670839ea9f5a","Type":"ContainerStarted","Data":"df65c753ec03e658f829bd79d5281c49156d7666c60a74f064d6e8a1c6e09b15"} Mar 18 12:33:40 crc kubenswrapper[4975]: I0318 12:33:40.246545 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"ecd7b336-2830-4ce7-ae0c-12881a0c97fb","Type":"ContainerStarted","Data":"3b17c1d8fb30d7869b72fe9f266553dccee1396d34559f581ae6f771e5b253e1"} Mar 18 12:33:40 crc kubenswrapper[4975]: I0318 12:33:40.249262 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f3af7093-3a28-410f-b4e8-642fc96c0a41","Type":"ContainerStarted","Data":"a66d59ae4f6e0fd7f6478011586decaacdb75d16e95652c8b124c3d2d103271a"} Mar 18 12:33:40 crc kubenswrapper[4975]: I0318 12:33:40.262791 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9bbxb" event={"ID":"f85a1328-170d-4be7-8115-36d400fd7645","Type":"ContainerStarted","Data":"fcf9119d0dd51cd281c1f3e4788c2924109567cae2489841d9b989aeb57f0389"} Mar 18 12:33:40 crc kubenswrapper[4975]: I0318 12:33:40.262844 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9bbxb" event={"ID":"f85a1328-170d-4be7-8115-36d400fd7645","Type":"ContainerStarted","Data":"2c4f2901057d484d6b93900fac83bb316e9055d02a839f4ada73abe4fe933af2"} Mar 18 12:33:40 crc kubenswrapper[4975]: I0318 12:33:40.278955 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-n9cbk" podStartSLOduration=3.278934519 podStartE2EDuration="3.278934519s" podCreationTimestamp="2026-03-18 12:33:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:40.275537765 +0000 UTC m=+1405.989938424" watchObservedRunningTime="2026-03-18 12:33:40.278934519 +0000 UTC m=+1405.993335098" Mar 18 12:33:40 crc kubenswrapper[4975]: I0318 12:33:40.315613 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-9bbxb" podStartSLOduration=3.315587958 podStartE2EDuration="3.315587958s" podCreationTimestamp="2026-03-18 12:33:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:40.30330321 +0000 UTC m=+1406.017703789" watchObservedRunningTime="2026-03-18 12:33:40.315587958 +0000 UTC m=+1406.029988537" Mar 18 12:33:40 crc kubenswrapper[4975]: I0318 12:33:40.823267 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-r4j6c" Mar 18 12:33:40 crc kubenswrapper[4975]: I0318 12:33:40.879775 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75b95c5765-8zkk2"] Mar 18 12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.044212 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-ovsdbserver-nb\") pod \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\" (UID: \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\") " Mar 18 12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.045000 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ws5x\" (UniqueName: \"kubernetes.io/projected/3b31a5d0-e54f-48c1-90a1-c83e32592b42-kube-api-access-6ws5x\") pod \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\" (UID: \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\") " Mar 18 12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.045038 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-dns-svc\") pod \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\" (UID: \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\") " Mar 18 12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.045150 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-dns-swift-storage-0\") pod \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\" (UID: \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\") " Mar 18 
12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.045175 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-config\") pod \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\" (UID: \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\") " Mar 18 12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.045325 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-ovsdbserver-sb\") pod \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\" (UID: \"3b31a5d0-e54f-48c1-90a1-c83e32592b42\") " Mar 18 12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.070071 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b31a5d0-e54f-48c1-90a1-c83e32592b42-kube-api-access-6ws5x" (OuterVolumeSpecName: "kube-api-access-6ws5x") pod "3b31a5d0-e54f-48c1-90a1-c83e32592b42" (UID: "3b31a5d0-e54f-48c1-90a1-c83e32592b42"). InnerVolumeSpecName "kube-api-access-6ws5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.074283 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3b31a5d0-e54f-48c1-90a1-c83e32592b42" (UID: "3b31a5d0-e54f-48c1-90a1-c83e32592b42"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.078419 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b31a5d0-e54f-48c1-90a1-c83e32592b42" (UID: "3b31a5d0-e54f-48c1-90a1-c83e32592b42"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.090110 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3b31a5d0-e54f-48c1-90a1-c83e32592b42" (UID: "3b31a5d0-e54f-48c1-90a1-c83e32592b42"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.093889 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-config" (OuterVolumeSpecName: "config") pod "3b31a5d0-e54f-48c1-90a1-c83e32592b42" (UID: "3b31a5d0-e54f-48c1-90a1-c83e32592b42"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.095688 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3b31a5d0-e54f-48c1-90a1-c83e32592b42" (UID: "3b31a5d0-e54f-48c1-90a1-c83e32592b42"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.147684 4975 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.147729 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.147743 4975 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.147754 4975 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.147764 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ws5x\" (UniqueName: \"kubernetes.io/projected/3b31a5d0-e54f-48c1-90a1-c83e32592b42-kube-api-access-6ws5x\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.147776 4975 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b31a5d0-e54f-48c1-90a1-c83e32592b42-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.294482 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75b95c5765-8zkk2" event={"ID":"ad41b5d2-6357-41a5-b934-01e7feb35657","Type":"ContainerStarted","Data":"7889ad01eb67f16962225003c3e87f1bb46376f1e02671f303988749dbecc654"} Mar 18 12:33:41 crc 
kubenswrapper[4975]: I0318 12:33:41.319832 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f3af7093-3a28-410f-b4e8-642fc96c0a41","Type":"ContainerStarted","Data":"91c4bf6bd778649112c78b84401fddd89b958ac4432ed61b3d99fcc5aa0c6ce3"} Mar 18 12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.326219 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" event={"ID":"effc30d8-1f26-461c-8555-86410a3f77ba","Type":"ContainerStarted","Data":"e2c46b51d17c388811ca2cf84335f0aff97529cd284600b76bfc32a76281329e"} Mar 18 12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.326291 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" Mar 18 12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.329134 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-r4j6c" event={"ID":"3b31a5d0-e54f-48c1-90a1-c83e32592b42","Type":"ContainerDied","Data":"e4ec596aaa2192c9a889fe823bea76dea5b8d564a0ce76130cc24e8402cb7d55"} Mar 18 12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.329180 4975 scope.go:117] "RemoveContainer" containerID="308a061a773f9b4b27db6a6a53acca6b564dbd7683f8ee897ecb5df126abbf21" Mar 18 12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.329933 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-r4j6c" Mar 18 12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.367742 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" podStartSLOduration=4.367722787 podStartE2EDuration="4.367722787s" podCreationTimestamp="2026-03-18 12:33:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:41.36204326 +0000 UTC m=+1407.076443849" watchObservedRunningTime="2026-03-18 12:33:41.367722787 +0000 UTC m=+1407.082123366" Mar 18 12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.414150 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-r4j6c"] Mar 18 12:33:41 crc kubenswrapper[4975]: I0318 12:33:41.419645 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-r4j6c"] Mar 18 12:33:42 crc kubenswrapper[4975]: I0318 12:33:42.346683 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f3af7093-3a28-410f-b4e8-642fc96c0a41","Type":"ContainerStarted","Data":"c6e581b84678d1d84f8f38afc1e35b4699b3cfe527e77639af4172bc9a98254e"} Mar 18 12:33:42 crc kubenswrapper[4975]: I0318 12:33:42.346939 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f3af7093-3a28-410f-b4e8-642fc96c0a41" containerName="glance-httpd" containerID="cri-o://c6e581b84678d1d84f8f38afc1e35b4699b3cfe527e77639af4172bc9a98254e" gracePeriod=30 Mar 18 12:33:42 crc kubenswrapper[4975]: I0318 12:33:42.346939 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f3af7093-3a28-410f-b4e8-642fc96c0a41" containerName="glance-log" containerID="cri-o://91c4bf6bd778649112c78b84401fddd89b958ac4432ed61b3d99fcc5aa0c6ce3" 
gracePeriod=30 Mar 18 12:33:42 crc kubenswrapper[4975]: I0318 12:33:42.353401 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ecd7b336-2830-4ce7-ae0c-12881a0c97fb" containerName="glance-log" containerID="cri-o://b4bb8fc806ce512e185411477c1a30f260544e70d20fbf8f0daeab872a410251" gracePeriod=30 Mar 18 12:33:42 crc kubenswrapper[4975]: I0318 12:33:42.353480 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ecd7b336-2830-4ce7-ae0c-12881a0c97fb","Type":"ContainerStarted","Data":"610ccf84df225541365395355df7c566eae1507467e56e2b20ab0f4816daa0a3"} Mar 18 12:33:42 crc kubenswrapper[4975]: I0318 12:33:42.353500 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ecd7b336-2830-4ce7-ae0c-12881a0c97fb","Type":"ContainerStarted","Data":"b4bb8fc806ce512e185411477c1a30f260544e70d20fbf8f0daeab872a410251"} Mar 18 12:33:42 crc kubenswrapper[4975]: I0318 12:33:42.353541 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ecd7b336-2830-4ce7-ae0c-12881a0c97fb" containerName="glance-httpd" containerID="cri-o://610ccf84df225541365395355df7c566eae1507467e56e2b20ab0f4816daa0a3" gracePeriod=30 Mar 18 12:33:42 crc kubenswrapper[4975]: I0318 12:33:42.421735 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.421708216 podStartE2EDuration="5.421708216s" podCreationTimestamp="2026-03-18 12:33:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:42.415513545 +0000 UTC m=+1408.129914134" watchObservedRunningTime="2026-03-18 12:33:42.421708216 +0000 UTC m=+1408.136108795" Mar 18 12:33:42 crc kubenswrapper[4975]: I0318 
12:33:42.431078 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.431048903 podStartE2EDuration="5.431048903s" podCreationTimestamp="2026-03-18 12:33:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:33:42.388996965 +0000 UTC m=+1408.103397554" watchObservedRunningTime="2026-03-18 12:33:42.431048903 +0000 UTC m=+1408.145449482" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.044764 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b31a5d0-e54f-48c1-90a1-c83e32592b42" path="/var/lib/kubelet/pods/3b31a5d0-e54f-48c1-90a1-c83e32592b42/volumes" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.081492 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.207953 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.208058 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-internal-tls-certs\") pod \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.208143 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-config-data\") pod \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " Mar 18 
12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.208218 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpvwr\" (UniqueName: \"kubernetes.io/projected/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-kube-api-access-cpvwr\") pod \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.208243 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-httpd-run\") pod \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.208264 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-logs\") pod \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.208306 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-scripts\") pod \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.208345 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-combined-ca-bundle\") pod \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\" (UID: \"ecd7b336-2830-4ce7-ae0c-12881a0c97fb\") " Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.209276 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-logs" (OuterVolumeSpecName: "logs") pod 
"ecd7b336-2830-4ce7-ae0c-12881a0c97fb" (UID: "ecd7b336-2830-4ce7-ae0c-12881a0c97fb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.209388 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ecd7b336-2830-4ce7-ae0c-12881a0c97fb" (UID: "ecd7b336-2830-4ce7-ae0c-12881a0c97fb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.211889 4975 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.211934 4975 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.216195 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-scripts" (OuterVolumeSpecName: "scripts") pod "ecd7b336-2830-4ce7-ae0c-12881a0c97fb" (UID: "ecd7b336-2830-4ce7-ae0c-12881a0c97fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.235343 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-kube-api-access-cpvwr" (OuterVolumeSpecName: "kube-api-access-cpvwr") pod "ecd7b336-2830-4ce7-ae0c-12881a0c97fb" (UID: "ecd7b336-2830-4ce7-ae0c-12881a0c97fb"). InnerVolumeSpecName "kube-api-access-cpvwr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.238803 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "ecd7b336-2830-4ce7-ae0c-12881a0c97fb" (UID: "ecd7b336-2830-4ce7-ae0c-12881a0c97fb"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.261213 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ecd7b336-2830-4ce7-ae0c-12881a0c97fb" (UID: "ecd7b336-2830-4ce7-ae0c-12881a0c97fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.285236 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ecd7b336-2830-4ce7-ae0c-12881a0c97fb" (UID: "ecd7b336-2830-4ce7-ae0c-12881a0c97fb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.286787 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-config-data" (OuterVolumeSpecName: "config-data") pod "ecd7b336-2830-4ce7-ae0c-12881a0c97fb" (UID: "ecd7b336-2830-4ce7-ae0c-12881a0c97fb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.313418 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.313457 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpvwr\" (UniqueName: \"kubernetes.io/projected/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-kube-api-access-cpvwr\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.313468 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.313476 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.313509 4975 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.313518 4975 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecd7b336-2830-4ce7-ae0c-12881a0c97fb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.346959 4975 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.371629 4975 generic.go:334] "Generic (PLEG): container finished" 
podID="f3af7093-3a28-410f-b4e8-642fc96c0a41" containerID="c6e581b84678d1d84f8f38afc1e35b4699b3cfe527e77639af4172bc9a98254e" exitCode=0 Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.371688 4975 generic.go:334] "Generic (PLEG): container finished" podID="f3af7093-3a28-410f-b4e8-642fc96c0a41" containerID="91c4bf6bd778649112c78b84401fddd89b958ac4432ed61b3d99fcc5aa0c6ce3" exitCode=143 Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.371793 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f3af7093-3a28-410f-b4e8-642fc96c0a41","Type":"ContainerDied","Data":"c6e581b84678d1d84f8f38afc1e35b4699b3cfe527e77639af4172bc9a98254e"} Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.371905 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f3af7093-3a28-410f-b4e8-642fc96c0a41","Type":"ContainerDied","Data":"91c4bf6bd778649112c78b84401fddd89b958ac4432ed61b3d99fcc5aa0c6ce3"} Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.377237 4975 generic.go:334] "Generic (PLEG): container finished" podID="ecd7b336-2830-4ce7-ae0c-12881a0c97fb" containerID="610ccf84df225541365395355df7c566eae1507467e56e2b20ab0f4816daa0a3" exitCode=143 Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.378763 4975 generic.go:334] "Generic (PLEG): container finished" podID="ecd7b336-2830-4ce7-ae0c-12881a0c97fb" containerID="b4bb8fc806ce512e185411477c1a30f260544e70d20fbf8f0daeab872a410251" exitCode=143 Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.377852 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.377800 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ecd7b336-2830-4ce7-ae0c-12881a0c97fb","Type":"ContainerDied","Data":"610ccf84df225541365395355df7c566eae1507467e56e2b20ab0f4816daa0a3"} Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.378946 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ecd7b336-2830-4ce7-ae0c-12881a0c97fb","Type":"ContainerDied","Data":"b4bb8fc806ce512e185411477c1a30f260544e70d20fbf8f0daeab872a410251"} Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.378980 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ecd7b336-2830-4ce7-ae0c-12881a0c97fb","Type":"ContainerDied","Data":"3b17c1d8fb30d7869b72fe9f266553dccee1396d34559f581ae6f771e5b253e1"} Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.379047 4975 scope.go:117] "RemoveContainer" containerID="610ccf84df225541365395355df7c566eae1507467e56e2b20ab0f4816daa0a3" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.415002 4975 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.441082 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.461821 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.473493 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:33:43 crc kubenswrapper[4975]: E0318 12:33:43.474212 4975 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecd7b336-2830-4ce7-ae0c-12881a0c97fb" containerName="glance-log" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.474236 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd7b336-2830-4ce7-ae0c-12881a0c97fb" containerName="glance-log" Mar 18 12:33:43 crc kubenswrapper[4975]: E0318 12:33:43.474258 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecd7b336-2830-4ce7-ae0c-12881a0c97fb" containerName="glance-httpd" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.474264 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd7b336-2830-4ce7-ae0c-12881a0c97fb" containerName="glance-httpd" Mar 18 12:33:43 crc kubenswrapper[4975]: E0318 12:33:43.474284 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b31a5d0-e54f-48c1-90a1-c83e32592b42" containerName="init" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.474290 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b31a5d0-e54f-48c1-90a1-c83e32592b42" containerName="init" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.474434 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b31a5d0-e54f-48c1-90a1-c83e32592b42" containerName="init" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.474459 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecd7b336-2830-4ce7-ae0c-12881a0c97fb" containerName="glance-log" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.474468 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecd7b336-2830-4ce7-ae0c-12881a0c97fb" containerName="glance-httpd" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.475431 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.477792 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.477991 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.488496 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.618643 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfd49677-a36a-4e49-bdd9-4416f96c225e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.618699 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfd49677-a36a-4e49-bdd9-4416f96c225e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.618725 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfd49677-a36a-4e49-bdd9-4416f96c225e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.618759 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/bfd49677-a36a-4e49-bdd9-4416f96c225e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.618780 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qqs7\" (UniqueName: \"kubernetes.io/projected/bfd49677-a36a-4e49-bdd9-4416f96c225e-kube-api-access-2qqs7\") pod \"glance-default-internal-api-0\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.618806 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfd49677-a36a-4e49-bdd9-4416f96c225e-logs\") pod \"glance-default-internal-api-0\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.618877 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bfd49677-a36a-4e49-bdd9-4416f96c225e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.618906 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.722040 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/bfd49677-a36a-4e49-bdd9-4416f96c225e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.722117 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfd49677-a36a-4e49-bdd9-4416f96c225e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.722151 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfd49677-a36a-4e49-bdd9-4416f96c225e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.722181 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfd49677-a36a-4e49-bdd9-4416f96c225e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.722207 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qqs7\" (UniqueName: \"kubernetes.io/projected/bfd49677-a36a-4e49-bdd9-4416f96c225e-kube-api-access-2qqs7\") pod \"glance-default-internal-api-0\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.722246 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/bfd49677-a36a-4e49-bdd9-4416f96c225e-logs\") pod \"glance-default-internal-api-0\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.722314 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bfd49677-a36a-4e49-bdd9-4416f96c225e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.722341 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.722832 4975 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.724625 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bfd49677-a36a-4e49-bdd9-4416f96c225e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.724527 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfd49677-a36a-4e49-bdd9-4416f96c225e-logs\") pod \"glance-default-internal-api-0\" 
(UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.728700 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfd49677-a36a-4e49-bdd9-4416f96c225e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.729040 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfd49677-a36a-4e49-bdd9-4416f96c225e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.732457 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfd49677-a36a-4e49-bdd9-4416f96c225e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.732732 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfd49677-a36a-4e49-bdd9-4416f96c225e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.745762 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qqs7\" (UniqueName: \"kubernetes.io/projected/bfd49677-a36a-4e49-bdd9-4416f96c225e-kube-api-access-2qqs7\") pod \"glance-default-internal-api-0\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " 
pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.754794 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:33:43 crc kubenswrapper[4975]: I0318 12:33:43.801404 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:33:44 crc kubenswrapper[4975]: I0318 12:33:44.390409 4975 generic.go:334] "Generic (PLEG): container finished" podID="142bd2ba-bbcb-4e91-9365-670839ea9f5a" containerID="df65c753ec03e658f829bd79d5281c49156d7666c60a74f064d6e8a1c6e09b15" exitCode=0 Mar 18 12:33:44 crc kubenswrapper[4975]: I0318 12:33:44.390613 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n9cbk" event={"ID":"142bd2ba-bbcb-4e91-9365-670839ea9f5a","Type":"ContainerDied","Data":"df65c753ec03e658f829bd79d5281c49156d7666c60a74f064d6e8a1c6e09b15"} Mar 18 12:33:45 crc kubenswrapper[4975]: I0318 12:33:45.036234 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecd7b336-2830-4ce7-ae0c-12881a0c97fb" path="/var/lib/kubelet/pods/ecd7b336-2830-4ce7-ae0c-12881a0c97fb/volumes" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.293543 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f78496f49-fklmw"] Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.345518 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-c478d4794-x2t7q"] Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.347017 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.351293 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.356567 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-c478d4794-x2t7q"] Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.404244 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-75b95c5765-8zkk2"] Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.435908 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-56bcd48494-744wv"] Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.437569 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56bcd48494-744wv" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.452650 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-56bcd48494-744wv"] Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.461624 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.474409 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8f2d\" (UniqueName: \"kubernetes.io/projected/939992e3-94eb-4c98-a493-e30321c7f81a-kube-api-access-t8f2d\") pod \"horizon-56bcd48494-744wv\" (UID: \"939992e3-94eb-4c98-a493-e30321c7f81a\") " pod="openstack/horizon-56bcd48494-744wv" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.474463 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/939992e3-94eb-4c98-a493-e30321c7f81a-horizon-tls-certs\") pod \"horizon-56bcd48494-744wv\" (UID: \"939992e3-94eb-4c98-a493-e30321c7f81a\") " 
pod="openstack/horizon-56bcd48494-744wv" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.474498 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-config-data\") pod \"horizon-c478d4794-x2t7q\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.474524 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-horizon-secret-key\") pod \"horizon-c478d4794-x2t7q\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.474541 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/939992e3-94eb-4c98-a493-e30321c7f81a-logs\") pod \"horizon-56bcd48494-744wv\" (UID: \"939992e3-94eb-4c98-a493-e30321c7f81a\") " pod="openstack/horizon-56bcd48494-744wv" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.474577 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-horizon-tls-certs\") pod \"horizon-c478d4794-x2t7q\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.474619 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/939992e3-94eb-4c98-a493-e30321c7f81a-scripts\") pod \"horizon-56bcd48494-744wv\" (UID: \"939992e3-94eb-4c98-a493-e30321c7f81a\") " 
pod="openstack/horizon-56bcd48494-744wv" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.474693 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/939992e3-94eb-4c98-a493-e30321c7f81a-config-data\") pod \"horizon-56bcd48494-744wv\" (UID: \"939992e3-94eb-4c98-a493-e30321c7f81a\") " pod="openstack/horizon-56bcd48494-744wv" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.474727 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/939992e3-94eb-4c98-a493-e30321c7f81a-horizon-secret-key\") pod \"horizon-56bcd48494-744wv\" (UID: \"939992e3-94eb-4c98-a493-e30321c7f81a\") " pod="openstack/horizon-56bcd48494-744wv" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.474769 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-combined-ca-bundle\") pod \"horizon-c478d4794-x2t7q\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.474808 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sngk5\" (UniqueName: \"kubernetes.io/projected/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-kube-api-access-sngk5\") pod \"horizon-c478d4794-x2t7q\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.474842 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-logs\") pod \"horizon-c478d4794-x2t7q\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " 
pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.474917 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-scripts\") pod \"horizon-c478d4794-x2t7q\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.474951 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/939992e3-94eb-4c98-a493-e30321c7f81a-combined-ca-bundle\") pod \"horizon-56bcd48494-744wv\" (UID: \"939992e3-94eb-4c98-a493-e30321c7f81a\") " pod="openstack/horizon-56bcd48494-744wv" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.576925 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-combined-ca-bundle\") pod \"horizon-c478d4794-x2t7q\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.576986 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sngk5\" (UniqueName: \"kubernetes.io/projected/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-kube-api-access-sngk5\") pod \"horizon-c478d4794-x2t7q\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.577019 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-logs\") pod \"horizon-c478d4794-x2t7q\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:33:46 crc 
kubenswrapper[4975]: I0318 12:33:46.577077 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-scripts\") pod \"horizon-c478d4794-x2t7q\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.577107 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/939992e3-94eb-4c98-a493-e30321c7f81a-combined-ca-bundle\") pod \"horizon-56bcd48494-744wv\" (UID: \"939992e3-94eb-4c98-a493-e30321c7f81a\") " pod="openstack/horizon-56bcd48494-744wv" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.577172 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8f2d\" (UniqueName: \"kubernetes.io/projected/939992e3-94eb-4c98-a493-e30321c7f81a-kube-api-access-t8f2d\") pod \"horizon-56bcd48494-744wv\" (UID: \"939992e3-94eb-4c98-a493-e30321c7f81a\") " pod="openstack/horizon-56bcd48494-744wv" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.577209 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/939992e3-94eb-4c98-a493-e30321c7f81a-horizon-tls-certs\") pod \"horizon-56bcd48494-744wv\" (UID: \"939992e3-94eb-4c98-a493-e30321c7f81a\") " pod="openstack/horizon-56bcd48494-744wv" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.577230 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-config-data\") pod \"horizon-c478d4794-x2t7q\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.577263 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-horizon-secret-key\") pod \"horizon-c478d4794-x2t7q\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.577287 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/939992e3-94eb-4c98-a493-e30321c7f81a-logs\") pod \"horizon-56bcd48494-744wv\" (UID: \"939992e3-94eb-4c98-a493-e30321c7f81a\") " pod="openstack/horizon-56bcd48494-744wv" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.577314 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-horizon-tls-certs\") pod \"horizon-c478d4794-x2t7q\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.577339 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/939992e3-94eb-4c98-a493-e30321c7f81a-scripts\") pod \"horizon-56bcd48494-744wv\" (UID: \"939992e3-94eb-4c98-a493-e30321c7f81a\") " pod="openstack/horizon-56bcd48494-744wv" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.577369 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/939992e3-94eb-4c98-a493-e30321c7f81a-config-data\") pod \"horizon-56bcd48494-744wv\" (UID: \"939992e3-94eb-4c98-a493-e30321c7f81a\") " pod="openstack/horizon-56bcd48494-744wv" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.577394 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/939992e3-94eb-4c98-a493-e30321c7f81a-horizon-secret-key\") pod \"horizon-56bcd48494-744wv\" (UID: \"939992e3-94eb-4c98-a493-e30321c7f81a\") " pod="openstack/horizon-56bcd48494-744wv" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.577642 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-logs\") pod \"horizon-c478d4794-x2t7q\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.577884 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-scripts\") pod \"horizon-c478d4794-x2t7q\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.578608 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/939992e3-94eb-4c98-a493-e30321c7f81a-logs\") pod \"horizon-56bcd48494-744wv\" (UID: \"939992e3-94eb-4c98-a493-e30321c7f81a\") " pod="openstack/horizon-56bcd48494-744wv" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.579100 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/939992e3-94eb-4c98-a493-e30321c7f81a-scripts\") pod \"horizon-56bcd48494-744wv\" (UID: \"939992e3-94eb-4c98-a493-e30321c7f81a\") " pod="openstack/horizon-56bcd48494-744wv" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.579365 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-config-data\") pod \"horizon-c478d4794-x2t7q\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " pod="openstack/horizon-c478d4794-x2t7q" Mar 18 
12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.579928 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/939992e3-94eb-4c98-a493-e30321c7f81a-config-data\") pod \"horizon-56bcd48494-744wv\" (UID: \"939992e3-94eb-4c98-a493-e30321c7f81a\") " pod="openstack/horizon-56bcd48494-744wv" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.583371 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-horizon-tls-certs\") pod \"horizon-c478d4794-x2t7q\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.583527 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/939992e3-94eb-4c98-a493-e30321c7f81a-horizon-tls-certs\") pod \"horizon-56bcd48494-744wv\" (UID: \"939992e3-94eb-4c98-a493-e30321c7f81a\") " pod="openstack/horizon-56bcd48494-744wv" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.583855 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/939992e3-94eb-4c98-a493-e30321c7f81a-combined-ca-bundle\") pod \"horizon-56bcd48494-744wv\" (UID: \"939992e3-94eb-4c98-a493-e30321c7f81a\") " pod="openstack/horizon-56bcd48494-744wv" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.584097 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-horizon-secret-key\") pod \"horizon-c478d4794-x2t7q\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.584853 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/939992e3-94eb-4c98-a493-e30321c7f81a-horizon-secret-key\") pod \"horizon-56bcd48494-744wv\" (UID: \"939992e3-94eb-4c98-a493-e30321c7f81a\") " pod="openstack/horizon-56bcd48494-744wv" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.596962 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-combined-ca-bundle\") pod \"horizon-c478d4794-x2t7q\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.599519 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sngk5\" (UniqueName: \"kubernetes.io/projected/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-kube-api-access-sngk5\") pod \"horizon-c478d4794-x2t7q\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.600937 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8f2d\" (UniqueName: \"kubernetes.io/projected/939992e3-94eb-4c98-a493-e30321c7f81a-kube-api-access-t8f2d\") pod \"horizon-56bcd48494-744wv\" (UID: \"939992e3-94eb-4c98-a493-e30321c7f81a\") " pod="openstack/horizon-56bcd48494-744wv" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.678702 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:33:46 crc kubenswrapper[4975]: I0318 12:33:46.756565 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-56bcd48494-744wv" Mar 18 12:33:48 crc kubenswrapper[4975]: I0318 12:33:48.119918 4975 scope.go:117] "RemoveContainer" containerID="b4bb8fc806ce512e185411477c1a30f260544e70d20fbf8f0daeab872a410251" Mar 18 12:33:48 crc kubenswrapper[4975]: I0318 12:33:48.537034 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" Mar 18 12:33:48 crc kubenswrapper[4975]: I0318 12:33:48.589560 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-8wr26"] Mar 18 12:33:48 crc kubenswrapper[4975]: I0318 12:33:48.589790 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" podUID="42860e95-d4ae-438b-a858-a88e69058574" containerName="dnsmasq-dns" containerID="cri-o://3520ffb61ef42e02db934b0e54b706f21081f7d640fc5d002d78e673365101bf" gracePeriod=10 Mar 18 12:33:49 crc kubenswrapper[4975]: I0318 12:33:49.478913 4975 generic.go:334] "Generic (PLEG): container finished" podID="42860e95-d4ae-438b-a858-a88e69058574" containerID="3520ffb61ef42e02db934b0e54b706f21081f7d640fc5d002d78e673365101bf" exitCode=0 Mar 18 12:33:49 crc kubenswrapper[4975]: I0318 12:33:49.478969 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" event={"ID":"42860e95-d4ae-438b-a858-a88e69058574","Type":"ContainerDied","Data":"3520ffb61ef42e02db934b0e54b706f21081f7d640fc5d002d78e673365101bf"} Mar 18 12:33:49 crc kubenswrapper[4975]: I0318 12:33:49.604832 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" podUID="42860e95-d4ae-438b-a858-a88e69058574" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: connect: connection refused" Mar 18 12:33:53 crc kubenswrapper[4975]: E0318 12:33:53.844053 4975 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = 
copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 18 12:33:53 crc kubenswrapper[4975]: E0318 12:33:53.844728 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ncfh688hc9h5fdh57bhd9h55h8hcfh8dh68chf7hd6h68h5h5b6h8dh57dhd6h597h555hf8h596hf9h5b4h67bh6bhc8h99h597h5dfh5d8q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qdckf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessa
gePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-8f5b85cf7-qxcrm_openstack(0e23cd99-78eb-4ab1-bfae-f436a8db17a6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:33:53 crc kubenswrapper[4975]: E0318 12:33:53.853416 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-8f5b85cf7-qxcrm" podUID="0e23cd99-78eb-4ab1-bfae-f436a8db17a6" Mar 18 12:33:53 crc kubenswrapper[4975]: I0318 12:33:53.961638 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.153102 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3af7093-3a28-410f-b4e8-642fc96c0a41-public-tls-certs\") pod \"f3af7093-3a28-410f-b4e8-642fc96c0a41\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.153202 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3af7093-3a28-410f-b4e8-642fc96c0a41-scripts\") pod \"f3af7093-3a28-410f-b4e8-642fc96c0a41\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.153224 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3af7093-3a28-410f-b4e8-642fc96c0a41-config-data\") pod 
\"f3af7093-3a28-410f-b4e8-642fc96c0a41\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.153246 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3af7093-3a28-410f-b4e8-642fc96c0a41-httpd-run\") pod \"f3af7093-3a28-410f-b4e8-642fc96c0a41\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.153424 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3af7093-3a28-410f-b4e8-642fc96c0a41-logs\") pod \"f3af7093-3a28-410f-b4e8-642fc96c0a41\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.153458 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l9bh\" (UniqueName: \"kubernetes.io/projected/f3af7093-3a28-410f-b4e8-642fc96c0a41-kube-api-access-9l9bh\") pod \"f3af7093-3a28-410f-b4e8-642fc96c0a41\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.153493 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3af7093-3a28-410f-b4e8-642fc96c0a41-combined-ca-bundle\") pod \"f3af7093-3a28-410f-b4e8-642fc96c0a41\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.154087 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"f3af7093-3a28-410f-b4e8-642fc96c0a41\" (UID: \"f3af7093-3a28-410f-b4e8-642fc96c0a41\") " Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.154564 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f3af7093-3a28-410f-b4e8-642fc96c0a41-logs" (OuterVolumeSpecName: "logs") pod "f3af7093-3a28-410f-b4e8-642fc96c0a41" (UID: "f3af7093-3a28-410f-b4e8-642fc96c0a41"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.154898 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3af7093-3a28-410f-b4e8-642fc96c0a41-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f3af7093-3a28-410f-b4e8-642fc96c0a41" (UID: "f3af7093-3a28-410f-b4e8-642fc96c0a41"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.160648 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "f3af7093-3a28-410f-b4e8-642fc96c0a41" (UID: "f3af7093-3a28-410f-b4e8-642fc96c0a41"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.163051 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3af7093-3a28-410f-b4e8-642fc96c0a41-kube-api-access-9l9bh" (OuterVolumeSpecName: "kube-api-access-9l9bh") pod "f3af7093-3a28-410f-b4e8-642fc96c0a41" (UID: "f3af7093-3a28-410f-b4e8-642fc96c0a41"). InnerVolumeSpecName "kube-api-access-9l9bh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.168899 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3af7093-3a28-410f-b4e8-642fc96c0a41-scripts" (OuterVolumeSpecName: "scripts") pod "f3af7093-3a28-410f-b4e8-642fc96c0a41" (UID: "f3af7093-3a28-410f-b4e8-642fc96c0a41"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.198341 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3af7093-3a28-410f-b4e8-642fc96c0a41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3af7093-3a28-410f-b4e8-642fc96c0a41" (UID: "f3af7093-3a28-410f-b4e8-642fc96c0a41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.223583 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3af7093-3a28-410f-b4e8-642fc96c0a41-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f3af7093-3a28-410f-b4e8-642fc96c0a41" (UID: "f3af7093-3a28-410f-b4e8-642fc96c0a41"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.227375 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3af7093-3a28-410f-b4e8-642fc96c0a41-config-data" (OuterVolumeSpecName: "config-data") pod "f3af7093-3a28-410f-b4e8-642fc96c0a41" (UID: "f3af7093-3a28-410f-b4e8-642fc96c0a41"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.256532 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l9bh\" (UniqueName: \"kubernetes.io/projected/f3af7093-3a28-410f-b4e8-642fc96c0a41-kube-api-access-9l9bh\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.256570 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3af7093-3a28-410f-b4e8-642fc96c0a41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.256605 4975 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.256616 4975 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3af7093-3a28-410f-b4e8-642fc96c0a41-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.256624 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3af7093-3a28-410f-b4e8-642fc96c0a41-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.256635 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3af7093-3a28-410f-b4e8-642fc96c0a41-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.256645 4975 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3af7093-3a28-410f-b4e8-642fc96c0a41-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.256654 4975 reconciler_common.go:293] "Volume 
detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3af7093-3a28-410f-b4e8-642fc96c0a41-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.285764 4975 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.359798 4975 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.520669 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.529020 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f3af7093-3a28-410f-b4e8-642fc96c0a41","Type":"ContainerDied","Data":"a66d59ae4f6e0fd7f6478011586decaacdb75d16e95652c8b124c3d2d103271a"} Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.588973 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.601622 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.604850 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" podUID="42860e95-d4ae-438b-a858-a88e69058574" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: connect: connection refused" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.613757 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:33:54 crc kubenswrapper[4975]: E0318 
12:33:54.614534 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3af7093-3a28-410f-b4e8-642fc96c0a41" containerName="glance-httpd" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.614556 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3af7093-3a28-410f-b4e8-642fc96c0a41" containerName="glance-httpd" Mar 18 12:33:54 crc kubenswrapper[4975]: E0318 12:33:54.614609 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3af7093-3a28-410f-b4e8-642fc96c0a41" containerName="glance-log" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.614618 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3af7093-3a28-410f-b4e8-642fc96c0a41" containerName="glance-log" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.618256 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3af7093-3a28-410f-b4e8-642fc96c0a41" containerName="glance-log" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.618288 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3af7093-3a28-410f-b4e8-642fc96c0a41" containerName="glance-httpd" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.623735 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.626972 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.627961 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.631699 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.775508 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.775560 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn9rm\" (UniqueName: \"kubernetes.io/projected/71ec6dfe-9c62-4027-a59f-fc13c24dd809-kube-api-access-pn9rm\") pod \"glance-default-external-api-0\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.775633 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71ec6dfe-9c62-4027-a59f-fc13c24dd809-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.775652 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/71ec6dfe-9c62-4027-a59f-fc13c24dd809-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.775673 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/71ec6dfe-9c62-4027-a59f-fc13c24dd809-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.775718 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71ec6dfe-9c62-4027-a59f-fc13c24dd809-scripts\") pod \"glance-default-external-api-0\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.775740 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71ec6dfe-9c62-4027-a59f-fc13c24dd809-logs\") pod \"glance-default-external-api-0\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.775824 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71ec6dfe-9c62-4027-a59f-fc13c24dd809-config-data\") pod \"glance-default-external-api-0\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.877665 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/71ec6dfe-9c62-4027-a59f-fc13c24dd809-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.877711 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/71ec6dfe-9c62-4027-a59f-fc13c24dd809-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.877735 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/71ec6dfe-9c62-4027-a59f-fc13c24dd809-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.877781 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71ec6dfe-9c62-4027-a59f-fc13c24dd809-scripts\") pod \"glance-default-external-api-0\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.877805 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71ec6dfe-9c62-4027-a59f-fc13c24dd809-logs\") pod \"glance-default-external-api-0\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.877917 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71ec6dfe-9c62-4027-a59f-fc13c24dd809-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.877964 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.877986 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn9rm\" (UniqueName: \"kubernetes.io/projected/71ec6dfe-9c62-4027-a59f-fc13c24dd809-kube-api-access-pn9rm\") pod \"glance-default-external-api-0\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.878896 4975 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.880172 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/71ec6dfe-9c62-4027-a59f-fc13c24dd809-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.880288 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71ec6dfe-9c62-4027-a59f-fc13c24dd809-logs\") pod \"glance-default-external-api-0\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " 
pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.883184 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71ec6dfe-9c62-4027-a59f-fc13c24dd809-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.885143 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71ec6dfe-9c62-4027-a59f-fc13c24dd809-config-data\") pod \"glance-default-external-api-0\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.889014 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/71ec6dfe-9c62-4027-a59f-fc13c24dd809-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.895646 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn9rm\" (UniqueName: \"kubernetes.io/projected/71ec6dfe-9c62-4027-a59f-fc13c24dd809-kube-api-access-pn9rm\") pod \"glance-default-external-api-0\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.895745 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71ec6dfe-9c62-4027-a59f-fc13c24dd809-scripts\") pod \"glance-default-external-api-0\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: 
I0318 12:33:54.909634 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " pod="openstack/glance-default-external-api-0" Mar 18 12:33:54 crc kubenswrapper[4975]: I0318 12:33:54.950443 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:33:55 crc kubenswrapper[4975]: I0318 12:33:55.045330 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3af7093-3a28-410f-b4e8-642fc96c0a41" path="/var/lib/kubelet/pods/f3af7093-3a28-410f-b4e8-642fc96c0a41/volumes" Mar 18 12:33:57 crc kubenswrapper[4975]: E0318 12:33:57.722123 4975 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 18 12:33:57 crc kubenswrapper[4975]: E0318 12:33:57.722640 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nbh684h54dh5dh57bh664h556h668h578hdh658hf8h58fh56bh74h5cdh679h59h695h67h59fh65bh64dh94h6h5bh5f8h648hdbh58dh5h66bq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vmctd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-75b95c5765-8zkk2_openstack(ad41b5d2-6357-41a5-b934-01e7feb35657): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:33:57 crc kubenswrapper[4975]: E0318 
12:33:57.725903 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-75b95c5765-8zkk2" podUID="ad41b5d2-6357-41a5-b934-01e7feb35657" Mar 18 12:33:57 crc kubenswrapper[4975]: I0318 12:33:57.804440 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-n9cbk" Mar 18 12:33:57 crc kubenswrapper[4975]: I0318 12:33:57.943770 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-fernet-keys\") pod \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\" (UID: \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\") " Mar 18 12:33:57 crc kubenswrapper[4975]: I0318 12:33:57.943939 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-combined-ca-bundle\") pod \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\" (UID: \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\") " Mar 18 12:33:57 crc kubenswrapper[4975]: I0318 12:33:57.943987 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-credential-keys\") pod \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\" (UID: \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\") " Mar 18 12:33:57 crc kubenswrapper[4975]: I0318 12:33:57.944020 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-scripts\") pod 
\"142bd2ba-bbcb-4e91-9365-670839ea9f5a\" (UID: \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\") " Mar 18 12:33:57 crc kubenswrapper[4975]: I0318 12:33:57.944115 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-config-data\") pod \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\" (UID: \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\") " Mar 18 12:33:57 crc kubenswrapper[4975]: I0318 12:33:57.944146 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9qt5\" (UniqueName: \"kubernetes.io/projected/142bd2ba-bbcb-4e91-9365-670839ea9f5a-kube-api-access-c9qt5\") pod \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\" (UID: \"142bd2ba-bbcb-4e91-9365-670839ea9f5a\") " Mar 18 12:33:57 crc kubenswrapper[4975]: I0318 12:33:57.951124 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "142bd2ba-bbcb-4e91-9365-670839ea9f5a" (UID: "142bd2ba-bbcb-4e91-9365-670839ea9f5a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:57 crc kubenswrapper[4975]: I0318 12:33:57.956236 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "142bd2ba-bbcb-4e91-9365-670839ea9f5a" (UID: "142bd2ba-bbcb-4e91-9365-670839ea9f5a"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:57 crc kubenswrapper[4975]: I0318 12:33:57.956239 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-scripts" (OuterVolumeSpecName: "scripts") pod "142bd2ba-bbcb-4e91-9365-670839ea9f5a" (UID: "142bd2ba-bbcb-4e91-9365-670839ea9f5a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:57 crc kubenswrapper[4975]: I0318 12:33:57.957860 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/142bd2ba-bbcb-4e91-9365-670839ea9f5a-kube-api-access-c9qt5" (OuterVolumeSpecName: "kube-api-access-c9qt5") pod "142bd2ba-bbcb-4e91-9365-670839ea9f5a" (UID: "142bd2ba-bbcb-4e91-9365-670839ea9f5a"). InnerVolumeSpecName "kube-api-access-c9qt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:33:57 crc kubenswrapper[4975]: I0318 12:33:57.968735 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-config-data" (OuterVolumeSpecName: "config-data") pod "142bd2ba-bbcb-4e91-9365-670839ea9f5a" (UID: "142bd2ba-bbcb-4e91-9365-670839ea9f5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:57 crc kubenswrapper[4975]: I0318 12:33:57.969550 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "142bd2ba-bbcb-4e91-9365-670839ea9f5a" (UID: "142bd2ba-bbcb-4e91-9365-670839ea9f5a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:33:58 crc kubenswrapper[4975]: I0318 12:33:58.046582 4975 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:58 crc kubenswrapper[4975]: I0318 12:33:58.046621 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:58 crc kubenswrapper[4975]: I0318 12:33:58.046637 4975 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:58 crc kubenswrapper[4975]: I0318 12:33:58.046648 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:58 crc kubenswrapper[4975]: I0318 12:33:58.046657 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/142bd2ba-bbcb-4e91-9365-670839ea9f5a-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:58 crc kubenswrapper[4975]: I0318 12:33:58.046669 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9qt5\" (UniqueName: \"kubernetes.io/projected/142bd2ba-bbcb-4e91-9365-670839ea9f5a-kube-api-access-c9qt5\") on node \"crc\" DevicePath \"\"" Mar 18 12:33:58 crc kubenswrapper[4975]: I0318 12:33:58.551097 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-n9cbk" Mar 18 12:33:58 crc kubenswrapper[4975]: I0318 12:33:58.551082 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n9cbk" event={"ID":"142bd2ba-bbcb-4e91-9365-670839ea9f5a","Type":"ContainerDied","Data":"fcbbcd1c6a09cf797bc86a46831b977b029b666d71afade3183bd4595afcd59b"} Mar 18 12:33:58 crc kubenswrapper[4975]: I0318 12:33:58.551459 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcbbcd1c6a09cf797bc86a46831b977b029b666d71afade3183bd4595afcd59b" Mar 18 12:33:58 crc kubenswrapper[4975]: I0318 12:33:58.902836 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-n9cbk"] Mar 18 12:33:58 crc kubenswrapper[4975]: I0318 12:33:58.910129 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-n9cbk"] Mar 18 12:33:58 crc kubenswrapper[4975]: I0318 12:33:58.984056 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-wfp2q"] Mar 18 12:33:58 crc kubenswrapper[4975]: E0318 12:33:58.984527 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="142bd2ba-bbcb-4e91-9365-670839ea9f5a" containerName="keystone-bootstrap" Mar 18 12:33:58 crc kubenswrapper[4975]: I0318 12:33:58.984550 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="142bd2ba-bbcb-4e91-9365-670839ea9f5a" containerName="keystone-bootstrap" Mar 18 12:33:58 crc kubenswrapper[4975]: I0318 12:33:58.984748 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="142bd2ba-bbcb-4e91-9365-670839ea9f5a" containerName="keystone-bootstrap" Mar 18 12:33:58 crc kubenswrapper[4975]: I0318 12:33:58.985480 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-wfp2q" Mar 18 12:33:58 crc kubenswrapper[4975]: I0318 12:33:58.987780 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-d6c9r" Mar 18 12:33:58 crc kubenswrapper[4975]: I0318 12:33:58.987826 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 12:33:58 crc kubenswrapper[4975]: I0318 12:33:58.987847 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 12:33:58 crc kubenswrapper[4975]: I0318 12:33:58.989518 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 12:33:58 crc kubenswrapper[4975]: I0318 12:33:58.989723 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 12:33:58 crc kubenswrapper[4975]: I0318 12:33:58.993493 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wfp2q"] Mar 18 12:33:59 crc kubenswrapper[4975]: I0318 12:33:59.032445 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="142bd2ba-bbcb-4e91-9365-670839ea9f5a" path="/var/lib/kubelet/pods/142bd2ba-bbcb-4e91-9365-670839ea9f5a/volumes" Mar 18 12:33:59 crc kubenswrapper[4975]: I0318 12:33:59.171820 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-credential-keys\") pod \"keystone-bootstrap-wfp2q\" (UID: \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\") " pod="openstack/keystone-bootstrap-wfp2q" Mar 18 12:33:59 crc kubenswrapper[4975]: I0318 12:33:59.171907 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-combined-ca-bundle\") pod 
\"keystone-bootstrap-wfp2q\" (UID: \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\") " pod="openstack/keystone-bootstrap-wfp2q" Mar 18 12:33:59 crc kubenswrapper[4975]: I0318 12:33:59.171986 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpmgz\" (UniqueName: \"kubernetes.io/projected/3247b82c-5928-4a38-85aa-d37c7d8f6c21-kube-api-access-wpmgz\") pod \"keystone-bootstrap-wfp2q\" (UID: \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\") " pod="openstack/keystone-bootstrap-wfp2q" Mar 18 12:33:59 crc kubenswrapper[4975]: I0318 12:33:59.172043 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-config-data\") pod \"keystone-bootstrap-wfp2q\" (UID: \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\") " pod="openstack/keystone-bootstrap-wfp2q" Mar 18 12:33:59 crc kubenswrapper[4975]: I0318 12:33:59.172084 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-scripts\") pod \"keystone-bootstrap-wfp2q\" (UID: \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\") " pod="openstack/keystone-bootstrap-wfp2q" Mar 18 12:33:59 crc kubenswrapper[4975]: I0318 12:33:59.172288 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-fernet-keys\") pod \"keystone-bootstrap-wfp2q\" (UID: \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\") " pod="openstack/keystone-bootstrap-wfp2q" Mar 18 12:33:59 crc kubenswrapper[4975]: I0318 12:33:59.274114 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-fernet-keys\") pod \"keystone-bootstrap-wfp2q\" (UID: 
\"3247b82c-5928-4a38-85aa-d37c7d8f6c21\") " pod="openstack/keystone-bootstrap-wfp2q" Mar 18 12:33:59 crc kubenswrapper[4975]: I0318 12:33:59.274241 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-credential-keys\") pod \"keystone-bootstrap-wfp2q\" (UID: \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\") " pod="openstack/keystone-bootstrap-wfp2q" Mar 18 12:33:59 crc kubenswrapper[4975]: I0318 12:33:59.274267 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-combined-ca-bundle\") pod \"keystone-bootstrap-wfp2q\" (UID: \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\") " pod="openstack/keystone-bootstrap-wfp2q" Mar 18 12:33:59 crc kubenswrapper[4975]: I0318 12:33:59.274317 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpmgz\" (UniqueName: \"kubernetes.io/projected/3247b82c-5928-4a38-85aa-d37c7d8f6c21-kube-api-access-wpmgz\") pod \"keystone-bootstrap-wfp2q\" (UID: \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\") " pod="openstack/keystone-bootstrap-wfp2q" Mar 18 12:33:59 crc kubenswrapper[4975]: I0318 12:33:59.274374 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-config-data\") pod \"keystone-bootstrap-wfp2q\" (UID: \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\") " pod="openstack/keystone-bootstrap-wfp2q" Mar 18 12:33:59 crc kubenswrapper[4975]: I0318 12:33:59.274431 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-scripts\") pod \"keystone-bootstrap-wfp2q\" (UID: \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\") " pod="openstack/keystone-bootstrap-wfp2q" Mar 18 12:33:59 
crc kubenswrapper[4975]: I0318 12:33:59.280010 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-combined-ca-bundle\") pod \"keystone-bootstrap-wfp2q\" (UID: \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\") " pod="openstack/keystone-bootstrap-wfp2q" Mar 18 12:33:59 crc kubenswrapper[4975]: I0318 12:33:59.280611 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-credential-keys\") pod \"keystone-bootstrap-wfp2q\" (UID: \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\") " pod="openstack/keystone-bootstrap-wfp2q" Mar 18 12:33:59 crc kubenswrapper[4975]: I0318 12:33:59.281587 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-config-data\") pod \"keystone-bootstrap-wfp2q\" (UID: \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\") " pod="openstack/keystone-bootstrap-wfp2q" Mar 18 12:33:59 crc kubenswrapper[4975]: I0318 12:33:59.281928 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-fernet-keys\") pod \"keystone-bootstrap-wfp2q\" (UID: \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\") " pod="openstack/keystone-bootstrap-wfp2q" Mar 18 12:33:59 crc kubenswrapper[4975]: I0318 12:33:59.285310 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-scripts\") pod \"keystone-bootstrap-wfp2q\" (UID: \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\") " pod="openstack/keystone-bootstrap-wfp2q" Mar 18 12:33:59 crc kubenswrapper[4975]: I0318 12:33:59.297653 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpmgz\" (UniqueName: 
\"kubernetes.io/projected/3247b82c-5928-4a38-85aa-d37c7d8f6c21-kube-api-access-wpmgz\") pod \"keystone-bootstrap-wfp2q\" (UID: \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\") " pod="openstack/keystone-bootstrap-wfp2q" Mar 18 12:33:59 crc kubenswrapper[4975]: I0318 12:33:59.305147 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wfp2q" Mar 18 12:33:59 crc kubenswrapper[4975]: I0318 12:33:59.604728 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" podUID="42860e95-d4ae-438b-a858-a88e69058574" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: connect: connection refused" Mar 18 12:33:59 crc kubenswrapper[4975]: I0318 12:33:59.604853 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" Mar 18 12:34:00 crc kubenswrapper[4975]: I0318 12:34:00.138778 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563954-6tpj8"] Mar 18 12:34:00 crc kubenswrapper[4975]: I0318 12:34:00.140378 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563954-6tpj8" Mar 18 12:34:00 crc kubenswrapper[4975]: I0318 12:34:00.143293 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:34:00 crc kubenswrapper[4975]: I0318 12:34:00.143501 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 12:34:00 crc kubenswrapper[4975]: I0318 12:34:00.144460 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:34:00 crc kubenswrapper[4975]: I0318 12:34:00.148964 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563954-6tpj8"] Mar 18 12:34:00 crc kubenswrapper[4975]: I0318 12:34:00.291972 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms6rm\" (UniqueName: \"kubernetes.io/projected/e746cdd9-ae57-4577-a9c5-eafc0aa28c09-kube-api-access-ms6rm\") pod \"auto-csr-approver-29563954-6tpj8\" (UID: \"e746cdd9-ae57-4577-a9c5-eafc0aa28c09\") " pod="openshift-infra/auto-csr-approver-29563954-6tpj8" Mar 18 12:34:00 crc kubenswrapper[4975]: I0318 12:34:00.394074 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms6rm\" (UniqueName: \"kubernetes.io/projected/e746cdd9-ae57-4577-a9c5-eafc0aa28c09-kube-api-access-ms6rm\") pod \"auto-csr-approver-29563954-6tpj8\" (UID: \"e746cdd9-ae57-4577-a9c5-eafc0aa28c09\") " pod="openshift-infra/auto-csr-approver-29563954-6tpj8" Mar 18 12:34:00 crc kubenswrapper[4975]: I0318 12:34:00.419778 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms6rm\" (UniqueName: \"kubernetes.io/projected/e746cdd9-ae57-4577-a9c5-eafc0aa28c09-kube-api-access-ms6rm\") pod \"auto-csr-approver-29563954-6tpj8\" (UID: \"e746cdd9-ae57-4577-a9c5-eafc0aa28c09\") " 
pod="openshift-infra/auto-csr-approver-29563954-6tpj8" Mar 18 12:34:00 crc kubenswrapper[4975]: I0318 12:34:00.475150 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563954-6tpj8" Mar 18 12:34:02 crc kubenswrapper[4975]: I0318 12:34:02.581573 4975 generic.go:334] "Generic (PLEG): container finished" podID="f85a1328-170d-4be7-8115-36d400fd7645" containerID="fcf9119d0dd51cd281c1f3e4788c2924109567cae2489841d9b989aeb57f0389" exitCode=0 Mar 18 12:34:02 crc kubenswrapper[4975]: I0318 12:34:02.581920 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9bbxb" event={"ID":"f85a1328-170d-4be7-8115-36d400fd7645","Type":"ContainerDied","Data":"fcf9119d0dd51cd281c1f3e4788c2924109567cae2489841d9b989aeb57f0389"} Mar 18 12:34:04 crc kubenswrapper[4975]: I0318 12:34:04.605047 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" podUID="42860e95-d4ae-438b-a858-a88e69058574" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: connect: connection refused" Mar 18 12:34:06 crc kubenswrapper[4975]: I0318 12:34:06.249885 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8f5b85cf7-qxcrm" Mar 18 12:34:06 crc kubenswrapper[4975]: I0318 12:34:06.415350 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-logs\") pod \"0e23cd99-78eb-4ab1-bfae-f436a8db17a6\" (UID: \"0e23cd99-78eb-4ab1-bfae-f436a8db17a6\") " Mar 18 12:34:06 crc kubenswrapper[4975]: I0318 12:34:06.415441 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-scripts\") pod \"0e23cd99-78eb-4ab1-bfae-f436a8db17a6\" (UID: \"0e23cd99-78eb-4ab1-bfae-f436a8db17a6\") " Mar 18 12:34:06 crc kubenswrapper[4975]: I0318 12:34:06.415520 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-horizon-secret-key\") pod \"0e23cd99-78eb-4ab1-bfae-f436a8db17a6\" (UID: \"0e23cd99-78eb-4ab1-bfae-f436a8db17a6\") " Mar 18 12:34:06 crc kubenswrapper[4975]: I0318 12:34:06.415652 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-config-data\") pod \"0e23cd99-78eb-4ab1-bfae-f436a8db17a6\" (UID: \"0e23cd99-78eb-4ab1-bfae-f436a8db17a6\") " Mar 18 12:34:06 crc kubenswrapper[4975]: I0318 12:34:06.415694 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdckf\" (UniqueName: \"kubernetes.io/projected/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-kube-api-access-qdckf\") pod \"0e23cd99-78eb-4ab1-bfae-f436a8db17a6\" (UID: \"0e23cd99-78eb-4ab1-bfae-f436a8db17a6\") " Mar 18 12:34:06 crc kubenswrapper[4975]: I0318 12:34:06.415789 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-logs" (OuterVolumeSpecName: "logs") pod "0e23cd99-78eb-4ab1-bfae-f436a8db17a6" (UID: "0e23cd99-78eb-4ab1-bfae-f436a8db17a6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:34:06 crc kubenswrapper[4975]: I0318 12:34:06.416119 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-scripts" (OuterVolumeSpecName: "scripts") pod "0e23cd99-78eb-4ab1-bfae-f436a8db17a6" (UID: "0e23cd99-78eb-4ab1-bfae-f436a8db17a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:34:06 crc kubenswrapper[4975]: I0318 12:34:06.416129 4975 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:06 crc kubenswrapper[4975]: I0318 12:34:06.416309 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-config-data" (OuterVolumeSpecName: "config-data") pod "0e23cd99-78eb-4ab1-bfae-f436a8db17a6" (UID: "0e23cd99-78eb-4ab1-bfae-f436a8db17a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:34:06 crc kubenswrapper[4975]: I0318 12:34:06.423429 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-kube-api-access-qdckf" (OuterVolumeSpecName: "kube-api-access-qdckf") pod "0e23cd99-78eb-4ab1-bfae-f436a8db17a6" (UID: "0e23cd99-78eb-4ab1-bfae-f436a8db17a6"). InnerVolumeSpecName "kube-api-access-qdckf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:06 crc kubenswrapper[4975]: I0318 12:34:06.423547 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "0e23cd99-78eb-4ab1-bfae-f436a8db17a6" (UID: "0e23cd99-78eb-4ab1-bfae-f436a8db17a6"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:06 crc kubenswrapper[4975]: I0318 12:34:06.517285 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdckf\" (UniqueName: \"kubernetes.io/projected/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-kube-api-access-qdckf\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:06 crc kubenswrapper[4975]: I0318 12:34:06.517329 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:06 crc kubenswrapper[4975]: I0318 12:34:06.517340 4975 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:06 crc kubenswrapper[4975]: I0318 12:34:06.517348 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0e23cd99-78eb-4ab1-bfae-f436a8db17a6-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:06 crc kubenswrapper[4975]: I0318 12:34:06.613414 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8f5b85cf7-qxcrm" event={"ID":"0e23cd99-78eb-4ab1-bfae-f436a8db17a6","Type":"ContainerDied","Data":"37a44b1a0bcbc3ddf9a35bb0d1e6ffaaaf040b1dacd72515d009cc7573a9814f"} Mar 18 12:34:06 crc kubenswrapper[4975]: I0318 12:34:06.613633 4975 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/horizon-8f5b85cf7-qxcrm" Mar 18 12:34:06 crc kubenswrapper[4975]: I0318 12:34:06.677436 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8f5b85cf7-qxcrm"] Mar 18 12:34:06 crc kubenswrapper[4975]: I0318 12:34:06.687450 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8f5b85cf7-qxcrm"] Mar 18 12:34:06 crc kubenswrapper[4975]: E0318 12:34:06.749113 4975 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 18 12:34:06 crc kubenswrapper[4975]: E0318 12:34:06.749321 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cmps4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD]
,},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-psw5k_openstack(4f054ebc-151c-4e89-8242-3837c9bee6b2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:34:06 crc kubenswrapper[4975]: E0318 12:34:06.750486 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-psw5k" podUID="4f054ebc-151c-4e89-8242-3837c9bee6b2" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.028417 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e23cd99-78eb-4ab1-bfae-f436a8db17a6" path="/var/lib/kubelet/pods/0e23cd99-78eb-4ab1-bfae-f436a8db17a6/volumes" Mar 18 12:34:07 crc kubenswrapper[4975]: E0318 12:34:07.079800 4975 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 18 12:34:07 crc kubenswrapper[4975]: E0318 12:34:07.079977 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n67hc8hbbh694h674h599h5dbh6ch58bh68fh655h5cdh5bdh9bh5h5ch87h97h5c5h584h8dh67bh97h5f6h598hch5c8h8fh5c6h57dh87h5dq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nhb2m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(b8ef7f51-157e-4b7c-95ce-7d8655c5c78d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.093377 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-9bbxb" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.097643 4975 scope.go:117] "RemoveContainer" containerID="610ccf84df225541365395355df7c566eae1507467e56e2b20ab0f4816daa0a3" Mar 18 12:34:07 crc kubenswrapper[4975]: E0318 12:34:07.098153 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"610ccf84df225541365395355df7c566eae1507467e56e2b20ab0f4816daa0a3\": container with ID starting with 610ccf84df225541365395355df7c566eae1507467e56e2b20ab0f4816daa0a3 not found: ID does not exist" containerID="610ccf84df225541365395355df7c566eae1507467e56e2b20ab0f4816daa0a3" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.098183 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"610ccf84df225541365395355df7c566eae1507467e56e2b20ab0f4816daa0a3"} err="failed to get container status \"610ccf84df225541365395355df7c566eae1507467e56e2b20ab0f4816daa0a3\": rpc error: code = NotFound desc = could not find container \"610ccf84df225541365395355df7c566eae1507467e56e2b20ab0f4816daa0a3\": container with ID starting with 610ccf84df225541365395355df7c566eae1507467e56e2b20ab0f4816daa0a3 not found: ID does not exist" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.098242 4975 scope.go:117] "RemoveContainer" containerID="b4bb8fc806ce512e185411477c1a30f260544e70d20fbf8f0daeab872a410251" Mar 18 12:34:07 crc kubenswrapper[4975]: E0318 12:34:07.098629 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4bb8fc806ce512e185411477c1a30f260544e70d20fbf8f0daeab872a410251\": container with ID starting with b4bb8fc806ce512e185411477c1a30f260544e70d20fbf8f0daeab872a410251 not found: ID does not exist" containerID="b4bb8fc806ce512e185411477c1a30f260544e70d20fbf8f0daeab872a410251" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 
12:34:07.098675 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4bb8fc806ce512e185411477c1a30f260544e70d20fbf8f0daeab872a410251"} err="failed to get container status \"b4bb8fc806ce512e185411477c1a30f260544e70d20fbf8f0daeab872a410251\": rpc error: code = NotFound desc = could not find container \"b4bb8fc806ce512e185411477c1a30f260544e70d20fbf8f0daeab872a410251\": container with ID starting with b4bb8fc806ce512e185411477c1a30f260544e70d20fbf8f0daeab872a410251 not found: ID does not exist" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.098705 4975 scope.go:117] "RemoveContainer" containerID="610ccf84df225541365395355df7c566eae1507467e56e2b20ab0f4816daa0a3" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.099276 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"610ccf84df225541365395355df7c566eae1507467e56e2b20ab0f4816daa0a3"} err="failed to get container status \"610ccf84df225541365395355df7c566eae1507467e56e2b20ab0f4816daa0a3\": rpc error: code = NotFound desc = could not find container \"610ccf84df225541365395355df7c566eae1507467e56e2b20ab0f4816daa0a3\": container with ID starting with 610ccf84df225541365395355df7c566eae1507467e56e2b20ab0f4816daa0a3 not found: ID does not exist" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.099301 4975 scope.go:117] "RemoveContainer" containerID="b4bb8fc806ce512e185411477c1a30f260544e70d20fbf8f0daeab872a410251" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.099626 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-75b95c5765-8zkk2" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.100241 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4bb8fc806ce512e185411477c1a30f260544e70d20fbf8f0daeab872a410251"} err="failed to get container status \"b4bb8fc806ce512e185411477c1a30f260544e70d20fbf8f0daeab872a410251\": rpc error: code = NotFound desc = could not find container \"b4bb8fc806ce512e185411477c1a30f260544e70d20fbf8f0daeab872a410251\": container with ID starting with b4bb8fc806ce512e185411477c1a30f260544e70d20fbf8f0daeab872a410251 not found: ID does not exist" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.100266 4975 scope.go:117] "RemoveContainer" containerID="c6e581b84678d1d84f8f38afc1e35b4699b3cfe527e77639af4172bc9a98254e" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.228321 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ad41b5d2-6357-41a5-b934-01e7feb35657-horizon-secret-key\") pod \"ad41b5d2-6357-41a5-b934-01e7feb35657\" (UID: \"ad41b5d2-6357-41a5-b934-01e7feb35657\") " Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.228547 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad41b5d2-6357-41a5-b934-01e7feb35657-logs\") pod \"ad41b5d2-6357-41a5-b934-01e7feb35657\" (UID: \"ad41b5d2-6357-41a5-b934-01e7feb35657\") " Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.228598 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad41b5d2-6357-41a5-b934-01e7feb35657-config-data\") pod \"ad41b5d2-6357-41a5-b934-01e7feb35657\" (UID: \"ad41b5d2-6357-41a5-b934-01e7feb35657\") " Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.228621 4975 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-vmctd\" (UniqueName: \"kubernetes.io/projected/ad41b5d2-6357-41a5-b934-01e7feb35657-kube-api-access-vmctd\") pod \"ad41b5d2-6357-41a5-b934-01e7feb35657\" (UID: \"ad41b5d2-6357-41a5-b934-01e7feb35657\") " Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.228687 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad41b5d2-6357-41a5-b934-01e7feb35657-scripts\") pod \"ad41b5d2-6357-41a5-b934-01e7feb35657\" (UID: \"ad41b5d2-6357-41a5-b934-01e7feb35657\") " Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.228798 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f85a1328-170d-4be7-8115-36d400fd7645-config\") pod \"f85a1328-170d-4be7-8115-36d400fd7645\" (UID: \"f85a1328-170d-4be7-8115-36d400fd7645\") " Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.228841 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgtlc\" (UniqueName: \"kubernetes.io/projected/f85a1328-170d-4be7-8115-36d400fd7645-kube-api-access-pgtlc\") pod \"f85a1328-170d-4be7-8115-36d400fd7645\" (UID: \"f85a1328-170d-4be7-8115-36d400fd7645\") " Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.228876 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85a1328-170d-4be7-8115-36d400fd7645-combined-ca-bundle\") pod \"f85a1328-170d-4be7-8115-36d400fd7645\" (UID: \"f85a1328-170d-4be7-8115-36d400fd7645\") " Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.229488 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad41b5d2-6357-41a5-b934-01e7feb35657-scripts" (OuterVolumeSpecName: "scripts") pod "ad41b5d2-6357-41a5-b934-01e7feb35657" (UID: 
"ad41b5d2-6357-41a5-b934-01e7feb35657"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.229629 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad41b5d2-6357-41a5-b934-01e7feb35657-logs" (OuterVolumeSpecName: "logs") pod "ad41b5d2-6357-41a5-b934-01e7feb35657" (UID: "ad41b5d2-6357-41a5-b934-01e7feb35657"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.229638 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad41b5d2-6357-41a5-b934-01e7feb35657-config-data" (OuterVolumeSpecName: "config-data") pod "ad41b5d2-6357-41a5-b934-01e7feb35657" (UID: "ad41b5d2-6357-41a5-b934-01e7feb35657"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.233076 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad41b5d2-6357-41a5-b934-01e7feb35657-kube-api-access-vmctd" (OuterVolumeSpecName: "kube-api-access-vmctd") pod "ad41b5d2-6357-41a5-b934-01e7feb35657" (UID: "ad41b5d2-6357-41a5-b934-01e7feb35657"). InnerVolumeSpecName "kube-api-access-vmctd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.234260 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad41b5d2-6357-41a5-b934-01e7feb35657-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ad41b5d2-6357-41a5-b934-01e7feb35657" (UID: "ad41b5d2-6357-41a5-b934-01e7feb35657"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.253073 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f85a1328-170d-4be7-8115-36d400fd7645-kube-api-access-pgtlc" (OuterVolumeSpecName: "kube-api-access-pgtlc") pod "f85a1328-170d-4be7-8115-36d400fd7645" (UID: "f85a1328-170d-4be7-8115-36d400fd7645"). InnerVolumeSpecName "kube-api-access-pgtlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.253075 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f85a1328-170d-4be7-8115-36d400fd7645-config" (OuterVolumeSpecName: "config") pod "f85a1328-170d-4be7-8115-36d400fd7645" (UID: "f85a1328-170d-4be7-8115-36d400fd7645"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.254831 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f85a1328-170d-4be7-8115-36d400fd7645-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f85a1328-170d-4be7-8115-36d400fd7645" (UID: "f85a1328-170d-4be7-8115-36d400fd7645"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.331191 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f85a1328-170d-4be7-8115-36d400fd7645-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.331222 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgtlc\" (UniqueName: \"kubernetes.io/projected/f85a1328-170d-4be7-8115-36d400fd7645-kube-api-access-pgtlc\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.331236 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85a1328-170d-4be7-8115-36d400fd7645-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.331245 4975 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ad41b5d2-6357-41a5-b934-01e7feb35657-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.331254 4975 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad41b5d2-6357-41a5-b934-01e7feb35657-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.331263 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad41b5d2-6357-41a5-b934-01e7feb35657-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.331272 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmctd\" (UniqueName: \"kubernetes.io/projected/ad41b5d2-6357-41a5-b934-01e7feb35657-kube-api-access-vmctd\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.331280 4975 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad41b5d2-6357-41a5-b934-01e7feb35657-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.623787 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75b95c5765-8zkk2" event={"ID":"ad41b5d2-6357-41a5-b934-01e7feb35657","Type":"ContainerDied","Data":"7889ad01eb67f16962225003c3e87f1bb46376f1e02671f303988749dbecc654"} Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.623813 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75b95c5765-8zkk2" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.625453 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9bbxb" event={"ID":"f85a1328-170d-4be7-8115-36d400fd7645","Type":"ContainerDied","Data":"2c4f2901057d484d6b93900fac83bb316e9055d02a839f4ada73abe4fe933af2"} Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.625486 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c4f2901057d484d6b93900fac83bb316e9055d02a839f4ada73abe4fe933af2" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.625514 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-9bbxb" Mar 18 12:34:07 crc kubenswrapper[4975]: E0318 12:34:07.627416 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-psw5k" podUID="4f054ebc-151c-4e89-8242-3837c9bee6b2" Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.698683 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-75b95c5765-8zkk2"] Mar 18 12:34:07 crc kubenswrapper[4975]: I0318 12:34:07.706792 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-75b95c5765-8zkk2"] Mar 18 12:34:08 crc kubenswrapper[4975]: E0318 12:34:08.350940 4975 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 18 12:34:08 crc kubenswrapper[4975]: E0318 12:34:08.351336 4975 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-64g7s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-m6lmd_openstack(94f94b61-6738-4ab8-a65f-0d6cf4d86be1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 12:34:08 crc kubenswrapper[4975]: E0318 12:34:08.360406 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-m6lmd" podUID="94f94b61-6738-4ab8-a65f-0d6cf4d86be1" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.363415 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-h4v54"] Mar 18 12:34:08 crc kubenswrapper[4975]: E0318 12:34:08.368281 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85a1328-170d-4be7-8115-36d400fd7645" containerName="neutron-db-sync" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.368307 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85a1328-170d-4be7-8115-36d400fd7645" containerName="neutron-db-sync" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.368522 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85a1328-170d-4be7-8115-36d400fd7645" containerName="neutron-db-sync" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.369415 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-h4v54" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.390072 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-h4v54"] Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.460057 4975 scope.go:117] "RemoveContainer" containerID="91c4bf6bd778649112c78b84401fddd89b958ac4432ed61b3d99fcc5aa0c6ce3" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.520280 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5899b99ff6-cwt84"] Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.523086 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5899b99ff6-cwt84" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.526373 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.526906 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.527559 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.527770 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-sstlq" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.547166 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.575000 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-h4v54\" (UID: \"0b6f390f-605a-4717-afea-39913c57679d\") " pod="openstack/dnsmasq-dns-55f844cf75-h4v54" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.575097 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-h4v54\" (UID: \"0b6f390f-605a-4717-afea-39913c57679d\") " pod="openstack/dnsmasq-dns-55f844cf75-h4v54" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.575138 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-dns-svc\") pod \"dnsmasq-dns-55f844cf75-h4v54\" (UID: \"0b6f390f-605a-4717-afea-39913c57679d\") " pod="openstack/dnsmasq-dns-55f844cf75-h4v54" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.575204 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-h4v54\" (UID: \"0b6f390f-605a-4717-afea-39913c57679d\") " pod="openstack/dnsmasq-dns-55f844cf75-h4v54" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.575238 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-config\") pod \"dnsmasq-dns-55f844cf75-h4v54\" 
(UID: \"0b6f390f-605a-4717-afea-39913c57679d\") " pod="openstack/dnsmasq-dns-55f844cf75-h4v54" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.575295 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kgfs\" (UniqueName: \"kubernetes.io/projected/0b6f390f-605a-4717-afea-39913c57679d-kube-api-access-7kgfs\") pod \"dnsmasq-dns-55f844cf75-h4v54\" (UID: \"0b6f390f-605a-4717-afea-39913c57679d\") " pod="openstack/dnsmasq-dns-55f844cf75-h4v54" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.598300 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5899b99ff6-cwt84"] Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.675647 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-dns-swift-storage-0\") pod \"42860e95-d4ae-438b-a858-a88e69058574\" (UID: \"42860e95-d4ae-438b-a858-a88e69058574\") " Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.675990 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-dns-svc\") pod \"42860e95-d4ae-438b-a858-a88e69058574\" (UID: \"42860e95-d4ae-438b-a858-a88e69058574\") " Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.676755 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-ovsdbserver-nb\") pod \"42860e95-d4ae-438b-a858-a88e69058574\" (UID: \"42860e95-d4ae-438b-a858-a88e69058574\") " Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.677086 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-687lg\" (UniqueName: 
\"kubernetes.io/projected/42860e95-d4ae-438b-a858-a88e69058574-kube-api-access-687lg\") pod \"42860e95-d4ae-438b-a858-a88e69058574\" (UID: \"42860e95-d4ae-438b-a858-a88e69058574\") " Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.677164 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-config\") pod \"42860e95-d4ae-438b-a858-a88e69058574\" (UID: \"42860e95-d4ae-438b-a858-a88e69058574\") " Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.677244 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-ovsdbserver-sb\") pod \"42860e95-d4ae-438b-a858-a88e69058574\" (UID: \"42860e95-d4ae-438b-a858-a88e69058574\") " Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.678022 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9n6h\" (UniqueName: \"kubernetes.io/projected/48f1daf8-3604-40e4-9e41-e9025c083c7d-kube-api-access-t9n6h\") pod \"neutron-5899b99ff6-cwt84\" (UID: \"48f1daf8-3604-40e4-9e41-e9025c083c7d\") " pod="openstack/neutron-5899b99ff6-cwt84" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.678198 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-h4v54\" (UID: \"0b6f390f-605a-4717-afea-39913c57679d\") " pod="openstack/dnsmasq-dns-55f844cf75-h4v54" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.678349 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-h4v54\" (UID: 
\"0b6f390f-605a-4717-afea-39913c57679d\") " pod="openstack/dnsmasq-dns-55f844cf75-h4v54" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.678387 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/48f1daf8-3604-40e4-9e41-e9025c083c7d-httpd-config\") pod \"neutron-5899b99ff6-cwt84\" (UID: \"48f1daf8-3604-40e4-9e41-e9025c083c7d\") " pod="openstack/neutron-5899b99ff6-cwt84" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.678462 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-dns-svc\") pod \"dnsmasq-dns-55f844cf75-h4v54\" (UID: \"0b6f390f-605a-4717-afea-39913c57679d\") " pod="openstack/dnsmasq-dns-55f844cf75-h4v54" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.678583 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/48f1daf8-3604-40e4-9e41-e9025c083c7d-config\") pod \"neutron-5899b99ff6-cwt84\" (UID: \"48f1daf8-3604-40e4-9e41-e9025c083c7d\") " pod="openstack/neutron-5899b99ff6-cwt84" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.678607 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-h4v54\" (UID: \"0b6f390f-605a-4717-afea-39913c57679d\") " pod="openstack/dnsmasq-dns-55f844cf75-h4v54" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.678683 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-config\") pod \"dnsmasq-dns-55f844cf75-h4v54\" (UID: \"0b6f390f-605a-4717-afea-39913c57679d\") " 
pod="openstack/dnsmasq-dns-55f844cf75-h4v54" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.679978 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-h4v54\" (UID: \"0b6f390f-605a-4717-afea-39913c57679d\") " pod="openstack/dnsmasq-dns-55f844cf75-h4v54" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.680658 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-config\") pod \"dnsmasq-dns-55f844cf75-h4v54\" (UID: \"0b6f390f-605a-4717-afea-39913c57679d\") " pod="openstack/dnsmasq-dns-55f844cf75-h4v54" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.683218 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-h4v54\" (UID: \"0b6f390f-605a-4717-afea-39913c57679d\") " pod="openstack/dnsmasq-dns-55f844cf75-h4v54" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.683403 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f1daf8-3604-40e4-9e41-e9025c083c7d-ovndb-tls-certs\") pod \"neutron-5899b99ff6-cwt84\" (UID: \"48f1daf8-3604-40e4-9e41-e9025c083c7d\") " pod="openstack/neutron-5899b99ff6-cwt84" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.683555 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kgfs\" (UniqueName: \"kubernetes.io/projected/0b6f390f-605a-4717-afea-39913c57679d-kube-api-access-7kgfs\") pod \"dnsmasq-dns-55f844cf75-h4v54\" (UID: \"0b6f390f-605a-4717-afea-39913c57679d\") " pod="openstack/dnsmasq-dns-55f844cf75-h4v54" Mar 18 12:34:08 crc 
kubenswrapper[4975]: I0318 12:34:08.683582 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f1daf8-3604-40e4-9e41-e9025c083c7d-combined-ca-bundle\") pod \"neutron-5899b99ff6-cwt84\" (UID: \"48f1daf8-3604-40e4-9e41-e9025c083c7d\") " pod="openstack/neutron-5899b99ff6-cwt84" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.686699 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42860e95-d4ae-438b-a858-a88e69058574-kube-api-access-687lg" (OuterVolumeSpecName: "kube-api-access-687lg") pod "42860e95-d4ae-438b-a858-a88e69058574" (UID: "42860e95-d4ae-438b-a858-a88e69058574"). InnerVolumeSpecName "kube-api-access-687lg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.691520 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-h4v54\" (UID: \"0b6f390f-605a-4717-afea-39913c57679d\") " pod="openstack/dnsmasq-dns-55f844cf75-h4v54" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.697701 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.697955 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-8wr26" event={"ID":"42860e95-d4ae-438b-a858-a88e69058574","Type":"ContainerDied","Data":"493e45eea83cd75bf3afb807e619c71f7d80307f6ffd452c8d71fdc7c4f675bf"} Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.698073 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-dns-svc\") pod \"dnsmasq-dns-55f844cf75-h4v54\" (UID: \"0b6f390f-605a-4717-afea-39913c57679d\") " pod="openstack/dnsmasq-dns-55f844cf75-h4v54" Mar 18 12:34:08 crc kubenswrapper[4975]: E0318 12:34:08.744324 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-m6lmd" podUID="94f94b61-6738-4ab8-a65f-0d6cf4d86be1" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.745526 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-config" (OuterVolumeSpecName: "config") pod "42860e95-d4ae-438b-a858-a88e69058574" (UID: "42860e95-d4ae-438b-a858-a88e69058574"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.749619 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kgfs\" (UniqueName: \"kubernetes.io/projected/0b6f390f-605a-4717-afea-39913c57679d-kube-api-access-7kgfs\") pod \"dnsmasq-dns-55f844cf75-h4v54\" (UID: \"0b6f390f-605a-4717-afea-39913c57679d\") " pod="openstack/dnsmasq-dns-55f844cf75-h4v54" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.785724 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f1daf8-3604-40e4-9e41-e9025c083c7d-ovndb-tls-certs\") pod \"neutron-5899b99ff6-cwt84\" (UID: \"48f1daf8-3604-40e4-9e41-e9025c083c7d\") " pod="openstack/neutron-5899b99ff6-cwt84" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.785769 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f1daf8-3604-40e4-9e41-e9025c083c7d-combined-ca-bundle\") pod \"neutron-5899b99ff6-cwt84\" (UID: \"48f1daf8-3604-40e4-9e41-e9025c083c7d\") " pod="openstack/neutron-5899b99ff6-cwt84" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.785802 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9n6h\" (UniqueName: \"kubernetes.io/projected/48f1daf8-3604-40e4-9e41-e9025c083c7d-kube-api-access-t9n6h\") pod \"neutron-5899b99ff6-cwt84\" (UID: \"48f1daf8-3604-40e4-9e41-e9025c083c7d\") " pod="openstack/neutron-5899b99ff6-cwt84" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.785894 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/48f1daf8-3604-40e4-9e41-e9025c083c7d-httpd-config\") pod \"neutron-5899b99ff6-cwt84\" (UID: \"48f1daf8-3604-40e4-9e41-e9025c083c7d\") " pod="openstack/neutron-5899b99ff6-cwt84" Mar 18 
12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.785948 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/48f1daf8-3604-40e4-9e41-e9025c083c7d-config\") pod \"neutron-5899b99ff6-cwt84\" (UID: \"48f1daf8-3604-40e4-9e41-e9025c083c7d\") " pod="openstack/neutron-5899b99ff6-cwt84" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.786000 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.786012 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-687lg\" (UniqueName: \"kubernetes.io/projected/42860e95-d4ae-438b-a858-a88e69058574-kube-api-access-687lg\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.799576 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/48f1daf8-3604-40e4-9e41-e9025c083c7d-httpd-config\") pod \"neutron-5899b99ff6-cwt84\" (UID: \"48f1daf8-3604-40e4-9e41-e9025c083c7d\") " pod="openstack/neutron-5899b99ff6-cwt84" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.808750 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "42860e95-d4ae-438b-a858-a88e69058574" (UID: "42860e95-d4ae-438b-a858-a88e69058574"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.809100 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "42860e95-d4ae-438b-a858-a88e69058574" (UID: "42860e95-d4ae-438b-a858-a88e69058574"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.809413 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "42860e95-d4ae-438b-a858-a88e69058574" (UID: "42860e95-d4ae-438b-a858-a88e69058574"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.810703 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f1daf8-3604-40e4-9e41-e9025c083c7d-ovndb-tls-certs\") pod \"neutron-5899b99ff6-cwt84\" (UID: \"48f1daf8-3604-40e4-9e41-e9025c083c7d\") " pod="openstack/neutron-5899b99ff6-cwt84" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.823533 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/48f1daf8-3604-40e4-9e41-e9025c083c7d-config\") pod \"neutron-5899b99ff6-cwt84\" (UID: \"48f1daf8-3604-40e4-9e41-e9025c083c7d\") " pod="openstack/neutron-5899b99ff6-cwt84" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.824582 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f1daf8-3604-40e4-9e41-e9025c083c7d-combined-ca-bundle\") pod \"neutron-5899b99ff6-cwt84\" (UID: 
\"48f1daf8-3604-40e4-9e41-e9025c083c7d\") " pod="openstack/neutron-5899b99ff6-cwt84" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.824622 4975 scope.go:117] "RemoveContainer" containerID="3520ffb61ef42e02db934b0e54b706f21081f7d640fc5d002d78e673365101bf" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.831935 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9n6h\" (UniqueName: \"kubernetes.io/projected/48f1daf8-3604-40e4-9e41-e9025c083c7d-kube-api-access-t9n6h\") pod \"neutron-5899b99ff6-cwt84\" (UID: \"48f1daf8-3604-40e4-9e41-e9025c083c7d\") " pod="openstack/neutron-5899b99ff6-cwt84" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.854380 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "42860e95-d4ae-438b-a858-a88e69058574" (UID: "42860e95-d4ae-438b-a858-a88e69058574"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.873233 4975 scope.go:117] "RemoveContainer" containerID="834759c41c13b167fbfb55bdc13c1d44699516e10324abbee15606cb67beb04d" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.878640 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-h4v54" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.896316 4975 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.896350 4975 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.896359 4975 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.896368 4975 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42860e95-d4ae-438b-a858-a88e69058574-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:08 crc kubenswrapper[4975]: I0318 12:34:08.975454 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5899b99ff6-cwt84" Mar 18 12:34:09 crc kubenswrapper[4975]: I0318 12:34:09.035698 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad41b5d2-6357-41a5-b934-01e7feb35657" path="/var/lib/kubelet/pods/ad41b5d2-6357-41a5-b934-01e7feb35657/volumes" Mar 18 12:34:09 crc kubenswrapper[4975]: I0318 12:34:09.076345 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-8wr26"] Mar 18 12:34:09 crc kubenswrapper[4975]: I0318 12:34:09.095187 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-8wr26"] Mar 18 12:34:09 crc kubenswrapper[4975]: I0318 12:34:09.122787 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-c478d4794-x2t7q"] Mar 18 12:34:09 crc kubenswrapper[4975]: W0318 12:34:09.135550 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff4782d5_d7a3_47ee_af16_ccef8ed4cdd7.slice/crio-d6be8c15f83546a47280b98be94796f59aef29b9de4dc0a10bfdb7458543f3a5 WatchSource:0}: Error finding container d6be8c15f83546a47280b98be94796f59aef29b9de4dc0a10bfdb7458543f3a5: Status 404 returned error can't find the container with id d6be8c15f83546a47280b98be94796f59aef29b9de4dc0a10bfdb7458543f3a5 Mar 18 12:34:09 crc kubenswrapper[4975]: I0318 12:34:09.457067 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:34:09 crc kubenswrapper[4975]: I0318 12:34:09.509589 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563954-6tpj8"] Mar 18 12:34:09 crc kubenswrapper[4975]: I0318 12:34:09.530305 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-56bcd48494-744wv"] Mar 18 12:34:09 crc kubenswrapper[4975]: I0318 12:34:09.559587 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-bootstrap-wfp2q"] Mar 18 12:34:09 crc kubenswrapper[4975]: I0318 12:34:09.626853 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-h4v54"] Mar 18 12:34:09 crc kubenswrapper[4975]: I0318 12:34:09.668172 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:34:09 crc kubenswrapper[4975]: I0318 12:34:09.724802 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f78496f49-fklmw" event={"ID":"06e8f21f-8218-4a64-a302-0f0b8193b9c8","Type":"ContainerStarted","Data":"77f2d72d2d7935b0002206b97a232de84e6d33a8a1d5a348b2bef72b0329527d"} Mar 18 12:34:09 crc kubenswrapper[4975]: I0318 12:34:09.724907 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f78496f49-fklmw" event={"ID":"06e8f21f-8218-4a64-a302-0f0b8193b9c8","Type":"ContainerStarted","Data":"36b15be87c7997046d29b661b13f902f84ae0523ac0c792ceb21949c58be4356"} Mar 18 12:34:09 crc kubenswrapper[4975]: I0318 12:34:09.724901 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-f78496f49-fklmw" podUID="06e8f21f-8218-4a64-a302-0f0b8193b9c8" containerName="horizon-log" containerID="cri-o://36b15be87c7997046d29b661b13f902f84ae0523ac0c792ceb21949c58be4356" gracePeriod=30 Mar 18 12:34:09 crc kubenswrapper[4975]: I0318 12:34:09.725048 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5899b99ff6-cwt84"] Mar 18 12:34:09 crc kubenswrapper[4975]: I0318 12:34:09.725046 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-f78496f49-fklmw" podUID="06e8f21f-8218-4a64-a302-0f0b8193b9c8" containerName="horizon" containerID="cri-o://77f2d72d2d7935b0002206b97a232de84e6d33a8a1d5a348b2bef72b0329527d" gracePeriod=30 Mar 18 12:34:09 crc kubenswrapper[4975]: I0318 12:34:09.730818 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-c478d4794-x2t7q" event={"ID":"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7","Type":"ContainerStarted","Data":"6b28ba84cab9eb6dd833f8a2c43c58ed4371a9c34e96c677b896512ab2e431d3"} Mar 18 12:34:09 crc kubenswrapper[4975]: I0318 12:34:09.730885 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c478d4794-x2t7q" event={"ID":"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7","Type":"ContainerStarted","Data":"d6be8c15f83546a47280b98be94796f59aef29b9de4dc0a10bfdb7458543f3a5"} Mar 18 12:34:09 crc kubenswrapper[4975]: I0318 12:34:09.737709 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hwtrk" event={"ID":"c2614112-7379-4588-a6dd-1cec6e3d96b4","Type":"ContainerStarted","Data":"be07b1fb2431f75c390014aa13497026e975f154b46b324a8a5e1a634690363f"} Mar 18 12:34:09 crc kubenswrapper[4975]: I0318 12:34:09.756814 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-f78496f49-fklmw" podStartSLOduration=3.7985799719999997 podStartE2EDuration="32.756791481s" podCreationTimestamp="2026-03-18 12:33:37 +0000 UTC" firstStartedPulling="2026-03-18 12:33:39.327517136 +0000 UTC m=+1405.041917715" lastFinishedPulling="2026-03-18 12:34:08.285728645 +0000 UTC m=+1434.000129224" observedRunningTime="2026-03-18 12:34:09.743764752 +0000 UTC m=+1435.458165331" watchObservedRunningTime="2026-03-18 12:34:09.756791481 +0000 UTC m=+1435.471192060" Mar 18 12:34:09 crc kubenswrapper[4975]: W0318 12:34:09.874920 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3247b82c_5928_4a38_85aa_d37c7d8f6c21.slice/crio-31ac43e79ad64382efae907ab4b56e6cfdece59822e25e48ede8957117065cdd WatchSource:0}: Error finding container 31ac43e79ad64382efae907ab4b56e6cfdece59822e25e48ede8957117065cdd: Status 404 returned error can't find the container with id 31ac43e79ad64382efae907ab4b56e6cfdece59822e25e48ede8957117065cdd Mar 18 12:34:09 
crc kubenswrapper[4975]: W0318 12:34:09.886672 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode746cdd9_ae57_4577_a9c5_eafc0aa28c09.slice/crio-b19ea2084605f91ba747315a36df8757d98f35c17b13009e9c6f3eafdc48c5f9 WatchSource:0}: Error finding container b19ea2084605f91ba747315a36df8757d98f35c17b13009e9c6f3eafdc48c5f9: Status 404 returned error can't find the container with id b19ea2084605f91ba747315a36df8757d98f35c17b13009e9c6f3eafdc48c5f9 Mar 18 12:34:09 crc kubenswrapper[4975]: W0318 12:34:09.902233 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48f1daf8_3604_40e4_9e41_e9025c083c7d.slice/crio-120cebf7665e1c95d7ad66a6bc91fbfc2be309382735de46895f2b5631a2b1dc WatchSource:0}: Error finding container 120cebf7665e1c95d7ad66a6bc91fbfc2be309382735de46895f2b5631a2b1dc: Status 404 returned error can't find the container with id 120cebf7665e1c95d7ad66a6bc91fbfc2be309382735de46895f2b5631a2b1dc Mar 18 12:34:09 crc kubenswrapper[4975]: W0318 12:34:09.902536 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71ec6dfe_9c62_4027_a59f_fc13c24dd809.slice/crio-99c3e047426f11ac9e988dcaa9e990f1cfea4b9eeccbe65ffe5de614d6a5a671 WatchSource:0}: Error finding container 99c3e047426f11ac9e988dcaa9e990f1cfea4b9eeccbe65ffe5de614d6a5a671: Status 404 returned error can't find the container with id 99c3e047426f11ac9e988dcaa9e990f1cfea4b9eeccbe65ffe5de614d6a5a671 Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.628259 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-hwtrk" podStartSLOduration=5.830572368 podStartE2EDuration="33.628237133s" podCreationTimestamp="2026-03-18 12:33:37 +0000 UTC" firstStartedPulling="2026-03-18 12:33:39.282725792 +0000 UTC m=+1404.997126371" 
lastFinishedPulling="2026-03-18 12:34:07.080390557 +0000 UTC m=+1432.794791136" observedRunningTime="2026-03-18 12:34:09.77019611 +0000 UTC m=+1435.484596699" watchObservedRunningTime="2026-03-18 12:34:10.628237133 +0000 UTC m=+1436.342637712" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.679151 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-645d765cf7-6vwpp"] Mar 18 12:34:10 crc kubenswrapper[4975]: E0318 12:34:10.679581 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42860e95-d4ae-438b-a858-a88e69058574" containerName="dnsmasq-dns" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.679598 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="42860e95-d4ae-438b-a858-a88e69058574" containerName="dnsmasq-dns" Mar 18 12:34:10 crc kubenswrapper[4975]: E0318 12:34:10.679629 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42860e95-d4ae-438b-a858-a88e69058574" containerName="init" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.679636 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="42860e95-d4ae-438b-a858-a88e69058574" containerName="init" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.679889 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="42860e95-d4ae-438b-a858-a88e69058574" containerName="dnsmasq-dns" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.680972 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-645d765cf7-6vwpp" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.691828 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.692163 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.720926 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-645d765cf7-6vwpp"] Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.795907 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"71ec6dfe-9c62-4027-a59f-fc13c24dd809","Type":"ContainerStarted","Data":"99c3e047426f11ac9e988dcaa9e990f1cfea4b9eeccbe65ffe5de614d6a5a671"} Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.799477 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5899b99ff6-cwt84" event={"ID":"48f1daf8-3604-40e4-9e41-e9025c083c7d","Type":"ContainerStarted","Data":"589ca94554a2c2a06045888f589afc811ca3abd6286fee87ebd2dc6e376739cf"} Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.799521 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5899b99ff6-cwt84" event={"ID":"48f1daf8-3604-40e4-9e41-e9025c083c7d","Type":"ContainerStarted","Data":"120cebf7665e1c95d7ad66a6bc91fbfc2be309382735de46895f2b5631a2b1dc"} Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.802177 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bfd49677-a36a-4e49-bdd9-4416f96c225e","Type":"ContainerStarted","Data":"9d4c8fcfebb7c5cb54476ee8b76598488b928761d25f2893d626b5d4693bd899"} Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.808477 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d","Type":"ContainerStarted","Data":"396b11744970cd6c2c092e95db41412f4b8ec80cdf0ce7bb8ef59f504a8da82f"} Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.813423 4975 generic.go:334] "Generic (PLEG): container finished" podID="0b6f390f-605a-4717-afea-39913c57679d" containerID="945e825484de7f95654c5f7f2dd28c8c859c5ee1613c98acee7057aef6450c91" exitCode=0 Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.813488 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-h4v54" event={"ID":"0b6f390f-605a-4717-afea-39913c57679d","Type":"ContainerDied","Data":"945e825484de7f95654c5f7f2dd28c8c859c5ee1613c98acee7057aef6450c91"} Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.813523 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-h4v54" event={"ID":"0b6f390f-605a-4717-afea-39913c57679d","Type":"ContainerStarted","Data":"0859eea3706a9df1edfc322cfd13aa96f3d26ddca921445a959f6b577f14621f"} Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.823601 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56bcd48494-744wv" event={"ID":"939992e3-94eb-4c98-a493-e30321c7f81a","Type":"ContainerStarted","Data":"18fc7aab9532a35db87599ed86d1d5ed76e91d3100ab1486b7f25d95d07a90df"} Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.823655 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56bcd48494-744wv" event={"ID":"939992e3-94eb-4c98-a493-e30321c7f81a","Type":"ContainerStarted","Data":"01eacfc75c8450371da5f0b5be104c8a0bab11ef61e355713072d9638602eac0"} Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.852748 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb7bf\" (UniqueName: \"kubernetes.io/projected/136388c3-08f6-404b-9a43-d68687d762c8-kube-api-access-bb7bf\") pod \"neutron-645d765cf7-6vwpp\" (UID: 
\"136388c3-08f6-404b-9a43-d68687d762c8\") " pod="openstack/neutron-645d765cf7-6vwpp" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.852844 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-internal-tls-certs\") pod \"neutron-645d765cf7-6vwpp\" (UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") " pod="openstack/neutron-645d765cf7-6vwpp" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.852918 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-httpd-config\") pod \"neutron-645d765cf7-6vwpp\" (UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") " pod="openstack/neutron-645d765cf7-6vwpp" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.852937 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-public-tls-certs\") pod \"neutron-645d765cf7-6vwpp\" (UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") " pod="openstack/neutron-645d765cf7-6vwpp" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.852977 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-ovndb-tls-certs\") pod \"neutron-645d765cf7-6vwpp\" (UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") " pod="openstack/neutron-645d765cf7-6vwpp" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.852995 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-combined-ca-bundle\") pod \"neutron-645d765cf7-6vwpp\" 
(UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") " pod="openstack/neutron-645d765cf7-6vwpp" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.853035 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-config\") pod \"neutron-645d765cf7-6vwpp\" (UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") " pod="openstack/neutron-645d765cf7-6vwpp" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.862330 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563954-6tpj8" event={"ID":"e746cdd9-ae57-4577-a9c5-eafc0aa28c09","Type":"ContainerStarted","Data":"b19ea2084605f91ba747315a36df8757d98f35c17b13009e9c6f3eafdc48c5f9"} Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.863976 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wfp2q" event={"ID":"3247b82c-5928-4a38-85aa-d37c7d8f6c21","Type":"ContainerStarted","Data":"ad817110e62a67f7d915322c53393627cec21e6daa015f6acf407587979c543e"} Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.864061 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wfp2q" event={"ID":"3247b82c-5928-4a38-85aa-d37c7d8f6c21","Type":"ContainerStarted","Data":"31ac43e79ad64382efae907ab4b56e6cfdece59822e25e48ede8957117065cdd"} Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.884191 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c478d4794-x2t7q" event={"ID":"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7","Type":"ContainerStarted","Data":"085c3bcf8fe1108084d797f0f69756fc886932b4a0540f844652adcd1deb2117"} Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.902825 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-wfp2q" podStartSLOduration=12.902809655 podStartE2EDuration="12.902809655s" 
podCreationTimestamp="2026-03-18 12:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:10.902568718 +0000 UTC m=+1436.616969297" watchObservedRunningTime="2026-03-18 12:34:10.902809655 +0000 UTC m=+1436.617210224" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.933268 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-c478d4794-x2t7q" podStartSLOduration=24.933250473 podStartE2EDuration="24.933250473s" podCreationTimestamp="2026-03-18 12:33:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:10.932428371 +0000 UTC m=+1436.646828950" watchObservedRunningTime="2026-03-18 12:34:10.933250473 +0000 UTC m=+1436.647651052" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.954818 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-httpd-config\") pod \"neutron-645d765cf7-6vwpp\" (UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") " pod="openstack/neutron-645d765cf7-6vwpp" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.954887 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-public-tls-certs\") pod \"neutron-645d765cf7-6vwpp\" (UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") " pod="openstack/neutron-645d765cf7-6vwpp" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.954975 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-ovndb-tls-certs\") pod \"neutron-645d765cf7-6vwpp\" (UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") " 
pod="openstack/neutron-645d765cf7-6vwpp" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.955003 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-combined-ca-bundle\") pod \"neutron-645d765cf7-6vwpp\" (UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") " pod="openstack/neutron-645d765cf7-6vwpp" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.955149 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-config\") pod \"neutron-645d765cf7-6vwpp\" (UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") " pod="openstack/neutron-645d765cf7-6vwpp" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.955327 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb7bf\" (UniqueName: \"kubernetes.io/projected/136388c3-08f6-404b-9a43-d68687d762c8-kube-api-access-bb7bf\") pod \"neutron-645d765cf7-6vwpp\" (UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") " pod="openstack/neutron-645d765cf7-6vwpp" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.955370 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-internal-tls-certs\") pod \"neutron-645d765cf7-6vwpp\" (UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") " pod="openstack/neutron-645d765cf7-6vwpp" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.961230 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-ovndb-tls-certs\") pod \"neutron-645d765cf7-6vwpp\" (UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") " pod="openstack/neutron-645d765cf7-6vwpp" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 
12:34:10.969543 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-public-tls-certs\") pod \"neutron-645d765cf7-6vwpp\" (UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") " pod="openstack/neutron-645d765cf7-6vwpp" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.969760 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-config\") pod \"neutron-645d765cf7-6vwpp\" (UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") " pod="openstack/neutron-645d765cf7-6vwpp" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.970771 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-combined-ca-bundle\") pod \"neutron-645d765cf7-6vwpp\" (UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") " pod="openstack/neutron-645d765cf7-6vwpp" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.973522 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-internal-tls-certs\") pod \"neutron-645d765cf7-6vwpp\" (UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") " pod="openstack/neutron-645d765cf7-6vwpp" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.976138 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-httpd-config\") pod \"neutron-645d765cf7-6vwpp\" (UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") " pod="openstack/neutron-645d765cf7-6vwpp" Mar 18 12:34:10 crc kubenswrapper[4975]: I0318 12:34:10.990798 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb7bf\" (UniqueName: 
\"kubernetes.io/projected/136388c3-08f6-404b-9a43-d68687d762c8-kube-api-access-bb7bf\") pod \"neutron-645d765cf7-6vwpp\" (UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") " pod="openstack/neutron-645d765cf7-6vwpp" Mar 18 12:34:11 crc kubenswrapper[4975]: I0318 12:34:11.040951 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42860e95-d4ae-438b-a858-a88e69058574" path="/var/lib/kubelet/pods/42860e95-d4ae-438b-a858-a88e69058574/volumes" Mar 18 12:34:11 crc kubenswrapper[4975]: I0318 12:34:11.065319 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-645d765cf7-6vwpp" Mar 18 12:34:11 crc kubenswrapper[4975]: I0318 12:34:11.890772 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-645d765cf7-6vwpp"] Mar 18 12:34:11 crc kubenswrapper[4975]: I0318 12:34:11.906274 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"71ec6dfe-9c62-4027-a59f-fc13c24dd809","Type":"ContainerStarted","Data":"547ebb2f0b54f4570c9dd75ee0eb6d6bbd3546bb0b427501074ea908ea63763f"} Mar 18 12:34:11 crc kubenswrapper[4975]: I0318 12:34:11.926558 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5899b99ff6-cwt84" event={"ID":"48f1daf8-3604-40e4-9e41-e9025c083c7d","Type":"ContainerStarted","Data":"54d62d06308503ce21e7991ebc2c1733e6936b83aaf6e8787b677c240f317b21"} Mar 18 12:34:11 crc kubenswrapper[4975]: I0318 12:34:11.927386 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5899b99ff6-cwt84" Mar 18 12:34:11 crc kubenswrapper[4975]: I0318 12:34:11.932907 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bfd49677-a36a-4e49-bdd9-4416f96c225e","Type":"ContainerStarted","Data":"a163e8f47e0240305d2703bff9e9c99c01e8bad06710b60b15b86e957d76b14b"} Mar 18 12:34:11 crc kubenswrapper[4975]: W0318 12:34:11.933660 4975 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod136388c3_08f6_404b_9a43_d68687d762c8.slice/crio-4209b872a372909b1960ee78ff42c2c22dbe23136ba4c6a905c11083485b648e WatchSource:0}: Error finding container 4209b872a372909b1960ee78ff42c2c22dbe23136ba4c6a905c11083485b648e: Status 404 returned error can't find the container with id 4209b872a372909b1960ee78ff42c2c22dbe23136ba4c6a905c11083485b648e Mar 18 12:34:11 crc kubenswrapper[4975]: I0318 12:34:11.954776 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-h4v54" event={"ID":"0b6f390f-605a-4717-afea-39913c57679d","Type":"ContainerStarted","Data":"8dfb7cd29dd11dd05cf0855ac3831bd69ec45aadf4dfe3423be66679885c4aa8"} Mar 18 12:34:11 crc kubenswrapper[4975]: I0318 12:34:11.954790 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5899b99ff6-cwt84" podStartSLOduration=3.9547657579999997 podStartE2EDuration="3.954765758s" podCreationTimestamp="2026-03-18 12:34:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:11.948832565 +0000 UTC m=+1437.663233144" watchObservedRunningTime="2026-03-18 12:34:11.954765758 +0000 UTC m=+1437.669166337" Mar 18 12:34:11 crc kubenswrapper[4975]: I0318 12:34:11.955830 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-h4v54" Mar 18 12:34:11 crc kubenswrapper[4975]: I0318 12:34:11.963727 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56bcd48494-744wv" event={"ID":"939992e3-94eb-4c98-a493-e30321c7f81a","Type":"ContainerStarted","Data":"812f2a378a50767a2a2d8ee2802beb163ceca1ae7f5a5ccc3d0089ede2f82099"} Mar 18 12:34:11 crc kubenswrapper[4975]: I0318 12:34:11.990373 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-55f844cf75-h4v54" podStartSLOduration=3.990354708 podStartE2EDuration="3.990354708s" podCreationTimestamp="2026-03-18 12:34:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:11.978706268 +0000 UTC m=+1437.693106837" watchObservedRunningTime="2026-03-18 12:34:11.990354708 +0000 UTC m=+1437.704755287" Mar 18 12:34:12 crc kubenswrapper[4975]: I0318 12:34:12.009137 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-56bcd48494-744wv" podStartSLOduration=26.009116554 podStartE2EDuration="26.009116554s" podCreationTimestamp="2026-03-18 12:33:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:12.002313077 +0000 UTC m=+1437.716713676" watchObservedRunningTime="2026-03-18 12:34:12.009116554 +0000 UTC m=+1437.723517143" Mar 18 12:34:13 crc kubenswrapper[4975]: I0318 12:34:13.005137 4975 generic.go:334] "Generic (PLEG): container finished" podID="c2614112-7379-4588-a6dd-1cec6e3d96b4" containerID="be07b1fb2431f75c390014aa13497026e975f154b46b324a8a5e1a634690363f" exitCode=0 Mar 18 12:34:13 crc kubenswrapper[4975]: I0318 12:34:13.005964 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hwtrk" event={"ID":"c2614112-7379-4588-a6dd-1cec6e3d96b4","Type":"ContainerDied","Data":"be07b1fb2431f75c390014aa13497026e975f154b46b324a8a5e1a634690363f"} Mar 18 12:34:13 crc kubenswrapper[4975]: I0318 12:34:13.012467 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"71ec6dfe-9c62-4027-a59f-fc13c24dd809","Type":"ContainerStarted","Data":"1ff4cbf09d61c6c3d56720ba50a74563fa720913cbaf662951ca4c4549617c84"} Mar 18 12:34:13 crc kubenswrapper[4975]: I0318 12:34:13.041193 4975 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bfd49677-a36a-4e49-bdd9-4416f96c225e" containerName="glance-log" containerID="cri-o://a163e8f47e0240305d2703bff9e9c99c01e8bad06710b60b15b86e957d76b14b" gracePeriod=30 Mar 18 12:34:13 crc kubenswrapper[4975]: I0318 12:34:13.042075 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bfd49677-a36a-4e49-bdd9-4416f96c225e" containerName="glance-httpd" containerID="cri-o://49e6957ed590f2fd18af265e69b60009b5f675bbdb85c88aa8373a4873602cc4" gracePeriod=30 Mar 18 12:34:13 crc kubenswrapper[4975]: I0318 12:34:13.043927 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-645d765cf7-6vwpp" event={"ID":"136388c3-08f6-404b-9a43-d68687d762c8","Type":"ContainerStarted","Data":"4209b872a372909b1960ee78ff42c2c22dbe23136ba4c6a905c11083485b648e"} Mar 18 12:34:13 crc kubenswrapper[4975]: I0318 12:34:13.043957 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bfd49677-a36a-4e49-bdd9-4416f96c225e","Type":"ContainerStarted","Data":"49e6957ed590f2fd18af265e69b60009b5f675bbdb85c88aa8373a4873602cc4"} Mar 18 12:34:13 crc kubenswrapper[4975]: I0318 12:34:13.070643 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=19.07062248 podStartE2EDuration="19.07062248s" podCreationTimestamp="2026-03-18 12:33:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:13.056542203 +0000 UTC m=+1438.770942782" watchObservedRunningTime="2026-03-18 12:34:13.07062248 +0000 UTC m=+1438.785023059" Mar 18 12:34:13 crc kubenswrapper[4975]: I0318 12:34:13.106578 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" 
podStartSLOduration=30.10655626 podStartE2EDuration="30.10655626s" podCreationTimestamp="2026-03-18 12:33:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:13.098705954 +0000 UTC m=+1438.813106533" watchObservedRunningTime="2026-03-18 12:34:13.10655626 +0000 UTC m=+1438.820956839" Mar 18 12:34:13 crc kubenswrapper[4975]: I0318 12:34:13.802995 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 12:34:13 crc kubenswrapper[4975]: I0318 12:34:13.803359 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 12:34:14 crc kubenswrapper[4975]: I0318 12:34:14.056381 4975 generic.go:334] "Generic (PLEG): container finished" podID="bfd49677-a36a-4e49-bdd9-4416f96c225e" containerID="49e6957ed590f2fd18af265e69b60009b5f675bbdb85c88aa8373a4873602cc4" exitCode=0 Mar 18 12:34:14 crc kubenswrapper[4975]: I0318 12:34:14.056692 4975 generic.go:334] "Generic (PLEG): container finished" podID="bfd49677-a36a-4e49-bdd9-4416f96c225e" containerID="a163e8f47e0240305d2703bff9e9c99c01e8bad06710b60b15b86e957d76b14b" exitCode=143 Mar 18 12:34:14 crc kubenswrapper[4975]: I0318 12:34:14.056749 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bfd49677-a36a-4e49-bdd9-4416f96c225e","Type":"ContainerDied","Data":"49e6957ed590f2fd18af265e69b60009b5f675bbdb85c88aa8373a4873602cc4"} Mar 18 12:34:14 crc kubenswrapper[4975]: I0318 12:34:14.056774 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bfd49677-a36a-4e49-bdd9-4416f96c225e","Type":"ContainerDied","Data":"a163e8f47e0240305d2703bff9e9c99c01e8bad06710b60b15b86e957d76b14b"} Mar 18 12:34:14 crc kubenswrapper[4975]: I0318 12:34:14.058281 4975 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-infra/auto-csr-approver-29563954-6tpj8" event={"ID":"e746cdd9-ae57-4577-a9c5-eafc0aa28c09","Type":"ContainerStarted","Data":"278cdef0370e38cfc3e1a8277cf552974323a7b704059e143c2185c932a7dcfe"} Mar 18 12:34:14 crc kubenswrapper[4975]: I0318 12:34:14.063096 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-645d765cf7-6vwpp" event={"ID":"136388c3-08f6-404b-9a43-d68687d762c8","Type":"ContainerStarted","Data":"0b002b0ac8c9c3a8c4697ac79ac60aad1dddf8d8c6ba6033e81207d9607b7e11"} Mar 18 12:34:14 crc kubenswrapper[4975]: I0318 12:34:14.076451 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563954-6tpj8" podStartSLOduration=11.860894673 podStartE2EDuration="14.076434693s" podCreationTimestamp="2026-03-18 12:34:00 +0000 UTC" firstStartedPulling="2026-03-18 12:34:09.894042681 +0000 UTC m=+1435.608443260" lastFinishedPulling="2026-03-18 12:34:12.109582701 +0000 UTC m=+1437.823983280" observedRunningTime="2026-03-18 12:34:14.076030952 +0000 UTC m=+1439.790431531" watchObservedRunningTime="2026-03-18 12:34:14.076434693 +0000 UTC m=+1439.790835272" Mar 18 12:34:14 crc kubenswrapper[4975]: I0318 12:34:14.950671 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 12:34:14 crc kubenswrapper[4975]: I0318 12:34:14.950725 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 12:34:14 crc kubenswrapper[4975]: I0318 12:34:14.989120 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 12:34:14 crc kubenswrapper[4975]: I0318 12:34:14.998415 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 12:34:15 crc kubenswrapper[4975]: I0318 12:34:15.088627 4975 generic.go:334] "Generic (PLEG): 
container finished" podID="e746cdd9-ae57-4577-a9c5-eafc0aa28c09" containerID="278cdef0370e38cfc3e1a8277cf552974323a7b704059e143c2185c932a7dcfe" exitCode=0 Mar 18 12:34:15 crc kubenswrapper[4975]: I0318 12:34:15.089704 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563954-6tpj8" event={"ID":"e746cdd9-ae57-4577-a9c5-eafc0aa28c09","Type":"ContainerDied","Data":"278cdef0370e38cfc3e1a8277cf552974323a7b704059e143c2185c932a7dcfe"} Mar 18 12:34:15 crc kubenswrapper[4975]: I0318 12:34:15.092104 4975 generic.go:334] "Generic (PLEG): container finished" podID="3247b82c-5928-4a38-85aa-d37c7d8f6c21" containerID="ad817110e62a67f7d915322c53393627cec21e6daa015f6acf407587979c543e" exitCode=0 Mar 18 12:34:15 crc kubenswrapper[4975]: I0318 12:34:15.092152 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wfp2q" event={"ID":"3247b82c-5928-4a38-85aa-d37c7d8f6c21","Type":"ContainerDied","Data":"ad817110e62a67f7d915322c53393627cec21e6daa015f6acf407587979c543e"} Mar 18 12:34:15 crc kubenswrapper[4975]: I0318 12:34:15.096140 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-645d765cf7-6vwpp" event={"ID":"136388c3-08f6-404b-9a43-d68687d762c8","Type":"ContainerStarted","Data":"6f727425465df55d2fac6f8f433eef3e30b04a52f9113ea6c12842151987a9bf"} Mar 18 12:34:15 crc kubenswrapper[4975]: I0318 12:34:15.096188 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 12:34:15 crc kubenswrapper[4975]: I0318 12:34:15.096202 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-645d765cf7-6vwpp" Mar 18 12:34:15 crc kubenswrapper[4975]: I0318 12:34:15.096227 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 12:34:15 crc kubenswrapper[4975]: I0318 12:34:15.169178 4975 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/neutron-645d765cf7-6vwpp" podStartSLOduration=5.169161869 podStartE2EDuration="5.169161869s" podCreationTimestamp="2026-03-18 12:34:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:15.14812715 +0000 UTC m=+1440.862527729" watchObservedRunningTime="2026-03-18 12:34:15.169161869 +0000 UTC m=+1440.883562448" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.470989 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.473604 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-hwtrk" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.592437 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qqs7\" (UniqueName: \"kubernetes.io/projected/bfd49677-a36a-4e49-bdd9-4416f96c225e-kube-api-access-2qqs7\") pod \"bfd49677-a36a-4e49-bdd9-4416f96c225e\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.592510 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bfd49677-a36a-4e49-bdd9-4416f96c225e-httpd-run\") pod \"bfd49677-a36a-4e49-bdd9-4416f96c225e\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.592566 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2614112-7379-4588-a6dd-1cec6e3d96b4-scripts\") pod \"c2614112-7379-4588-a6dd-1cec6e3d96b4\" (UID: \"c2614112-7379-4588-a6dd-1cec6e3d96b4\") " Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.592603 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2614112-7379-4588-a6dd-1cec6e3d96b4-combined-ca-bundle\") pod \"c2614112-7379-4588-a6dd-1cec6e3d96b4\" (UID: \"c2614112-7379-4588-a6dd-1cec6e3d96b4\") " Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.592630 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2614112-7379-4588-a6dd-1cec6e3d96b4-config-data\") pod \"c2614112-7379-4588-a6dd-1cec6e3d96b4\" (UID: \"c2614112-7379-4588-a6dd-1cec6e3d96b4\") " Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.592675 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2614112-7379-4588-a6dd-1cec6e3d96b4-logs\") pod \"c2614112-7379-4588-a6dd-1cec6e3d96b4\" (UID: \"c2614112-7379-4588-a6dd-1cec6e3d96b4\") " Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.592703 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfd49677-a36a-4e49-bdd9-4416f96c225e-internal-tls-certs\") pod \"bfd49677-a36a-4e49-bdd9-4416f96c225e\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.592732 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfd49677-a36a-4e49-bdd9-4416f96c225e-logs\") pod \"bfd49677-a36a-4e49-bdd9-4416f96c225e\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.592795 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfd49677-a36a-4e49-bdd9-4416f96c225e-scripts\") pod \"bfd49677-a36a-4e49-bdd9-4416f96c225e\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 
12:34:16.592824 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzjdq\" (UniqueName: \"kubernetes.io/projected/c2614112-7379-4588-a6dd-1cec6e3d96b4-kube-api-access-qzjdq\") pod \"c2614112-7379-4588-a6dd-1cec6e3d96b4\" (UID: \"c2614112-7379-4588-a6dd-1cec6e3d96b4\") " Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.592851 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfd49677-a36a-4e49-bdd9-4416f96c225e-config-data\") pod \"bfd49677-a36a-4e49-bdd9-4416f96c225e\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.592903 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfd49677-a36a-4e49-bdd9-4416f96c225e-combined-ca-bundle\") pod \"bfd49677-a36a-4e49-bdd9-4416f96c225e\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.592941 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"bfd49677-a36a-4e49-bdd9-4416f96c225e\" (UID: \"bfd49677-a36a-4e49-bdd9-4416f96c225e\") " Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.593627 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfd49677-a36a-4e49-bdd9-4416f96c225e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bfd49677-a36a-4e49-bdd9-4416f96c225e" (UID: "bfd49677-a36a-4e49-bdd9-4416f96c225e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.593711 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2614112-7379-4588-a6dd-1cec6e3d96b4-logs" (OuterVolumeSpecName: "logs") pod "c2614112-7379-4588-a6dd-1cec6e3d96b4" (UID: "c2614112-7379-4588-a6dd-1cec6e3d96b4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.594090 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfd49677-a36a-4e49-bdd9-4416f96c225e-logs" (OuterVolumeSpecName: "logs") pod "bfd49677-a36a-4e49-bdd9-4416f96c225e" (UID: "bfd49677-a36a-4e49-bdd9-4416f96c225e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.599156 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2614112-7379-4588-a6dd-1cec6e3d96b4-kube-api-access-qzjdq" (OuterVolumeSpecName: "kube-api-access-qzjdq") pod "c2614112-7379-4588-a6dd-1cec6e3d96b4" (UID: "c2614112-7379-4588-a6dd-1cec6e3d96b4"). InnerVolumeSpecName "kube-api-access-qzjdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.600583 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2614112-7379-4588-a6dd-1cec6e3d96b4-scripts" (OuterVolumeSpecName: "scripts") pod "c2614112-7379-4588-a6dd-1cec6e3d96b4" (UID: "c2614112-7379-4588-a6dd-1cec6e3d96b4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.601249 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfd49677-a36a-4e49-bdd9-4416f96c225e-kube-api-access-2qqs7" (OuterVolumeSpecName: "kube-api-access-2qqs7") pod "bfd49677-a36a-4e49-bdd9-4416f96c225e" (UID: "bfd49677-a36a-4e49-bdd9-4416f96c225e"). InnerVolumeSpecName "kube-api-access-2qqs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.609099 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "bfd49677-a36a-4e49-bdd9-4416f96c225e" (UID: "bfd49677-a36a-4e49-bdd9-4416f96c225e"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.614039 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd49677-a36a-4e49-bdd9-4416f96c225e-scripts" (OuterVolumeSpecName: "scripts") pod "bfd49677-a36a-4e49-bdd9-4416f96c225e" (UID: "bfd49677-a36a-4e49-bdd9-4416f96c225e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.638481 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2614112-7379-4588-a6dd-1cec6e3d96b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2614112-7379-4588-a6dd-1cec6e3d96b4" (UID: "c2614112-7379-4588-a6dd-1cec6e3d96b4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.651443 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2614112-7379-4588-a6dd-1cec6e3d96b4-config-data" (OuterVolumeSpecName: "config-data") pod "c2614112-7379-4588-a6dd-1cec6e3d96b4" (UID: "c2614112-7379-4588-a6dd-1cec6e3d96b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.660329 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd49677-a36a-4e49-bdd9-4416f96c225e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfd49677-a36a-4e49-bdd9-4416f96c225e" (UID: "bfd49677-a36a-4e49-bdd9-4416f96c225e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.666762 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd49677-a36a-4e49-bdd9-4416f96c225e-config-data" (OuterVolumeSpecName: "config-data") pod "bfd49677-a36a-4e49-bdd9-4416f96c225e" (UID: "bfd49677-a36a-4e49-bdd9-4416f96c225e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.672743 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd49677-a36a-4e49-bdd9-4416f96c225e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bfd49677-a36a-4e49-bdd9-4416f96c225e" (UID: "bfd49677-a36a-4e49-bdd9-4416f96c225e"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.679798 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.679971 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.696934 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qqs7\" (UniqueName: \"kubernetes.io/projected/bfd49677-a36a-4e49-bdd9-4416f96c225e-kube-api-access-2qqs7\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.696982 4975 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bfd49677-a36a-4e49-bdd9-4416f96c225e-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.696995 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2614112-7379-4588-a6dd-1cec6e3d96b4-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.697004 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2614112-7379-4588-a6dd-1cec6e3d96b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.697012 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2614112-7379-4588-a6dd-1cec6e3d96b4-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.697021 4975 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2614112-7379-4588-a6dd-1cec6e3d96b4-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:16 crc 
kubenswrapper[4975]: I0318 12:34:16.697029 4975 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfd49677-a36a-4e49-bdd9-4416f96c225e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.697037 4975 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfd49677-a36a-4e49-bdd9-4416f96c225e-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.697044 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfd49677-a36a-4e49-bdd9-4416f96c225e-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.697052 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzjdq\" (UniqueName: \"kubernetes.io/projected/c2614112-7379-4588-a6dd-1cec6e3d96b4-kube-api-access-qzjdq\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.697060 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfd49677-a36a-4e49-bdd9-4416f96c225e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.697067 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfd49677-a36a-4e49-bdd9-4416f96c225e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.697098 4975 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.728805 4975 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.757052 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-56bcd48494-744wv" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.757119 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-56bcd48494-744wv" Mar 18 12:34:16 crc kubenswrapper[4975]: I0318 12:34:16.798758 4975 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.124826 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bfd49677-a36a-4e49-bdd9-4416f96c225e","Type":"ContainerDied","Data":"9d4c8fcfebb7c5cb54476ee8b76598488b928761d25f2893d626b5d4693bd899"} Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.124930 4975 scope.go:117] "RemoveContainer" containerID="49e6957ed590f2fd18af265e69b60009b5f675bbdb85c88aa8373a4873602cc4" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.124853 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.129117 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-hwtrk" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.129767 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hwtrk" event={"ID":"c2614112-7379-4588-a6dd-1cec6e3d96b4","Type":"ContainerDied","Data":"afd46f28a4febcc48272cab1d92b756bc87de7856c345057be6239feeca0b484"} Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.129803 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afd46f28a4febcc48272cab1d92b756bc87de7856c345057be6239feeca0b484" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.129847 4975 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.151473 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.169816 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.208337 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:34:17 crc kubenswrapper[4975]: E0318 12:34:17.208686 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfd49677-a36a-4e49-bdd9-4416f96c225e" containerName="glance-httpd" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.208702 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfd49677-a36a-4e49-bdd9-4416f96c225e" containerName="glance-httpd" Mar 18 12:34:17 crc kubenswrapper[4975]: E0318 12:34:17.208723 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfd49677-a36a-4e49-bdd9-4416f96c225e" containerName="glance-log" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.208729 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfd49677-a36a-4e49-bdd9-4416f96c225e" 
containerName="glance-log" Mar 18 12:34:17 crc kubenswrapper[4975]: E0318 12:34:17.208762 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2614112-7379-4588-a6dd-1cec6e3d96b4" containerName="placement-db-sync" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.208768 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2614112-7379-4588-a6dd-1cec6e3d96b4" containerName="placement-db-sync" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.208935 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfd49677-a36a-4e49-bdd9-4416f96c225e" containerName="glance-log" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.208968 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2614112-7379-4588-a6dd-1cec6e3d96b4" containerName="placement-db-sync" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.208979 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfd49677-a36a-4e49-bdd9-4416f96c225e" containerName="glance-httpd" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.209943 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.215474 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.216739 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.220307 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.307305 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cab6fd53-f170-4c86-b5eb-3590e593077e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.307363 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cab6fd53-f170-4c86-b5eb-3590e593077e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.307396 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cab6fd53-f170-4c86-b5eb-3590e593077e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.307731 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dw7t\" (UniqueName: 
\"kubernetes.io/projected/cab6fd53-f170-4c86-b5eb-3590e593077e-kube-api-access-7dw7t\") pod \"glance-default-internal-api-0\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.307798 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cab6fd53-f170-4c86-b5eb-3590e593077e-logs\") pod \"glance-default-internal-api-0\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.307973 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.308016 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cab6fd53-f170-4c86-b5eb-3590e593077e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.308073 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cab6fd53-f170-4c86-b5eb-3590e593077e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.409698 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/cab6fd53-f170-4c86-b5eb-3590e593077e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.409752 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cab6fd53-f170-4c86-b5eb-3590e593077e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.409793 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cab6fd53-f170-4c86-b5eb-3590e593077e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.409834 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cab6fd53-f170-4c86-b5eb-3590e593077e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.409854 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cab6fd53-f170-4c86-b5eb-3590e593077e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.409946 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dw7t\" (UniqueName: 
\"kubernetes.io/projected/cab6fd53-f170-4c86-b5eb-3590e593077e-kube-api-access-7dw7t\") pod \"glance-default-internal-api-0\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.409967 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cab6fd53-f170-4c86-b5eb-3590e593077e-logs\") pod \"glance-default-internal-api-0\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.410010 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.410454 4975 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.410506 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cab6fd53-f170-4c86-b5eb-3590e593077e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.410788 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cab6fd53-f170-4c86-b5eb-3590e593077e-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.418585 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cab6fd53-f170-4c86-b5eb-3590e593077e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.418957 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cab6fd53-f170-4c86-b5eb-3590e593077e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.419580 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cab6fd53-f170-4c86-b5eb-3590e593077e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.431423 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cab6fd53-f170-4c86-b5eb-3590e593077e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.438036 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dw7t\" (UniqueName: \"kubernetes.io/projected/cab6fd53-f170-4c86-b5eb-3590e593077e-kube-api-access-7dw7t\") pod \"glance-default-internal-api-0\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " 
pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.447587 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.455939 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.534838 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.596684 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-587979d76d-qg8cs"] Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.598473 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.604409 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.604732 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.604909 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-rkhlv" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.605127 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.607735 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.639387 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-587979d76d-qg8cs"] Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.717305 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-internal-tls-certs\") pod \"placement-587979d76d-qg8cs\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.717367 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f967c09a-49f5-4a4c-a1a9-7bb2da157132-logs\") pod \"placement-587979d76d-qg8cs\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.717407 4975 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-scripts\") pod \"placement-587979d76d-qg8cs\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.717437 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qxc9\" (UniqueName: \"kubernetes.io/projected/f967c09a-49f5-4a4c-a1a9-7bb2da157132-kube-api-access-9qxc9\") pod \"placement-587979d76d-qg8cs\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.717493 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-public-tls-certs\") pod \"placement-587979d76d-qg8cs\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.717550 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-combined-ca-bundle\") pod \"placement-587979d76d-qg8cs\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.717579 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-config-data\") pod \"placement-587979d76d-qg8cs\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.819000 4975 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-public-tls-certs\") pod \"placement-587979d76d-qg8cs\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.827992 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-combined-ca-bundle\") pod \"placement-587979d76d-qg8cs\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.828049 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-config-data\") pod \"placement-587979d76d-qg8cs\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.828761 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-internal-tls-certs\") pod \"placement-587979d76d-qg8cs\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.828798 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f967c09a-49f5-4a4c-a1a9-7bb2da157132-logs\") pod \"placement-587979d76d-qg8cs\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.828850 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-scripts\") pod \"placement-587979d76d-qg8cs\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.828925 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qxc9\" (UniqueName: \"kubernetes.io/projected/f967c09a-49f5-4a4c-a1a9-7bb2da157132-kube-api-access-9qxc9\") pod \"placement-587979d76d-qg8cs\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.829628 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f967c09a-49f5-4a4c-a1a9-7bb2da157132-logs\") pod \"placement-587979d76d-qg8cs\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.832600 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-internal-tls-certs\") pod \"placement-587979d76d-qg8cs\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.834464 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-config-data\") pod \"placement-587979d76d-qg8cs\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.834942 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-public-tls-certs\") pod 
\"placement-587979d76d-qg8cs\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.837600 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-combined-ca-bundle\") pod \"placement-587979d76d-qg8cs\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.846962 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-scripts\") pod \"placement-587979d76d-qg8cs\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.851629 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qxc9\" (UniqueName: \"kubernetes.io/projected/f967c09a-49f5-4a4c-a1a9-7bb2da157132-kube-api-access-9qxc9\") pod \"placement-587979d76d-qg8cs\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:34:17 crc kubenswrapper[4975]: I0318 12:34:17.932356 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:34:18 crc kubenswrapper[4975]: I0318 12:34:18.428828 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-f78496f49-fklmw" Mar 18 12:34:18 crc kubenswrapper[4975]: I0318 12:34:18.880068 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-h4v54" Mar 18 12:34:18 crc kubenswrapper[4975]: I0318 12:34:18.974192 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-p5nbn"] Mar 18 12:34:18 crc kubenswrapper[4975]: I0318 12:34:18.981155 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" podUID="effc30d8-1f26-461c-8555-86410a3f77ba" containerName="dnsmasq-dns" containerID="cri-o://e2c46b51d17c388811ca2cf84335f0aff97529cd284600b76bfc32a76281329e" gracePeriod=10 Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.029415 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfd49677-a36a-4e49-bdd9-4416f96c225e" path="/var/lib/kubelet/pods/bfd49677-a36a-4e49-bdd9-4416f96c225e/volumes" Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.150691 4975 generic.go:334] "Generic (PLEG): container finished" podID="effc30d8-1f26-461c-8555-86410a3f77ba" containerID="e2c46b51d17c388811ca2cf84335f0aff97529cd284600b76bfc32a76281329e" exitCode=0 Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.150737 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" event={"ID":"effc30d8-1f26-461c-8555-86410a3f77ba","Type":"ContainerDied","Data":"e2c46b51d17c388811ca2cf84335f0aff97529cd284600b76bfc32a76281329e"} Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.419038 4975 scope.go:117] "RemoveContainer" containerID="a163e8f47e0240305d2703bff9e9c99c01e8bad06710b60b15b86e957d76b14b" Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 
12:34:19.582730 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wfp2q" Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.599384 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563954-6tpj8" Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.674228 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-scripts\") pod \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\" (UID: \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\") " Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.674311 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-credential-keys\") pod \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\" (UID: \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\") " Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.674407 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-combined-ca-bundle\") pod \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\" (UID: \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\") " Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.674477 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-fernet-keys\") pod \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\" (UID: \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\") " Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.674518 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpmgz\" (UniqueName: 
\"kubernetes.io/projected/3247b82c-5928-4a38-85aa-d37c7d8f6c21-kube-api-access-wpmgz\") pod \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\" (UID: \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\") " Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.674548 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-config-data\") pod \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\" (UID: \"3247b82c-5928-4a38-85aa-d37c7d8f6c21\") " Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.674596 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms6rm\" (UniqueName: \"kubernetes.io/projected/e746cdd9-ae57-4577-a9c5-eafc0aa28c09-kube-api-access-ms6rm\") pod \"e746cdd9-ae57-4577-a9c5-eafc0aa28c09\" (UID: \"e746cdd9-ae57-4577-a9c5-eafc0aa28c09\") " Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.694070 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-scripts" (OuterVolumeSpecName: "scripts") pod "3247b82c-5928-4a38-85aa-d37c7d8f6c21" (UID: "3247b82c-5928-4a38-85aa-d37c7d8f6c21"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.698044 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e746cdd9-ae57-4577-a9c5-eafc0aa28c09-kube-api-access-ms6rm" (OuterVolumeSpecName: "kube-api-access-ms6rm") pod "e746cdd9-ae57-4577-a9c5-eafc0aa28c09" (UID: "e746cdd9-ae57-4577-a9c5-eafc0aa28c09"). InnerVolumeSpecName "kube-api-access-ms6rm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.700150 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3247b82c-5928-4a38-85aa-d37c7d8f6c21" (UID: "3247b82c-5928-4a38-85aa-d37c7d8f6c21"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.709401 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3247b82c-5928-4a38-85aa-d37c7d8f6c21-kube-api-access-wpmgz" (OuterVolumeSpecName: "kube-api-access-wpmgz") pod "3247b82c-5928-4a38-85aa-d37c7d8f6c21" (UID: "3247b82c-5928-4a38-85aa-d37c7d8f6c21"). InnerVolumeSpecName "kube-api-access-wpmgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.710995 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3247b82c-5928-4a38-85aa-d37c7d8f6c21" (UID: "3247b82c-5928-4a38-85aa-d37c7d8f6c21"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.772048 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-config-data" (OuterVolumeSpecName: "config-data") pod "3247b82c-5928-4a38-85aa-d37c7d8f6c21" (UID: "3247b82c-5928-4a38-85aa-d37c7d8f6c21"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.773095 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3247b82c-5928-4a38-85aa-d37c7d8f6c21" (UID: "3247b82c-5928-4a38-85aa-d37c7d8f6c21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.779080 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.779113 4975 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.779125 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.779170 4975 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.779181 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpmgz\" (UniqueName: \"kubernetes.io/projected/3247b82c-5928-4a38-85aa-d37c7d8f6c21-kube-api-access-wpmgz\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.779190 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3247b82c-5928-4a38-85aa-d37c7d8f6c21-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.779200 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms6rm\" (UniqueName: \"kubernetes.io/projected/e746cdd9-ae57-4577-a9c5-eafc0aa28c09-kube-api-access-ms6rm\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.795840 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.815258 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.881396 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-dns-svc\") pod \"effc30d8-1f26-461c-8555-86410a3f77ba\" (UID: \"effc30d8-1f26-461c-8555-86410a3f77ba\") " Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.881471 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-ovsdbserver-nb\") pod \"effc30d8-1f26-461c-8555-86410a3f77ba\" (UID: \"effc30d8-1f26-461c-8555-86410a3f77ba\") " Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.881551 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-dns-swift-storage-0\") pod \"effc30d8-1f26-461c-8555-86410a3f77ba\" (UID: \"effc30d8-1f26-461c-8555-86410a3f77ba\") " Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.881591 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7wxz\" 
(UniqueName: \"kubernetes.io/projected/effc30d8-1f26-461c-8555-86410a3f77ba-kube-api-access-v7wxz\") pod \"effc30d8-1f26-461c-8555-86410a3f77ba\" (UID: \"effc30d8-1f26-461c-8555-86410a3f77ba\") " Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.881657 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-config\") pod \"effc30d8-1f26-461c-8555-86410a3f77ba\" (UID: \"effc30d8-1f26-461c-8555-86410a3f77ba\") " Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.881703 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-ovsdbserver-sb\") pod \"effc30d8-1f26-461c-8555-86410a3f77ba\" (UID: \"effc30d8-1f26-461c-8555-86410a3f77ba\") " Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.931484 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/effc30d8-1f26-461c-8555-86410a3f77ba-kube-api-access-v7wxz" (OuterVolumeSpecName: "kube-api-access-v7wxz") pod "effc30d8-1f26-461c-8555-86410a3f77ba" (UID: "effc30d8-1f26-461c-8555-86410a3f77ba"). InnerVolumeSpecName "kube-api-access-v7wxz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:19 crc kubenswrapper[4975]: I0318 12:34:19.984180 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7wxz\" (UniqueName: \"kubernetes.io/projected/effc30d8-1f26-461c-8555-86410a3f77ba-kube-api-access-v7wxz\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.065296 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-587979d76d-qg8cs"] Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.150186 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-config" (OuterVolumeSpecName: "config") pod "effc30d8-1f26-461c-8555-86410a3f77ba" (UID: "effc30d8-1f26-461c-8555-86410a3f77ba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.177676 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "effc30d8-1f26-461c-8555-86410a3f77ba" (UID: "effc30d8-1f26-461c-8555-86410a3f77ba"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.181925 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "effc30d8-1f26-461c-8555-86410a3f77ba" (UID: "effc30d8-1f26-461c-8555-86410a3f77ba"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.183670 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "effc30d8-1f26-461c-8555-86410a3f77ba" (UID: "effc30d8-1f26-461c-8555-86410a3f77ba"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.183747 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "effc30d8-1f26-461c-8555-86410a3f77ba" (UID: "effc30d8-1f26-461c-8555-86410a3f77ba"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.190122 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.190143 4975 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.190152 4975 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.190160 4975 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:20 
crc kubenswrapper[4975]: I0318 12:34:20.190168 4975 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/effc30d8-1f26-461c-8555-86410a3f77ba-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.191859 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563954-6tpj8" event={"ID":"e746cdd9-ae57-4577-a9c5-eafc0aa28c09","Type":"ContainerDied","Data":"b19ea2084605f91ba747315a36df8757d98f35c17b13009e9c6f3eafdc48c5f9"} Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.192003 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b19ea2084605f91ba747315a36df8757d98f35c17b13009e9c6f3eafdc48c5f9" Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.192090 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563954-6tpj8" Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.204810 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-587979d76d-qg8cs" event={"ID":"f967c09a-49f5-4a4c-a1a9-7bb2da157132","Type":"ContainerStarted","Data":"b7989a097f0537c61ad811f0ed4617544ebc5ab98ef3691949caa1c18fb9812d"} Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.210202 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wfp2q" event={"ID":"3247b82c-5928-4a38-85aa-d37c7d8f6c21","Type":"ContainerDied","Data":"31ac43e79ad64382efae907ab4b56e6cfdece59822e25e48ede8957117065cdd"} Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.210237 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31ac43e79ad64382efae907ab4b56e6cfdece59822e25e48ede8957117065cdd" Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.210319 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-wfp2q" Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.221060 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" event={"ID":"effc30d8-1f26-461c-8555-86410a3f77ba","Type":"ContainerDied","Data":"535b9c48eed1c02eee3277ce4f35ce988aabd14f0d25a80215c76be9ecc576bc"} Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.221109 4975 scope.go:117] "RemoveContainer" containerID="e2c46b51d17c388811ca2cf84335f0aff97529cd284600b76bfc32a76281329e" Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.221237 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-p5nbn" Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.248353 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d","Type":"ContainerStarted","Data":"464c41bcddd5358fa1a0c9a1917fd725eb23ed0f655757dc0c40512e0b95419f"} Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.250246 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-psw5k" event={"ID":"4f054ebc-151c-4e89-8242-3837c9bee6b2","Type":"ContainerStarted","Data":"77eec9a0c105490c29fb8bed80b81fecf8fa74501cf1bb4d42fcb5f966879a84"} Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.272016 4975 scope.go:117] "RemoveContainer" containerID="659e46f93d799d8b7fc87a5866856463f234fc3f6280dbcd0b55c487cdc153e4" Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.275923 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-p5nbn"] Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.294663 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-p5nbn"] Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.303502 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-db-sync-psw5k" podStartSLOduration=3.090465238 podStartE2EDuration="43.303477388s" podCreationTimestamp="2026-03-18 12:33:37 +0000 UTC" firstStartedPulling="2026-03-18 12:33:39.327140485 +0000 UTC m=+1405.041541064" lastFinishedPulling="2026-03-18 12:34:19.540152635 +0000 UTC m=+1445.254553214" observedRunningTime="2026-03-18 12:34:20.280342191 +0000 UTC m=+1445.994742770" watchObservedRunningTime="2026-03-18 12:34:20.303477388 +0000 UTC m=+1446.017877967" Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.330159 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.693943 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563948-hnfx6"] Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.701589 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563948-hnfx6"] Mar 18 12:34:20 crc kubenswrapper[4975]: I0318 12:34:20.991375 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-f8786c66f-vjkt4"] Mar 18 12:34:21 crc kubenswrapper[4975]: E0318 12:34:21.001314 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e746cdd9-ae57-4577-a9c5-eafc0aa28c09" containerName="oc" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.001354 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="e746cdd9-ae57-4577-a9c5-eafc0aa28c09" containerName="oc" Mar 18 12:34:21 crc kubenswrapper[4975]: E0318 12:34:21.001386 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="effc30d8-1f26-461c-8555-86410a3f77ba" containerName="dnsmasq-dns" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.001395 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="effc30d8-1f26-461c-8555-86410a3f77ba" containerName="dnsmasq-dns" Mar 18 12:34:21 crc kubenswrapper[4975]: E0318 12:34:21.001434 4975 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="effc30d8-1f26-461c-8555-86410a3f77ba" containerName="init" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.001451 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="effc30d8-1f26-461c-8555-86410a3f77ba" containerName="init" Mar 18 12:34:21 crc kubenswrapper[4975]: E0318 12:34:21.001464 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3247b82c-5928-4a38-85aa-d37c7d8f6c21" containerName="keystone-bootstrap" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.001470 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="3247b82c-5928-4a38-85aa-d37c7d8f6c21" containerName="keystone-bootstrap" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.001881 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="effc30d8-1f26-461c-8555-86410a3f77ba" containerName="dnsmasq-dns" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.001911 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="3247b82c-5928-4a38-85aa-d37c7d8f6c21" containerName="keystone-bootstrap" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.001930 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="e746cdd9-ae57-4577-a9c5-eafc0aa28c09" containerName="oc" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.003488 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f8786c66f-vjkt4"] Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.004930 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.010003 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.010022 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.010153 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-d6c9r" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.010241 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.010389 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.010393 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.029526 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d745b1ee-2087-4b83-a5ea-5519ef205da0" path="/var/lib/kubelet/pods/d745b1ee-2087-4b83-a5ea-5519ef205da0/volumes" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.030195 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="effc30d8-1f26-461c-8555-86410a3f77ba" path="/var/lib/kubelet/pods/effc30d8-1f26-461c-8555-86410a3f77ba/volumes" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.108087 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b782f13d-9abe-4abe-a47e-9c378c9d1913-credential-keys\") pod \"keystone-f8786c66f-vjkt4\" (UID: \"b782f13d-9abe-4abe-a47e-9c378c9d1913\") " pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:21 crc 
kubenswrapper[4975]: I0318 12:34:21.108124 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcbxb\" (UniqueName: \"kubernetes.io/projected/b782f13d-9abe-4abe-a47e-9c378c9d1913-kube-api-access-qcbxb\") pod \"keystone-f8786c66f-vjkt4\" (UID: \"b782f13d-9abe-4abe-a47e-9c378c9d1913\") " pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.108174 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b782f13d-9abe-4abe-a47e-9c378c9d1913-config-data\") pod \"keystone-f8786c66f-vjkt4\" (UID: \"b782f13d-9abe-4abe-a47e-9c378c9d1913\") " pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.108383 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b782f13d-9abe-4abe-a47e-9c378c9d1913-fernet-keys\") pod \"keystone-f8786c66f-vjkt4\" (UID: \"b782f13d-9abe-4abe-a47e-9c378c9d1913\") " pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.108618 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b782f13d-9abe-4abe-a47e-9c378c9d1913-scripts\") pod \"keystone-f8786c66f-vjkt4\" (UID: \"b782f13d-9abe-4abe-a47e-9c378c9d1913\") " pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.108687 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b782f13d-9abe-4abe-a47e-9c378c9d1913-combined-ca-bundle\") pod \"keystone-f8786c66f-vjkt4\" (UID: \"b782f13d-9abe-4abe-a47e-9c378c9d1913\") " pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:21 crc 
kubenswrapper[4975]: I0318 12:34:21.108712 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b782f13d-9abe-4abe-a47e-9c378c9d1913-public-tls-certs\") pod \"keystone-f8786c66f-vjkt4\" (UID: \"b782f13d-9abe-4abe-a47e-9c378c9d1913\") " pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.108737 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b782f13d-9abe-4abe-a47e-9c378c9d1913-internal-tls-certs\") pod \"keystone-f8786c66f-vjkt4\" (UID: \"b782f13d-9abe-4abe-a47e-9c378c9d1913\") " pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.216206 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcbxb\" (UniqueName: \"kubernetes.io/projected/b782f13d-9abe-4abe-a47e-9c378c9d1913-kube-api-access-qcbxb\") pod \"keystone-f8786c66f-vjkt4\" (UID: \"b782f13d-9abe-4abe-a47e-9c378c9d1913\") " pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.216287 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b782f13d-9abe-4abe-a47e-9c378c9d1913-config-data\") pod \"keystone-f8786c66f-vjkt4\" (UID: \"b782f13d-9abe-4abe-a47e-9c378c9d1913\") " pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.216324 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b782f13d-9abe-4abe-a47e-9c378c9d1913-fernet-keys\") pod \"keystone-f8786c66f-vjkt4\" (UID: \"b782f13d-9abe-4abe-a47e-9c378c9d1913\") " pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.216400 4975 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b782f13d-9abe-4abe-a47e-9c378c9d1913-scripts\") pod \"keystone-f8786c66f-vjkt4\" (UID: \"b782f13d-9abe-4abe-a47e-9c378c9d1913\") " pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.216442 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b782f13d-9abe-4abe-a47e-9c378c9d1913-combined-ca-bundle\") pod \"keystone-f8786c66f-vjkt4\" (UID: \"b782f13d-9abe-4abe-a47e-9c378c9d1913\") " pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.216459 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b782f13d-9abe-4abe-a47e-9c378c9d1913-public-tls-certs\") pod \"keystone-f8786c66f-vjkt4\" (UID: \"b782f13d-9abe-4abe-a47e-9c378c9d1913\") " pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.216475 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b782f13d-9abe-4abe-a47e-9c378c9d1913-internal-tls-certs\") pod \"keystone-f8786c66f-vjkt4\" (UID: \"b782f13d-9abe-4abe-a47e-9c378c9d1913\") " pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.216538 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b782f13d-9abe-4abe-a47e-9c378c9d1913-credential-keys\") pod \"keystone-f8786c66f-vjkt4\" (UID: \"b782f13d-9abe-4abe-a47e-9c378c9d1913\") " pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.221203 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b782f13d-9abe-4abe-a47e-9c378c9d1913-config-data\") pod \"keystone-f8786c66f-vjkt4\" (UID: \"b782f13d-9abe-4abe-a47e-9c378c9d1913\") " pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.223101 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b782f13d-9abe-4abe-a47e-9c378c9d1913-fernet-keys\") pod \"keystone-f8786c66f-vjkt4\" (UID: \"b782f13d-9abe-4abe-a47e-9c378c9d1913\") " pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.226079 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b782f13d-9abe-4abe-a47e-9c378c9d1913-public-tls-certs\") pod \"keystone-f8786c66f-vjkt4\" (UID: \"b782f13d-9abe-4abe-a47e-9c378c9d1913\") " pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.227344 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b782f13d-9abe-4abe-a47e-9c378c9d1913-credential-keys\") pod \"keystone-f8786c66f-vjkt4\" (UID: \"b782f13d-9abe-4abe-a47e-9c378c9d1913\") " pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.229222 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b782f13d-9abe-4abe-a47e-9c378c9d1913-combined-ca-bundle\") pod \"keystone-f8786c66f-vjkt4\" (UID: \"b782f13d-9abe-4abe-a47e-9c378c9d1913\") " pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.231513 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b782f13d-9abe-4abe-a47e-9c378c9d1913-internal-tls-certs\") pod \"keystone-f8786c66f-vjkt4\" (UID: 
\"b782f13d-9abe-4abe-a47e-9c378c9d1913\") " pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.233075 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcbxb\" (UniqueName: \"kubernetes.io/projected/b782f13d-9abe-4abe-a47e-9c378c9d1913-kube-api-access-qcbxb\") pod \"keystone-f8786c66f-vjkt4\" (UID: \"b782f13d-9abe-4abe-a47e-9c378c9d1913\") " pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.234565 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b782f13d-9abe-4abe-a47e-9c378c9d1913-scripts\") pod \"keystone-f8786c66f-vjkt4\" (UID: \"b782f13d-9abe-4abe-a47e-9c378c9d1913\") " pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.272082 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cab6fd53-f170-4c86-b5eb-3590e593077e","Type":"ContainerStarted","Data":"cbba75c95ef2c8b19267cb12a50564144e1b272035837c9a7bfefb493b93977d"} Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.275325 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-587979d76d-qg8cs" event={"ID":"f967c09a-49f5-4a4c-a1a9-7bb2da157132","Type":"ContainerStarted","Data":"458781d0a4bccfcc7f57ea793253b6177a97e2b88c696641ce09fe6e28923fc2"} Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.417956 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:21 crc kubenswrapper[4975]: I0318 12:34:21.923469 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f8786c66f-vjkt4"] Mar 18 12:34:22 crc kubenswrapper[4975]: I0318 12:34:22.298703 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cab6fd53-f170-4c86-b5eb-3590e593077e","Type":"ContainerStarted","Data":"89355303cae12366ce92616e4a15c650c55c9e68f847c6ad92d3f445b2cb1d23"} Mar 18 12:34:22 crc kubenswrapper[4975]: I0318 12:34:22.299062 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cab6fd53-f170-4c86-b5eb-3590e593077e","Type":"ContainerStarted","Data":"80b75050c0f0e18d1719cd234447641097a01bc47df078532bbb63139fc9fa89"} Mar 18 12:34:22 crc kubenswrapper[4975]: I0318 12:34:22.304479 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f8786c66f-vjkt4" event={"ID":"b782f13d-9abe-4abe-a47e-9c378c9d1913","Type":"ContainerStarted","Data":"07db61717482d98326d002b31edb4c364c712e476e3290ab188164137125335e"} Mar 18 12:34:22 crc kubenswrapper[4975]: I0318 12:34:22.304527 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f8786c66f-vjkt4" event={"ID":"b782f13d-9abe-4abe-a47e-9c378c9d1913","Type":"ContainerStarted","Data":"f1cc00630c09828ac5cccccd6abff5bd13320ef33b423ba2c3a715f6be7613cd"} Mar 18 12:34:22 crc kubenswrapper[4975]: I0318 12:34:22.305433 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:22 crc kubenswrapper[4975]: I0318 12:34:22.308329 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-587979d76d-qg8cs" event={"ID":"f967c09a-49f5-4a4c-a1a9-7bb2da157132","Type":"ContainerStarted","Data":"4b03eed16205c65408f7dc224e2fd5897758a55b75be64f724d52b8b56190a6d"} Mar 18 12:34:22 crc 
kubenswrapper[4975]: I0318 12:34:22.308552 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:34:22 crc kubenswrapper[4975]: I0318 12:34:22.308576 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:34:22 crc kubenswrapper[4975]: I0318 12:34:22.324540 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.324516552 podStartE2EDuration="5.324516552s" podCreationTimestamp="2026-03-18 12:34:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:22.319605187 +0000 UTC m=+1448.034005776" watchObservedRunningTime="2026-03-18 12:34:22.324516552 +0000 UTC m=+1448.038917141" Mar 18 12:34:22 crc kubenswrapper[4975]: I0318 12:34:22.373697 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-f8786c66f-vjkt4" podStartSLOduration=2.373666986 podStartE2EDuration="2.373666986s" podCreationTimestamp="2026-03-18 12:34:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:22.343010552 +0000 UTC m=+1448.057411151" watchObservedRunningTime="2026-03-18 12:34:22.373666986 +0000 UTC m=+1448.088067575" Mar 18 12:34:22 crc kubenswrapper[4975]: I0318 12:34:22.410763 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-587979d76d-qg8cs" podStartSLOduration=5.410743847 podStartE2EDuration="5.410743847s" podCreationTimestamp="2026-03-18 12:34:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:22.375563868 +0000 UTC m=+1448.089964447" watchObservedRunningTime="2026-03-18 
12:34:22.410743847 +0000 UTC m=+1448.125144426" Mar 18 12:34:24 crc kubenswrapper[4975]: I0318 12:34:24.342072 4975 generic.go:334] "Generic (PLEG): container finished" podID="4f054ebc-151c-4e89-8242-3837c9bee6b2" containerID="77eec9a0c105490c29fb8bed80b81fecf8fa74501cf1bb4d42fcb5f966879a84" exitCode=0 Mar 18 12:34:24 crc kubenswrapper[4975]: I0318 12:34:24.342169 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-psw5k" event={"ID":"4f054ebc-151c-4e89-8242-3837c9bee6b2","Type":"ContainerDied","Data":"77eec9a0c105490c29fb8bed80b81fecf8fa74501cf1bb4d42fcb5f966879a84"} Mar 18 12:34:24 crc kubenswrapper[4975]: I0318 12:34:24.348366 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-m6lmd" event={"ID":"94f94b61-6738-4ab8-a65f-0d6cf4d86be1","Type":"ContainerStarted","Data":"d7fd08ea9e8af2fce4a2d69ad6eefacff16e67cd19e08cf851e60b6cc4dbdb8f"} Mar 18 12:34:24 crc kubenswrapper[4975]: I0318 12:34:24.385719 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-m6lmd" podStartSLOduration=2.757114407 podStartE2EDuration="47.385704401s" podCreationTimestamp="2026-03-18 12:33:37 +0000 UTC" firstStartedPulling="2026-03-18 12:33:38.962655936 +0000 UTC m=+1404.677056515" lastFinishedPulling="2026-03-18 12:34:23.59124593 +0000 UTC m=+1449.305646509" observedRunningTime="2026-03-18 12:34:24.381183927 +0000 UTC m=+1450.095584526" watchObservedRunningTime="2026-03-18 12:34:24.385704401 +0000 UTC m=+1450.100104980" Mar 18 12:34:26 crc kubenswrapper[4975]: I0318 12:34:26.681636 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-c478d4794-x2t7q" podUID="ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused" Mar 18 12:34:26 crc kubenswrapper[4975]: I0318 12:34:26.759650 4975 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-56bcd48494-744wv" podUID="939992e3-94eb-4c98-a493-e30321c7f81a" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.155:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.155:8443: connect: connection refused" Mar 18 12:34:27 crc kubenswrapper[4975]: I0318 12:34:27.537465 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 12:34:27 crc kubenswrapper[4975]: I0318 12:34:27.537523 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 12:34:27 crc kubenswrapper[4975]: I0318 12:34:27.575555 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 12:34:27 crc kubenswrapper[4975]: I0318 12:34:27.582546 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 12:34:28 crc kubenswrapper[4975]: I0318 12:34:28.399680 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 12:34:28 crc kubenswrapper[4975]: I0318 12:34:28.400086 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 12:34:29 crc kubenswrapper[4975]: I0318 12:34:29.408468 4975 generic.go:334] "Generic (PLEG): container finished" podID="94f94b61-6738-4ab8-a65f-0d6cf4d86be1" containerID="d7fd08ea9e8af2fce4a2d69ad6eefacff16e67cd19e08cf851e60b6cc4dbdb8f" exitCode=0 Mar 18 12:34:29 crc kubenswrapper[4975]: I0318 12:34:29.408569 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-m6lmd" event={"ID":"94f94b61-6738-4ab8-a65f-0d6cf4d86be1","Type":"ContainerDied","Data":"d7fd08ea9e8af2fce4a2d69ad6eefacff16e67cd19e08cf851e60b6cc4dbdb8f"} Mar 18 12:34:29 crc 
kubenswrapper[4975]: I0318 12:34:29.564756 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-psw5k" Mar 18 12:34:29 crc kubenswrapper[4975]: I0318 12:34:29.589859 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4f054ebc-151c-4e89-8242-3837c9bee6b2-db-sync-config-data\") pod \"4f054ebc-151c-4e89-8242-3837c9bee6b2\" (UID: \"4f054ebc-151c-4e89-8242-3837c9bee6b2\") " Mar 18 12:34:29 crc kubenswrapper[4975]: I0318 12:34:29.589951 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f054ebc-151c-4e89-8242-3837c9bee6b2-combined-ca-bundle\") pod \"4f054ebc-151c-4e89-8242-3837c9bee6b2\" (UID: \"4f054ebc-151c-4e89-8242-3837c9bee6b2\") " Mar 18 12:34:29 crc kubenswrapper[4975]: I0318 12:34:29.590026 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmps4\" (UniqueName: \"kubernetes.io/projected/4f054ebc-151c-4e89-8242-3837c9bee6b2-kube-api-access-cmps4\") pod \"4f054ebc-151c-4e89-8242-3837c9bee6b2\" (UID: \"4f054ebc-151c-4e89-8242-3837c9bee6b2\") " Mar 18 12:34:29 crc kubenswrapper[4975]: I0318 12:34:29.593275 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f054ebc-151c-4e89-8242-3837c9bee6b2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4f054ebc-151c-4e89-8242-3837c9bee6b2" (UID: "4f054ebc-151c-4e89-8242-3837c9bee6b2"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:29 crc kubenswrapper[4975]: I0318 12:34:29.598155 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f054ebc-151c-4e89-8242-3837c9bee6b2-kube-api-access-cmps4" (OuterVolumeSpecName: "kube-api-access-cmps4") pod "4f054ebc-151c-4e89-8242-3837c9bee6b2" (UID: "4f054ebc-151c-4e89-8242-3837c9bee6b2"). InnerVolumeSpecName "kube-api-access-cmps4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:29 crc kubenswrapper[4975]: I0318 12:34:29.630644 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f054ebc-151c-4e89-8242-3837c9bee6b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f054ebc-151c-4e89-8242-3837c9bee6b2" (UID: "4f054ebc-151c-4e89-8242-3837c9bee6b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:29 crc kubenswrapper[4975]: I0318 12:34:29.693017 4975 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4f054ebc-151c-4e89-8242-3837c9bee6b2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:29 crc kubenswrapper[4975]: I0318 12:34:29.693046 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f054ebc-151c-4e89-8242-3837c9bee6b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:29 crc kubenswrapper[4975]: I0318 12:34:29.693056 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmps4\" (UniqueName: \"kubernetes.io/projected/4f054ebc-151c-4e89-8242-3837c9bee6b2-kube-api-access-cmps4\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:29 crc kubenswrapper[4975]: E0318 12:34:29.841741 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: 
\"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="b8ef7f51-157e-4b7c-95ce-7d8655c5c78d" Mar 18 12:34:30 crc kubenswrapper[4975]: I0318 12:34:30.420465 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d","Type":"ContainerStarted","Data":"4f26d7bc2d1f3977fcd48b5807ec9b5db821a4dc27c43c301cc23b4435cf1e22"} Mar 18 12:34:30 crc kubenswrapper[4975]: I0318 12:34:30.421795 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 12:34:30 crc kubenswrapper[4975]: I0318 12:34:30.421161 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b8ef7f51-157e-4b7c-95ce-7d8655c5c78d" containerName="proxy-httpd" containerID="cri-o://4f26d7bc2d1f3977fcd48b5807ec9b5db821a4dc27c43c301cc23b4435cf1e22" gracePeriod=30 Mar 18 12:34:30 crc kubenswrapper[4975]: I0318 12:34:30.420573 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b8ef7f51-157e-4b7c-95ce-7d8655c5c78d" containerName="ceilometer-notification-agent" containerID="cri-o://396b11744970cd6c2c092e95db41412f4b8ec80cdf0ce7bb8ef59f504a8da82f" gracePeriod=30 Mar 18 12:34:30 crc kubenswrapper[4975]: I0318 12:34:30.421180 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b8ef7f51-157e-4b7c-95ce-7d8655c5c78d" containerName="sg-core" containerID="cri-o://464c41bcddd5358fa1a0c9a1917fd725eb23ed0f655757dc0c40512e0b95419f" gracePeriod=30 Mar 18 12:34:30 crc kubenswrapper[4975]: I0318 12:34:30.424187 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-psw5k" Mar 18 12:34:30 crc kubenswrapper[4975]: I0318 12:34:30.428573 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-psw5k" event={"ID":"4f054ebc-151c-4e89-8242-3837c9bee6b2","Type":"ContainerDied","Data":"c345efd1145a32559e8c7bd484810752d4faa0548998ff6bb04f08c959f06316"} Mar 18 12:34:30 crc kubenswrapper[4975]: I0318 12:34:30.428614 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c345efd1145a32559e8c7bd484810752d4faa0548998ff6bb04f08c959f06316" Mar 18 12:34:30 crc kubenswrapper[4975]: I0318 12:34:30.610168 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 12:34:30 crc kubenswrapper[4975]: I0318 12:34:30.610322 4975 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 12:34:30 crc kubenswrapper[4975]: I0318 12:34:30.624776 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 12:34:30 crc kubenswrapper[4975]: I0318 12:34:30.886293 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6744587899-pzwjz"] Mar 18 12:34:30 crc kubenswrapper[4975]: E0318 12:34:30.886969 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f054ebc-151c-4e89-8242-3837c9bee6b2" containerName="barbican-db-sync" Mar 18 12:34:30 crc kubenswrapper[4975]: I0318 12:34:30.886987 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f054ebc-151c-4e89-8242-3837c9bee6b2" containerName="barbican-db-sync" Mar 18 12:34:30 crc kubenswrapper[4975]: I0318 12:34:30.887215 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f054ebc-151c-4e89-8242-3837c9bee6b2" containerName="barbican-db-sync" Mar 18 12:34:30 crc kubenswrapper[4975]: I0318 12:34:30.888063 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6744587899-pzwjz" Mar 18 12:34:30 crc kubenswrapper[4975]: I0318 12:34:30.899950 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-m6lmd" Mar 18 12:34:30 crc kubenswrapper[4975]: I0318 12:34:30.907788 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 18 12:34:30 crc kubenswrapper[4975]: I0318 12:34:30.908107 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 18 12:34:30 crc kubenswrapper[4975]: I0318 12:34:30.908272 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-pjgrx" Mar 18 12:34:30 crc kubenswrapper[4975]: I0318 12:34:30.944860 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6744587899-pzwjz"] Mar 18 12:34:30 crc kubenswrapper[4975]: I0318 12:34:30.999761 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6fc58d8444-6ln2h"] Mar 18 12:34:31 crc kubenswrapper[4975]: E0318 12:34:31.001346 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94f94b61-6738-4ab8-a65f-0d6cf4d86be1" containerName="cinder-db-sync" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.001375 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="94f94b61-6738-4ab8-a65f-0d6cf4d86be1" containerName="cinder-db-sync" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.001610 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="94f94b61-6738-4ab8-a65f-0d6cf4d86be1" containerName="cinder-db-sync" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.002768 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6fc58d8444-6ln2h" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.017204 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.030718 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-db-sync-config-data\") pod \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\" (UID: \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\") " Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.030791 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64g7s\" (UniqueName: \"kubernetes.io/projected/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-kube-api-access-64g7s\") pod \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\" (UID: \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\") " Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.030979 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-config-data\") pod \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\" (UID: \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\") " Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.031031 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-scripts\") pod \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\" (UID: \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\") " Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.031125 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-etc-machine-id\") pod \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\" (UID: 
\"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\") " Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.031180 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-combined-ca-bundle\") pod \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\" (UID: \"94f94b61-6738-4ab8-a65f-0d6cf4d86be1\") " Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.032623 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "94f94b61-6738-4ab8-a65f-0d6cf4d86be1" (UID: "94f94b61-6738-4ab8-a65f-0d6cf4d86be1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.033795 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e530724-5f58-4bbc-9b9a-624a4565ab21-logs\") pod \"barbican-worker-6744587899-pzwjz\" (UID: \"4e530724-5f58-4bbc-9b9a-624a4565ab21\") " pod="openstack/barbican-worker-6744587899-pzwjz" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.033874 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfgc7\" (UniqueName: \"kubernetes.io/projected/4e530724-5f58-4bbc-9b9a-624a4565ab21-kube-api-access-bfgc7\") pod \"barbican-worker-6744587899-pzwjz\" (UID: \"4e530724-5f58-4bbc-9b9a-624a4565ab21\") " pod="openstack/barbican-worker-6744587899-pzwjz" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.033939 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e530724-5f58-4bbc-9b9a-624a4565ab21-config-data-custom\") pod \"barbican-worker-6744587899-pzwjz\" (UID: 
\"4e530724-5f58-4bbc-9b9a-624a4565ab21\") " pod="openstack/barbican-worker-6744587899-pzwjz" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.034054 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e530724-5f58-4bbc-9b9a-624a4565ab21-config-data\") pod \"barbican-worker-6744587899-pzwjz\" (UID: \"4e530724-5f58-4bbc-9b9a-624a4565ab21\") " pod="openstack/barbican-worker-6744587899-pzwjz" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.034110 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e530724-5f58-4bbc-9b9a-624a4565ab21-combined-ca-bundle\") pod \"barbican-worker-6744587899-pzwjz\" (UID: \"4e530724-5f58-4bbc-9b9a-624a4565ab21\") " pod="openstack/barbican-worker-6744587899-pzwjz" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.044319 4975 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.050224 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-kube-api-access-64g7s" (OuterVolumeSpecName: "kube-api-access-64g7s") pod "94f94b61-6738-4ab8-a65f-0d6cf4d86be1" (UID: "94f94b61-6738-4ab8-a65f-0d6cf4d86be1"). InnerVolumeSpecName "kube-api-access-64g7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.076131 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-scripts" (OuterVolumeSpecName: "scripts") pod "94f94b61-6738-4ab8-a65f-0d6cf4d86be1" (UID: "94f94b61-6738-4ab8-a65f-0d6cf4d86be1"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.076538 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6fc58d8444-6ln2h"] Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.085487 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "94f94b61-6738-4ab8-a65f-0d6cf4d86be1" (UID: "94f94b61-6738-4ab8-a65f-0d6cf4d86be1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.127253 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tqv2s"] Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.138708 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-tqv2s" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.145894 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9hjf\" (UniqueName: \"kubernetes.io/projected/0d8217c7-38b9-4717-a965-4a408d31fdc6-kube-api-access-n9hjf\") pod \"barbican-keystone-listener-6fc58d8444-6ln2h\" (UID: \"0d8217c7-38b9-4717-a965-4a408d31fdc6\") " pod="openstack/barbican-keystone-listener-6fc58d8444-6ln2h" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.145963 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e530724-5f58-4bbc-9b9a-624a4565ab21-logs\") pod \"barbican-worker-6744587899-pzwjz\" (UID: \"4e530724-5f58-4bbc-9b9a-624a4565ab21\") " pod="openstack/barbican-worker-6744587899-pzwjz" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.145997 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bfgc7\" (UniqueName: \"kubernetes.io/projected/4e530724-5f58-4bbc-9b9a-624a4565ab21-kube-api-access-bfgc7\") pod \"barbican-worker-6744587899-pzwjz\" (UID: \"4e530724-5f58-4bbc-9b9a-624a4565ab21\") " pod="openstack/barbican-worker-6744587899-pzwjz" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.146030 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e530724-5f58-4bbc-9b9a-624a4565ab21-config-data-custom\") pod \"barbican-worker-6744587899-pzwjz\" (UID: \"4e530724-5f58-4bbc-9b9a-624a4565ab21\") " pod="openstack/barbican-worker-6744587899-pzwjz" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.146087 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d8217c7-38b9-4717-a965-4a408d31fdc6-config-data\") pod \"barbican-keystone-listener-6fc58d8444-6ln2h\" (UID: \"0d8217c7-38b9-4717-a965-4a408d31fdc6\") " pod="openstack/barbican-keystone-listener-6fc58d8444-6ln2h" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.146117 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d8217c7-38b9-4717-a965-4a408d31fdc6-config-data-custom\") pod \"barbican-keystone-listener-6fc58d8444-6ln2h\" (UID: \"0d8217c7-38b9-4717-a965-4a408d31fdc6\") " pod="openstack/barbican-keystone-listener-6fc58d8444-6ln2h" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.146147 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e530724-5f58-4bbc-9b9a-624a4565ab21-config-data\") pod \"barbican-worker-6744587899-pzwjz\" (UID: \"4e530724-5f58-4bbc-9b9a-624a4565ab21\") " pod="openstack/barbican-worker-6744587899-pzwjz" Mar 18 12:34:31 crc 
kubenswrapper[4975]: I0318 12:34:31.146188 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e530724-5f58-4bbc-9b9a-624a4565ab21-combined-ca-bundle\") pod \"barbican-worker-6744587899-pzwjz\" (UID: \"4e530724-5f58-4bbc-9b9a-624a4565ab21\") " pod="openstack/barbican-worker-6744587899-pzwjz" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.146203 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d8217c7-38b9-4717-a965-4a408d31fdc6-logs\") pod \"barbican-keystone-listener-6fc58d8444-6ln2h\" (UID: \"0d8217c7-38b9-4717-a965-4a408d31fdc6\") " pod="openstack/barbican-keystone-listener-6fc58d8444-6ln2h" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.146222 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d8217c7-38b9-4717-a965-4a408d31fdc6-combined-ca-bundle\") pod \"barbican-keystone-listener-6fc58d8444-6ln2h\" (UID: \"0d8217c7-38b9-4717-a965-4a408d31fdc6\") " pod="openstack/barbican-keystone-listener-6fc58d8444-6ln2h" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.146282 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64g7s\" (UniqueName: \"kubernetes.io/projected/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-kube-api-access-64g7s\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.146295 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.146304 4975 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.146635 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e530724-5f58-4bbc-9b9a-624a4565ab21-logs\") pod \"barbican-worker-6744587899-pzwjz\" (UID: \"4e530724-5f58-4bbc-9b9a-624a4565ab21\") " pod="openstack/barbican-worker-6744587899-pzwjz" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.163833 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e530724-5f58-4bbc-9b9a-624a4565ab21-combined-ca-bundle\") pod \"barbican-worker-6744587899-pzwjz\" (UID: \"4e530724-5f58-4bbc-9b9a-624a4565ab21\") " pod="openstack/barbican-worker-6744587899-pzwjz" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.170017 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e530724-5f58-4bbc-9b9a-624a4565ab21-config-data\") pod \"barbican-worker-6744587899-pzwjz\" (UID: \"4e530724-5f58-4bbc-9b9a-624a4565ab21\") " pod="openstack/barbican-worker-6744587899-pzwjz" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.196837 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e530724-5f58-4bbc-9b9a-624a4565ab21-config-data-custom\") pod \"barbican-worker-6744587899-pzwjz\" (UID: \"4e530724-5f58-4bbc-9b9a-624a4565ab21\") " pod="openstack/barbican-worker-6744587899-pzwjz" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.198438 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tqv2s"] Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.198495 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfgc7\" (UniqueName: 
\"kubernetes.io/projected/4e530724-5f58-4bbc-9b9a-624a4565ab21-kube-api-access-bfgc7\") pod \"barbican-worker-6744587899-pzwjz\" (UID: \"4e530724-5f58-4bbc-9b9a-624a4565ab21\") " pod="openstack/barbican-worker-6744587899-pzwjz" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.204028 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94f94b61-6738-4ab8-a65f-0d6cf4d86be1" (UID: "94f94b61-6738-4ab8-a65f-0d6cf4d86be1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.215612 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6744587899-pzwjz" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.247453 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9hjf\" (UniqueName: \"kubernetes.io/projected/0d8217c7-38b9-4717-a965-4a408d31fdc6-kube-api-access-n9hjf\") pod \"barbican-keystone-listener-6fc58d8444-6ln2h\" (UID: \"0d8217c7-38b9-4717-a965-4a408d31fdc6\") " pod="openstack/barbican-keystone-listener-6fc58d8444-6ln2h" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.247835 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-tqv2s\" (UID: \"41ef5b42-ab65-428e-8278-408d868b0afa\") " pod="openstack/dnsmasq-dns-85ff748b95-tqv2s" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.262439 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6dbdc7b4b-htzr4"] Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.264154 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6dbdc7b4b-htzr4" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.265629 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-config-data" (OuterVolumeSpecName: "config-data") pod "94f94b61-6738-4ab8-a65f-0d6cf4d86be1" (UID: "94f94b61-6738-4ab8-a65f-0d6cf4d86be1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.266080 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-tqv2s\" (UID: \"41ef5b42-ab65-428e-8278-408d868b0afa\") " pod="openstack/dnsmasq-dns-85ff748b95-tqv2s" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.266191 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d8217c7-38b9-4717-a965-4a408d31fdc6-config-data\") pod \"barbican-keystone-listener-6fc58d8444-6ln2h\" (UID: \"0d8217c7-38b9-4717-a965-4a408d31fdc6\") " pod="openstack/barbican-keystone-listener-6fc58d8444-6ln2h" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.266310 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d8217c7-38b9-4717-a965-4a408d31fdc6-config-data-custom\") pod \"barbican-keystone-listener-6fc58d8444-6ln2h\" (UID: \"0d8217c7-38b9-4717-a965-4a408d31fdc6\") " pod="openstack/barbican-keystone-listener-6fc58d8444-6ln2h" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.266411 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-dns-svc\") pod \"dnsmasq-dns-85ff748b95-tqv2s\" (UID: \"41ef5b42-ab65-428e-8278-408d868b0afa\") " pod="openstack/dnsmasq-dns-85ff748b95-tqv2s" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.266468 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmnfj\" (UniqueName: \"kubernetes.io/projected/41ef5b42-ab65-428e-8278-408d868b0afa-kube-api-access-kmnfj\") pod \"dnsmasq-dns-85ff748b95-tqv2s\" (UID: \"41ef5b42-ab65-428e-8278-408d868b0afa\") " pod="openstack/dnsmasq-dns-85ff748b95-tqv2s" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.266589 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d8217c7-38b9-4717-a965-4a408d31fdc6-logs\") pod \"barbican-keystone-listener-6fc58d8444-6ln2h\" (UID: \"0d8217c7-38b9-4717-a965-4a408d31fdc6\") " pod="openstack/barbican-keystone-listener-6fc58d8444-6ln2h" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.266649 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d8217c7-38b9-4717-a965-4a408d31fdc6-combined-ca-bundle\") pod \"barbican-keystone-listener-6fc58d8444-6ln2h\" (UID: \"0d8217c7-38b9-4717-a965-4a408d31fdc6\") " pod="openstack/barbican-keystone-listener-6fc58d8444-6ln2h" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.266670 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-config\") pod \"dnsmasq-dns-85ff748b95-tqv2s\" (UID: \"41ef5b42-ab65-428e-8278-408d868b0afa\") " pod="openstack/dnsmasq-dns-85ff748b95-tqv2s" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.266720 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-tqv2s\" (UID: \"41ef5b42-ab65-428e-8278-408d868b0afa\") " pod="openstack/dnsmasq-dns-85ff748b95-tqv2s" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.267050 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.267068 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94f94b61-6738-4ab8-a65f-0d6cf4d86be1-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.267550 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d8217c7-38b9-4717-a965-4a408d31fdc6-logs\") pod \"barbican-keystone-listener-6fc58d8444-6ln2h\" (UID: \"0d8217c7-38b9-4717-a965-4a408d31fdc6\") " pod="openstack/barbican-keystone-listener-6fc58d8444-6ln2h" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.271832 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d8217c7-38b9-4717-a965-4a408d31fdc6-combined-ca-bundle\") pod \"barbican-keystone-listener-6fc58d8444-6ln2h\" (UID: \"0d8217c7-38b9-4717-a965-4a408d31fdc6\") " pod="openstack/barbican-keystone-listener-6fc58d8444-6ln2h" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.271973 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d8217c7-38b9-4717-a965-4a408d31fdc6-config-data\") pod \"barbican-keystone-listener-6fc58d8444-6ln2h\" (UID: \"0d8217c7-38b9-4717-a965-4a408d31fdc6\") " 
pod="openstack/barbican-keystone-listener-6fc58d8444-6ln2h" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.275294 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.278250 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d8217c7-38b9-4717-a965-4a408d31fdc6-config-data-custom\") pod \"barbican-keystone-listener-6fc58d8444-6ln2h\" (UID: \"0d8217c7-38b9-4717-a965-4a408d31fdc6\") " pod="openstack/barbican-keystone-listener-6fc58d8444-6ln2h" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.306155 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6dbdc7b4b-htzr4"] Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.320498 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9hjf\" (UniqueName: \"kubernetes.io/projected/0d8217c7-38b9-4717-a965-4a408d31fdc6-kube-api-access-n9hjf\") pod \"barbican-keystone-listener-6fc58d8444-6ln2h\" (UID: \"0d8217c7-38b9-4717-a965-4a408d31fdc6\") " pod="openstack/barbican-keystone-listener-6fc58d8444-6ln2h" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.354494 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6fc58d8444-6ln2h" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.368518 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-tqv2s\" (UID: \"41ef5b42-ab65-428e-8278-408d868b0afa\") " pod="openstack/dnsmasq-dns-85ff748b95-tqv2s" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.368606 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bae225c-6f49-427e-b276-8b0ffdff525f-config-data-custom\") pod \"barbican-api-6dbdc7b4b-htzr4\" (UID: \"5bae225c-6f49-427e-b276-8b0ffdff525f\") " pod="openstack/barbican-api-6dbdc7b4b-htzr4" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.368638 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdj4g\" (UniqueName: \"kubernetes.io/projected/5bae225c-6f49-427e-b276-8b0ffdff525f-kube-api-access-wdj4g\") pod \"barbican-api-6dbdc7b4b-htzr4\" (UID: \"5bae225c-6f49-427e-b276-8b0ffdff525f\") " pod="openstack/barbican-api-6dbdc7b4b-htzr4" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.368679 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bae225c-6f49-427e-b276-8b0ffdff525f-config-data\") pod \"barbican-api-6dbdc7b4b-htzr4\" (UID: \"5bae225c-6f49-427e-b276-8b0ffdff525f\") " pod="openstack/barbican-api-6dbdc7b4b-htzr4" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.368710 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-ovsdbserver-nb\") pod 
\"dnsmasq-dns-85ff748b95-tqv2s\" (UID: \"41ef5b42-ab65-428e-8278-408d868b0afa\") " pod="openstack/dnsmasq-dns-85ff748b95-tqv2s" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.368758 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bae225c-6f49-427e-b276-8b0ffdff525f-combined-ca-bundle\") pod \"barbican-api-6dbdc7b4b-htzr4\" (UID: \"5bae225c-6f49-427e-b276-8b0ffdff525f\") " pod="openstack/barbican-api-6dbdc7b4b-htzr4" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.368812 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-dns-svc\") pod \"dnsmasq-dns-85ff748b95-tqv2s\" (UID: \"41ef5b42-ab65-428e-8278-408d868b0afa\") " pod="openstack/dnsmasq-dns-85ff748b95-tqv2s" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.368841 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bae225c-6f49-427e-b276-8b0ffdff525f-logs\") pod \"barbican-api-6dbdc7b4b-htzr4\" (UID: \"5bae225c-6f49-427e-b276-8b0ffdff525f\") " pod="openstack/barbican-api-6dbdc7b4b-htzr4" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.369145 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmnfj\" (UniqueName: \"kubernetes.io/projected/41ef5b42-ab65-428e-8278-408d868b0afa-kube-api-access-kmnfj\") pod \"dnsmasq-dns-85ff748b95-tqv2s\" (UID: \"41ef5b42-ab65-428e-8278-408d868b0afa\") " pod="openstack/dnsmasq-dns-85ff748b95-tqv2s" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.369227 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-config\") pod \"dnsmasq-dns-85ff748b95-tqv2s\" (UID: 
\"41ef5b42-ab65-428e-8278-408d868b0afa\") " pod="openstack/dnsmasq-dns-85ff748b95-tqv2s" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.369268 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-tqv2s\" (UID: \"41ef5b42-ab65-428e-8278-408d868b0afa\") " pod="openstack/dnsmasq-dns-85ff748b95-tqv2s" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.369772 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-tqv2s\" (UID: \"41ef5b42-ab65-428e-8278-408d868b0afa\") " pod="openstack/dnsmasq-dns-85ff748b95-tqv2s" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.370322 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-tqv2s\" (UID: \"41ef5b42-ab65-428e-8278-408d868b0afa\") " pod="openstack/dnsmasq-dns-85ff748b95-tqv2s" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.370408 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-dns-svc\") pod \"dnsmasq-dns-85ff748b95-tqv2s\" (UID: \"41ef5b42-ab65-428e-8278-408d868b0afa\") " pod="openstack/dnsmasq-dns-85ff748b95-tqv2s" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.371103 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-tqv2s\" (UID: \"41ef5b42-ab65-428e-8278-408d868b0afa\") " 
pod="openstack/dnsmasq-dns-85ff748b95-tqv2s" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.371308 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-config\") pod \"dnsmasq-dns-85ff748b95-tqv2s\" (UID: \"41ef5b42-ab65-428e-8278-408d868b0afa\") " pod="openstack/dnsmasq-dns-85ff748b95-tqv2s" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.406939 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmnfj\" (UniqueName: \"kubernetes.io/projected/41ef5b42-ab65-428e-8278-408d868b0afa-kube-api-access-kmnfj\") pod \"dnsmasq-dns-85ff748b95-tqv2s\" (UID: \"41ef5b42-ab65-428e-8278-408d868b0afa\") " pod="openstack/dnsmasq-dns-85ff748b95-tqv2s" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.468529 4975 generic.go:334] "Generic (PLEG): container finished" podID="b8ef7f51-157e-4b7c-95ce-7d8655c5c78d" containerID="4f26d7bc2d1f3977fcd48b5807ec9b5db821a4dc27c43c301cc23b4435cf1e22" exitCode=0 Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.468828 4975 generic.go:334] "Generic (PLEG): container finished" podID="b8ef7f51-157e-4b7c-95ce-7d8655c5c78d" containerID="464c41bcddd5358fa1a0c9a1917fd725eb23ed0f655757dc0c40512e0b95419f" exitCode=2 Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.468962 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d","Type":"ContainerDied","Data":"4f26d7bc2d1f3977fcd48b5807ec9b5db821a4dc27c43c301cc23b4435cf1e22"} Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.468996 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d","Type":"ContainerDied","Data":"464c41bcddd5358fa1a0c9a1917fd725eb23ed0f655757dc0c40512e0b95419f"} Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.470897 4975 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bae225c-6f49-427e-b276-8b0ffdff525f-config-data-custom\") pod \"barbican-api-6dbdc7b4b-htzr4\" (UID: \"5bae225c-6f49-427e-b276-8b0ffdff525f\") " pod="openstack/barbican-api-6dbdc7b4b-htzr4" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.470943 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdj4g\" (UniqueName: \"kubernetes.io/projected/5bae225c-6f49-427e-b276-8b0ffdff525f-kube-api-access-wdj4g\") pod \"barbican-api-6dbdc7b4b-htzr4\" (UID: \"5bae225c-6f49-427e-b276-8b0ffdff525f\") " pod="openstack/barbican-api-6dbdc7b4b-htzr4" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.470984 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bae225c-6f49-427e-b276-8b0ffdff525f-config-data\") pod \"barbican-api-6dbdc7b4b-htzr4\" (UID: \"5bae225c-6f49-427e-b276-8b0ffdff525f\") " pod="openstack/barbican-api-6dbdc7b4b-htzr4" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.471030 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bae225c-6f49-427e-b276-8b0ffdff525f-combined-ca-bundle\") pod \"barbican-api-6dbdc7b4b-htzr4\" (UID: \"5bae225c-6f49-427e-b276-8b0ffdff525f\") " pod="openstack/barbican-api-6dbdc7b4b-htzr4" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.471080 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bae225c-6f49-427e-b276-8b0ffdff525f-logs\") pod \"barbican-api-6dbdc7b4b-htzr4\" (UID: \"5bae225c-6f49-427e-b276-8b0ffdff525f\") " pod="openstack/barbican-api-6dbdc7b4b-htzr4" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.471489 4975 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bae225c-6f49-427e-b276-8b0ffdff525f-logs\") pod \"barbican-api-6dbdc7b4b-htzr4\" (UID: \"5bae225c-6f49-427e-b276-8b0ffdff525f\") " pod="openstack/barbican-api-6dbdc7b4b-htzr4" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.479335 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-m6lmd" event={"ID":"94f94b61-6738-4ab8-a65f-0d6cf4d86be1","Type":"ContainerDied","Data":"9ac6e2a53ea68ab3170f3fd8b946e7df4179f5d76279d1aa19efc51ee1c7d350"} Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.479380 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ac6e2a53ea68ab3170f3fd8b946e7df4179f5d76279d1aa19efc51ee1c7d350" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.479525 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-m6lmd" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.497114 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bae225c-6f49-427e-b276-8b0ffdff525f-config-data\") pod \"barbican-api-6dbdc7b4b-htzr4\" (UID: \"5bae225c-6f49-427e-b276-8b0ffdff525f\") " pod="openstack/barbican-api-6dbdc7b4b-htzr4" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.498418 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bae225c-6f49-427e-b276-8b0ffdff525f-combined-ca-bundle\") pod \"barbican-api-6dbdc7b4b-htzr4\" (UID: \"5bae225c-6f49-427e-b276-8b0ffdff525f\") " pod="openstack/barbican-api-6dbdc7b4b-htzr4" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.502473 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bae225c-6f49-427e-b276-8b0ffdff525f-config-data-custom\") pod \"barbican-api-6dbdc7b4b-htzr4\" (UID: 
\"5bae225c-6f49-427e-b276-8b0ffdff525f\") " pod="openstack/barbican-api-6dbdc7b4b-htzr4" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.506116 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-tqv2s" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.518607 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdj4g\" (UniqueName: \"kubernetes.io/projected/5bae225c-6f49-427e-b276-8b0ffdff525f-kube-api-access-wdj4g\") pod \"barbican-api-6dbdc7b4b-htzr4\" (UID: \"5bae225c-6f49-427e-b276-8b0ffdff525f\") " pod="openstack/barbican-api-6dbdc7b4b-htzr4" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.628321 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6dbdc7b4b-htzr4" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.737807 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.739775 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.759735 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.760006 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.760142 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.760312 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-hwdsz" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.806332 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.865554 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tqv2s"] Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.886962 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\") " pod="openstack/cinder-scheduler-0" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.887034 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-scripts\") pod \"cinder-scheduler-0\" (UID: \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\") " pod="openstack/cinder-scheduler-0" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.887074 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv9vv\" 
(UniqueName: \"kubernetes.io/projected/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-kube-api-access-vv9vv\") pod \"cinder-scheduler-0\" (UID: \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\") " pod="openstack/cinder-scheduler-0" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.887154 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-config-data\") pod \"cinder-scheduler-0\" (UID: \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\") " pod="openstack/cinder-scheduler-0" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.887203 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\") " pod="openstack/cinder-scheduler-0" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.887393 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\") " pod="openstack/cinder-scheduler-0" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.989656 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\") " pod="openstack/cinder-scheduler-0" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.989733 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\") " pod="openstack/cinder-scheduler-0" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.989767 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-scripts\") pod \"cinder-scheduler-0\" (UID: \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\") " pod="openstack/cinder-scheduler-0" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.989797 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv9vv\" (UniqueName: \"kubernetes.io/projected/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-kube-api-access-vv9vv\") pod \"cinder-scheduler-0\" (UID: \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\") " pod="openstack/cinder-scheduler-0" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.989838 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-config-data\") pod \"cinder-scheduler-0\" (UID: \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\") " pod="openstack/cinder-scheduler-0" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.989865 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\") " pod="openstack/cinder-scheduler-0" Mar 18 12:34:31 crc kubenswrapper[4975]: I0318 12:34:31.996522 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\") " 
pod="openstack/cinder-scheduler-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.006767 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\") " pod="openstack/cinder-scheduler-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.023597 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-scripts\") pod \"cinder-scheduler-0\" (UID: \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\") " pod="openstack/cinder-scheduler-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.025212 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv9vv\" (UniqueName: \"kubernetes.io/projected/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-kube-api-access-vv9vv\") pod \"cinder-scheduler-0\" (UID: \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\") " pod="openstack/cinder-scheduler-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.025236 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-6427m"] Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.042142 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.047977 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-6427m"] Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.051260 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-config-data\") pod \"cinder-scheduler-0\" (UID: \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\") " pod="openstack/cinder-scheduler-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.059724 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\") " pod="openstack/cinder-scheduler-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.078048 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.079463 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.092430 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.100837 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.137391 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.194398 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzvbl\" (UniqueName: \"kubernetes.io/projected/028f5562-456a-4f96-a868-7c2a7aff3d3e-kube-api-access-vzvbl\") pod \"cinder-api-0\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " pod="openstack/cinder-api-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.194451 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k566f\" (UniqueName: \"kubernetes.io/projected/1d5e1038-c27a-4d75-bce4-997028e690eb-kube-api-access-k566f\") pod \"dnsmasq-dns-5c9776ccc5-6427m\" (UID: \"1d5e1038-c27a-4d75-bce4-997028e690eb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.194498 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-6427m\" (UID: \"1d5e1038-c27a-4d75-bce4-997028e690eb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.194522 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-6427m\" (UID: \"1d5e1038-c27a-4d75-bce4-997028e690eb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.194604 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/028f5562-456a-4f96-a868-7c2a7aff3d3e-logs\") pod \"cinder-api-0\" (UID: 
\"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " pod="openstack/cinder-api-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.194628 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/028f5562-456a-4f96-a868-7c2a7aff3d3e-config-data-custom\") pod \"cinder-api-0\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " pod="openstack/cinder-api-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.194673 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-6427m\" (UID: \"1d5e1038-c27a-4d75-bce4-997028e690eb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.194690 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/028f5562-456a-4f96-a868-7c2a7aff3d3e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " pod="openstack/cinder-api-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.194735 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/028f5562-456a-4f96-a868-7c2a7aff3d3e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " pod="openstack/cinder-api-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.194752 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-config\") pod \"dnsmasq-dns-5c9776ccc5-6427m\" (UID: \"1d5e1038-c27a-4d75-bce4-997028e690eb\") " 
pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.194781 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/028f5562-456a-4f96-a868-7c2a7aff3d3e-config-data\") pod \"cinder-api-0\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " pod="openstack/cinder-api-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.194798 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-6427m\" (UID: \"1d5e1038-c27a-4d75-bce4-997028e690eb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.194819 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/028f5562-456a-4f96-a868-7c2a7aff3d3e-scripts\") pod \"cinder-api-0\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " pod="openstack/cinder-api-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.237989 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6744587899-pzwjz"] Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.297332 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-6427m\" (UID: \"1d5e1038-c27a-4d75-bce4-997028e690eb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.297377 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/028f5562-456a-4f96-a868-7c2a7aff3d3e-combined-ca-bundle\") 
pod \"cinder-api-0\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " pod="openstack/cinder-api-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.297419 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/028f5562-456a-4f96-a868-7c2a7aff3d3e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " pod="openstack/cinder-api-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.297439 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-config\") pod \"dnsmasq-dns-5c9776ccc5-6427m\" (UID: \"1d5e1038-c27a-4d75-bce4-997028e690eb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.297466 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/028f5562-456a-4f96-a868-7c2a7aff3d3e-config-data\") pod \"cinder-api-0\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " pod="openstack/cinder-api-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.297480 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-6427m\" (UID: \"1d5e1038-c27a-4d75-bce4-997028e690eb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.297495 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/028f5562-456a-4f96-a868-7c2a7aff3d3e-scripts\") pod \"cinder-api-0\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " pod="openstack/cinder-api-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.297512 4975 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzvbl\" (UniqueName: \"kubernetes.io/projected/028f5562-456a-4f96-a868-7c2a7aff3d3e-kube-api-access-vzvbl\") pod \"cinder-api-0\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " pod="openstack/cinder-api-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.297530 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k566f\" (UniqueName: \"kubernetes.io/projected/1d5e1038-c27a-4d75-bce4-997028e690eb-kube-api-access-k566f\") pod \"dnsmasq-dns-5c9776ccc5-6427m\" (UID: \"1d5e1038-c27a-4d75-bce4-997028e690eb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.297565 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-6427m\" (UID: \"1d5e1038-c27a-4d75-bce4-997028e690eb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.297582 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-6427m\" (UID: \"1d5e1038-c27a-4d75-bce4-997028e690eb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.297636 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/028f5562-456a-4f96-a868-7c2a7aff3d3e-logs\") pod \"cinder-api-0\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " pod="openstack/cinder-api-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.297657 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/028f5562-456a-4f96-a868-7c2a7aff3d3e-config-data-custom\") pod \"cinder-api-0\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " pod="openstack/cinder-api-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.299123 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/028f5562-456a-4f96-a868-7c2a7aff3d3e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " pod="openstack/cinder-api-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.299245 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-6427m\" (UID: \"1d5e1038-c27a-4d75-bce4-997028e690eb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.299970 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-6427m\" (UID: \"1d5e1038-c27a-4d75-bce4-997028e690eb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.300455 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-config\") pod \"dnsmasq-dns-5c9776ccc5-6427m\" (UID: \"1d5e1038-c27a-4d75-bce4-997028e690eb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.301752 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-6427m\" (UID: \"1d5e1038-c27a-4d75-bce4-997028e690eb\") " 
pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.302352 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-6427m\" (UID: \"1d5e1038-c27a-4d75-bce4-997028e690eb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.302609 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/028f5562-456a-4f96-a868-7c2a7aff3d3e-logs\") pod \"cinder-api-0\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " pod="openstack/cinder-api-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.307269 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/028f5562-456a-4f96-a868-7c2a7aff3d3e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " pod="openstack/cinder-api-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.311621 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/028f5562-456a-4f96-a868-7c2a7aff3d3e-config-data-custom\") pod \"cinder-api-0\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " pod="openstack/cinder-api-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.319184 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/028f5562-456a-4f96-a868-7c2a7aff3d3e-scripts\") pod \"cinder-api-0\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " pod="openstack/cinder-api-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.322383 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k566f\" (UniqueName: 
\"kubernetes.io/projected/1d5e1038-c27a-4d75-bce4-997028e690eb-kube-api-access-k566f\") pod \"dnsmasq-dns-5c9776ccc5-6427m\" (UID: \"1d5e1038-c27a-4d75-bce4-997028e690eb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.328099 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/028f5562-456a-4f96-a868-7c2a7aff3d3e-config-data\") pod \"cinder-api-0\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " pod="openstack/cinder-api-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.337566 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzvbl\" (UniqueName: \"kubernetes.io/projected/028f5562-456a-4f96-a868-7c2a7aff3d3e-kube-api-access-vzvbl\") pod \"cinder-api-0\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " pod="openstack/cinder-api-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.338006 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6fc58d8444-6ln2h"] Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.413602 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.438487 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.464062 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.540196 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tqv2s"] Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.582268 4975 generic.go:334] "Generic (PLEG): container finished" podID="b8ef7f51-157e-4b7c-95ce-7d8655c5c78d" containerID="396b11744970cd6c2c092e95db41412f4b8ec80cdf0ce7bb8ef59f504a8da82f" exitCode=0 Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.582392 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d","Type":"ContainerDied","Data":"396b11744970cd6c2c092e95db41412f4b8ec80cdf0ce7bb8ef59f504a8da82f"} Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.582420 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d","Type":"ContainerDied","Data":"2df49e2ece685ef95ed0fda6cc561facdcbc6aeed45b80ca461f4960209b3ebf"} Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.582435 4975 scope.go:117] "RemoveContainer" containerID="4f26d7bc2d1f3977fcd48b5807ec9b5db821a4dc27c43c301cc23b4435cf1e22" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.582565 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.606612 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-run-httpd\") pod \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.606661 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-sg-core-conf-yaml\") pod \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.606724 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-combined-ca-bundle\") pod \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.606752 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-scripts\") pod \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.606801 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhb2m\" (UniqueName: \"kubernetes.io/projected/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-kube-api-access-nhb2m\") pod \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.606825 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-log-httpd\") pod \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.606972 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-config-data\") pod \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\" (UID: \"b8ef7f51-157e-4b7c-95ce-7d8655c5c78d\") " Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.607201 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b8ef7f51-157e-4b7c-95ce-7d8655c5c78d" (UID: "b8ef7f51-157e-4b7c-95ce-7d8655c5c78d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.608042 4975 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.610963 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6744587899-pzwjz" event={"ID":"4e530724-5f58-4bbc-9b9a-624a4565ab21","Type":"ContainerStarted","Data":"42cb7bf697b9630334bdc9fb8031b577ed659e5eabc8069d53d3e78ae9e3a002"} Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.612305 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b8ef7f51-157e-4b7c-95ce-7d8655c5c78d" (UID: "b8ef7f51-157e-4b7c-95ce-7d8655c5c78d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.628065 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-scripts" (OuterVolumeSpecName: "scripts") pod "b8ef7f51-157e-4b7c-95ce-7d8655c5c78d" (UID: "b8ef7f51-157e-4b7c-95ce-7d8655c5c78d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.628302 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6fc58d8444-6ln2h" event={"ID":"0d8217c7-38b9-4717-a965-4a408d31fdc6","Type":"ContainerStarted","Data":"e6af18c9c3fe7780234f9f87d2c937cbf2a8261357630ffaf2db60af4503314d"} Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.628492 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-kube-api-access-nhb2m" (OuterVolumeSpecName: "kube-api-access-nhb2m") pod "b8ef7f51-157e-4b7c-95ce-7d8655c5c78d" (UID: "b8ef7f51-157e-4b7c-95ce-7d8655c5c78d"). InnerVolumeSpecName "kube-api-access-nhb2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.652853 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6dbdc7b4b-htzr4"] Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.658361 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b8ef7f51-157e-4b7c-95ce-7d8655c5c78d" (UID: "b8ef7f51-157e-4b7c-95ce-7d8655c5c78d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.711161 4975 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.711191 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.711203 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhb2m\" (UniqueName: \"kubernetes.io/projected/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-kube-api-access-nhb2m\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.711213 4975 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.771152 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-config-data" (OuterVolumeSpecName: "config-data") pod "b8ef7f51-157e-4b7c-95ce-7d8655c5c78d" (UID: "b8ef7f51-157e-4b7c-95ce-7d8655c5c78d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.813385 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.814098 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.868291 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8ef7f51-157e-4b7c-95ce-7d8655c5c78d" (UID: "b8ef7f51-157e-4b7c-95ce-7d8655c5c78d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.879994 4975 scope.go:117] "RemoveContainer" containerID="464c41bcddd5358fa1a0c9a1917fd725eb23ed0f655757dc0c40512e0b95419f" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.922704 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:32 crc kubenswrapper[4975]: I0318 12:34:32.942606 4975 scope.go:117] "RemoveContainer" containerID="396b11744970cd6c2c092e95db41412f4b8ec80cdf0ce7bb8ef59f504a8da82f" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.012815 4975 scope.go:117] "RemoveContainer" containerID="4f26d7bc2d1f3977fcd48b5807ec9b5db821a4dc27c43c301cc23b4435cf1e22" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.012966 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-6427m"] Mar 18 12:34:33 crc kubenswrapper[4975]: E0318 12:34:33.013456 4975 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"4f26d7bc2d1f3977fcd48b5807ec9b5db821a4dc27c43c301cc23b4435cf1e22\": container with ID starting with 4f26d7bc2d1f3977fcd48b5807ec9b5db821a4dc27c43c301cc23b4435cf1e22 not found: ID does not exist" containerID="4f26d7bc2d1f3977fcd48b5807ec9b5db821a4dc27c43c301cc23b4435cf1e22" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.013487 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f26d7bc2d1f3977fcd48b5807ec9b5db821a4dc27c43c301cc23b4435cf1e22"} err="failed to get container status \"4f26d7bc2d1f3977fcd48b5807ec9b5db821a4dc27c43c301cc23b4435cf1e22\": rpc error: code = NotFound desc = could not find container \"4f26d7bc2d1f3977fcd48b5807ec9b5db821a4dc27c43c301cc23b4435cf1e22\": container with ID starting with 4f26d7bc2d1f3977fcd48b5807ec9b5db821a4dc27c43c301cc23b4435cf1e22 not found: ID does not exist" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.013511 4975 scope.go:117] "RemoveContainer" containerID="464c41bcddd5358fa1a0c9a1917fd725eb23ed0f655757dc0c40512e0b95419f" Mar 18 12:34:33 crc kubenswrapper[4975]: E0318 12:34:33.013829 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"464c41bcddd5358fa1a0c9a1917fd725eb23ed0f655757dc0c40512e0b95419f\": container with ID starting with 464c41bcddd5358fa1a0c9a1917fd725eb23ed0f655757dc0c40512e0b95419f not found: ID does not exist" containerID="464c41bcddd5358fa1a0c9a1917fd725eb23ed0f655757dc0c40512e0b95419f" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.013855 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"464c41bcddd5358fa1a0c9a1917fd725eb23ed0f655757dc0c40512e0b95419f"} err="failed to get container status \"464c41bcddd5358fa1a0c9a1917fd725eb23ed0f655757dc0c40512e0b95419f\": rpc error: code = NotFound desc = could not find container 
\"464c41bcddd5358fa1a0c9a1917fd725eb23ed0f655757dc0c40512e0b95419f\": container with ID starting with 464c41bcddd5358fa1a0c9a1917fd725eb23ed0f655757dc0c40512e0b95419f not found: ID does not exist" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.013891 4975 scope.go:117] "RemoveContainer" containerID="396b11744970cd6c2c092e95db41412f4b8ec80cdf0ce7bb8ef59f504a8da82f" Mar 18 12:34:33 crc kubenswrapper[4975]: E0318 12:34:33.014235 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"396b11744970cd6c2c092e95db41412f4b8ec80cdf0ce7bb8ef59f504a8da82f\": container with ID starting with 396b11744970cd6c2c092e95db41412f4b8ec80cdf0ce7bb8ef59f504a8da82f not found: ID does not exist" containerID="396b11744970cd6c2c092e95db41412f4b8ec80cdf0ce7bb8ef59f504a8da82f" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.014263 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"396b11744970cd6c2c092e95db41412f4b8ec80cdf0ce7bb8ef59f504a8da82f"} err="failed to get container status \"396b11744970cd6c2c092e95db41412f4b8ec80cdf0ce7bb8ef59f504a8da82f\": rpc error: code = NotFound desc = could not find container \"396b11744970cd6c2c092e95db41412f4b8ec80cdf0ce7bb8ef59f504a8da82f\": container with ID starting with 396b11744970cd6c2c092e95db41412f4b8ec80cdf0ce7bb8ef59f504a8da82f not found: ID does not exist" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.041836 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.055620 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.084987 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:34:33 crc kubenswrapper[4975]: E0318 12:34:33.085437 4975 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="b8ef7f51-157e-4b7c-95ce-7d8655c5c78d" containerName="proxy-httpd" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.085459 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ef7f51-157e-4b7c-95ce-7d8655c5c78d" containerName="proxy-httpd" Mar 18 12:34:33 crc kubenswrapper[4975]: E0318 12:34:33.085481 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ef7f51-157e-4b7c-95ce-7d8655c5c78d" containerName="ceilometer-notification-agent" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.085490 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ef7f51-157e-4b7c-95ce-7d8655c5c78d" containerName="ceilometer-notification-agent" Mar 18 12:34:33 crc kubenswrapper[4975]: E0318 12:34:33.085528 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ef7f51-157e-4b7c-95ce-7d8655c5c78d" containerName="sg-core" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.085536 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ef7f51-157e-4b7c-95ce-7d8655c5c78d" containerName="sg-core" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.085748 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ef7f51-157e-4b7c-95ce-7d8655c5c78d" containerName="proxy-httpd" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.085777 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ef7f51-157e-4b7c-95ce-7d8655c5c78d" containerName="ceilometer-notification-agent" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.085803 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ef7f51-157e-4b7c-95ce-7d8655c5c78d" containerName="sg-core" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.087640 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.093995 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.095197 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.098478 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.233497 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-config-data\") pod \"ceilometer-0\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " pod="openstack/ceilometer-0" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.233814 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " pod="openstack/ceilometer-0" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.233846 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-run-httpd\") pod \"ceilometer-0\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " pod="openstack/ceilometer-0" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.233894 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-log-httpd\") pod \"ceilometer-0\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " 
pod="openstack/ceilometer-0" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.233919 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-scripts\") pod \"ceilometer-0\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " pod="openstack/ceilometer-0" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.233935 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ktqj\" (UniqueName: \"kubernetes.io/projected/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-kube-api-access-2ktqj\") pod \"ceilometer-0\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " pod="openstack/ceilometer-0" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.234001 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " pod="openstack/ceilometer-0" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.241633 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.336114 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-config-data\") pod \"ceilometer-0\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " pod="openstack/ceilometer-0" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.336181 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " 
pod="openstack/ceilometer-0" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.336216 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-run-httpd\") pod \"ceilometer-0\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " pod="openstack/ceilometer-0" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.336250 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-log-httpd\") pod \"ceilometer-0\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " pod="openstack/ceilometer-0" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.336741 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-run-httpd\") pod \"ceilometer-0\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " pod="openstack/ceilometer-0" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.337027 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-log-httpd\") pod \"ceilometer-0\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " pod="openstack/ceilometer-0" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.337085 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-scripts\") pod \"ceilometer-0\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " pod="openstack/ceilometer-0" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.337108 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ktqj\" (UniqueName: 
\"kubernetes.io/projected/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-kube-api-access-2ktqj\") pod \"ceilometer-0\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " pod="openstack/ceilometer-0" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.337487 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " pod="openstack/ceilometer-0" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.343038 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " pod="openstack/ceilometer-0" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.344967 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-scripts\") pod \"ceilometer-0\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " pod="openstack/ceilometer-0" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.346983 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " pod="openstack/ceilometer-0" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.363694 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-config-data\") pod \"ceilometer-0\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " pod="openstack/ceilometer-0" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.367018 4975 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ktqj\" (UniqueName: \"kubernetes.io/projected/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-kube-api-access-2ktqj\") pod \"ceilometer-0\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " pod="openstack/ceilometer-0" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.451421 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.736201 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dbdc7b4b-htzr4" event={"ID":"5bae225c-6f49-427e-b276-8b0ffdff525f","Type":"ContainerStarted","Data":"990b71650fdbf831a485b7015b36af538d557ca29dfc832424d0a94919fe9601"} Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.737056 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dbdc7b4b-htzr4" event={"ID":"5bae225c-6f49-427e-b276-8b0ffdff525f","Type":"ContainerStarted","Data":"478fa8c0f210ab3779ca88d517690908ac3aec64b8c063aa7f96a342a467df93"} Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.737079 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dbdc7b4b-htzr4" event={"ID":"5bae225c-6f49-427e-b276-8b0ffdff525f","Type":"ContainerStarted","Data":"f4e4efe5b629031fa517717abf21698944718c9d78185e72d9061e46578d7076"} Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.739613 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6dbdc7b4b-htzr4" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.739658 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6dbdc7b4b-htzr4" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.747946 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"028f5562-456a-4f96-a868-7c2a7aff3d3e","Type":"ContainerStarted","Data":"ab0f6b4129e8c91eb64ce36bf2976c287373c5e18d69d50c2af409a47bbbbb3c"} Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.773796 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" event={"ID":"1d5e1038-c27a-4d75-bce4-997028e690eb","Type":"ContainerStarted","Data":"69684312662f8c0fcd5275e7a89baece158ab301af1491394963ecf7107944d1"} Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.773853 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" event={"ID":"1d5e1038-c27a-4d75-bce4-997028e690eb","Type":"ContainerStarted","Data":"2958a36efedc27043120bb88e4de2b97fbf01809ec971065fd5b0f97b7e2cbd7"} Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.787973 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6dbdc7b4b-htzr4" podStartSLOduration=2.787945488 podStartE2EDuration="2.787945488s" podCreationTimestamp="2026-03-18 12:34:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:33.771713311 +0000 UTC m=+1459.486113890" watchObservedRunningTime="2026-03-18 12:34:33.787945488 +0000 UTC m=+1459.502346077" Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.827933 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd","Type":"ContainerStarted","Data":"0c61d8a7b950372a71dca37bef9dd12650e3a4beaea0431763220f75404b21b7"} Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.939166 4975 generic.go:334] "Generic (PLEG): container finished" podID="41ef5b42-ab65-428e-8278-408d868b0afa" containerID="176d8920b3aceb2bfd93337475b7fae3b86f788f2adaf07b38080c2da8ebd853" exitCode=0 Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.939219 4975 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-tqv2s" event={"ID":"41ef5b42-ab65-428e-8278-408d868b0afa","Type":"ContainerDied","Data":"176d8920b3aceb2bfd93337475b7fae3b86f788f2adaf07b38080c2da8ebd853"} Mar 18 12:34:33 crc kubenswrapper[4975]: I0318 12:34:33.939248 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-tqv2s" event={"ID":"41ef5b42-ab65-428e-8278-408d868b0afa","Type":"ContainerStarted","Data":"7eaed2bec95b5c4eff3a41ad0c5f2bc8df31f3ee251a8c495f6694bd8cd862a2"} Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.109226 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.185654 4975 scope.go:117] "RemoveContainer" containerID="43d38ad3d8849c53de8ed8f05e835cd685ee0a1b677e65c47e94f8d76b14ed2e" Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.646470 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-tqv2s" Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.679185 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmnfj\" (UniqueName: \"kubernetes.io/projected/41ef5b42-ab65-428e-8278-408d868b0afa-kube-api-access-kmnfj\") pod \"41ef5b42-ab65-428e-8278-408d868b0afa\" (UID: \"41ef5b42-ab65-428e-8278-408d868b0afa\") " Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.679366 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-dns-swift-storage-0\") pod \"41ef5b42-ab65-428e-8278-408d868b0afa\" (UID: \"41ef5b42-ab65-428e-8278-408d868b0afa\") " Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.679393 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-dns-svc\") pod \"41ef5b42-ab65-428e-8278-408d868b0afa\" (UID: \"41ef5b42-ab65-428e-8278-408d868b0afa\") " Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.679447 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-ovsdbserver-sb\") pod \"41ef5b42-ab65-428e-8278-408d868b0afa\" (UID: \"41ef5b42-ab65-428e-8278-408d868b0afa\") " Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.679492 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-config\") pod \"41ef5b42-ab65-428e-8278-408d868b0afa\" (UID: \"41ef5b42-ab65-428e-8278-408d868b0afa\") " Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.679544 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-ovsdbserver-nb\") pod \"41ef5b42-ab65-428e-8278-408d868b0afa\" (UID: \"41ef5b42-ab65-428e-8278-408d868b0afa\") " Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.689295 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41ef5b42-ab65-428e-8278-408d868b0afa-kube-api-access-kmnfj" (OuterVolumeSpecName: "kube-api-access-kmnfj") pod "41ef5b42-ab65-428e-8278-408d868b0afa" (UID: "41ef5b42-ab65-428e-8278-408d868b0afa"). InnerVolumeSpecName "kube-api-access-kmnfj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.753333 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "41ef5b42-ab65-428e-8278-408d868b0afa" (UID: "41ef5b42-ab65-428e-8278-408d868b0afa"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.769660 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "41ef5b42-ab65-428e-8278-408d868b0afa" (UID: "41ef5b42-ab65-428e-8278-408d868b0afa"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.786243 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmnfj\" (UniqueName: \"kubernetes.io/projected/41ef5b42-ab65-428e-8278-408d868b0afa-kube-api-access-kmnfj\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.786288 4975 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.786300 4975 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.794960 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-config" 
(OuterVolumeSpecName: "config") pod "41ef5b42-ab65-428e-8278-408d868b0afa" (UID: "41ef5b42-ab65-428e-8278-408d868b0afa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.822998 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "41ef5b42-ab65-428e-8278-408d868b0afa" (UID: "41ef5b42-ab65-428e-8278-408d868b0afa"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.836430 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "41ef5b42-ab65-428e-8278-408d868b0afa" (UID: "41ef5b42-ab65-428e-8278-408d868b0afa"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.887492 4975 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.887526 4975 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.887536 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41ef5b42-ab65-428e-8278-408d868b0afa-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.961333 4975 generic.go:334] "Generic (PLEG): container finished" podID="1d5e1038-c27a-4d75-bce4-997028e690eb" containerID="69684312662f8c0fcd5275e7a89baece158ab301af1491394963ecf7107944d1" exitCode=0 Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.962320 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" event={"ID":"1d5e1038-c27a-4d75-bce4-997028e690eb","Type":"ContainerDied","Data":"69684312662f8c0fcd5275e7a89baece158ab301af1491394963ecf7107944d1"} Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.968412 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d","Type":"ContainerStarted","Data":"5c075acc505640b0956ed5e4eaa147d08aa7d44358fb35ac701c8e5ec58e3b93"} Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.970550 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-tqv2s" 
event={"ID":"41ef5b42-ab65-428e-8278-408d868b0afa","Type":"ContainerDied","Data":"7eaed2bec95b5c4eff3a41ad0c5f2bc8df31f3ee251a8c495f6694bd8cd862a2"} Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.970595 4975 scope.go:117] "RemoveContainer" containerID="176d8920b3aceb2bfd93337475b7fae3b86f788f2adaf07b38080c2da8ebd853" Mar 18 12:34:34 crc kubenswrapper[4975]: I0318 12:34:34.970700 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-tqv2s" Mar 18 12:34:35 crc kubenswrapper[4975]: I0318 12:34:35.003081 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"028f5562-456a-4f96-a868-7c2a7aff3d3e","Type":"ContainerStarted","Data":"7b91981ea53efd61f45f259d4fbd9b74f79839543a2c37dec8fa396a1f9f9910"} Mar 18 12:34:35 crc kubenswrapper[4975]: I0318 12:34:35.054486 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8ef7f51-157e-4b7c-95ce-7d8655c5c78d" path="/var/lib/kubelet/pods/b8ef7f51-157e-4b7c-95ce-7d8655c5c78d/volumes" Mar 18 12:34:35 crc kubenswrapper[4975]: I0318 12:34:35.119588 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tqv2s"] Mar 18 12:34:35 crc kubenswrapper[4975]: I0318 12:34:35.138015 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tqv2s"] Mar 18 12:34:35 crc kubenswrapper[4975]: I0318 12:34:35.249211 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:34:36 crc kubenswrapper[4975]: I0318 12:34:36.679794 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-c478d4794-x2t7q" podUID="ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused" Mar 18 12:34:36 crc kubenswrapper[4975]: I0318 12:34:36.757055 4975 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-56bcd48494-744wv" podUID="939992e3-94eb-4c98-a493-e30321c7f81a" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.155:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.155:8443: connect: connection refused" Mar 18 12:34:37 crc kubenswrapper[4975]: I0318 12:34:37.066440 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41ef5b42-ab65-428e-8278-408d868b0afa" path="/var/lib/kubelet/pods/41ef5b42-ab65-428e-8278-408d868b0afa/volumes" Mar 18 12:34:37 crc kubenswrapper[4975]: I0318 12:34:37.078937 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd","Type":"ContainerStarted","Data":"642e2852d16bd31017d771c002487e405e4cae85dfc9dd86dd5a68b60b072411"} Mar 18 12:34:37 crc kubenswrapper[4975]: I0318 12:34:37.084134 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d","Type":"ContainerStarted","Data":"7ffc4eeaee0df340d88f01c5618567ead919fe55902f919405a028af2b1fd5f5"} Mar 18 12:34:37 crc kubenswrapper[4975]: I0318 12:34:37.106386 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6744587899-pzwjz" event={"ID":"4e530724-5f58-4bbc-9b9a-624a4565ab21","Type":"ContainerStarted","Data":"8d18ea16955f4158f00b577c3283487d5ea7da0bb172b3dcde782d632d0a5c46"} Mar 18 12:34:37 crc kubenswrapper[4975]: I0318 12:34:37.106429 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6744587899-pzwjz" event={"ID":"4e530724-5f58-4bbc-9b9a-624a4565ab21","Type":"ContainerStarted","Data":"f042269dbd3b923d59a7b838161479a975d2224e4562a54504bacc93951f8725"} Mar 18 12:34:37 crc kubenswrapper[4975]: I0318 12:34:37.110689 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"028f5562-456a-4f96-a868-7c2a7aff3d3e","Type":"ContainerStarted","Data":"da767122d672519a4fc2d9585ab82059079e90f847478320c934e6490fd50754"} Mar 18 12:34:37 crc kubenswrapper[4975]: I0318 12:34:37.110817 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="028f5562-456a-4f96-a868-7c2a7aff3d3e" containerName="cinder-api-log" containerID="cri-o://7b91981ea53efd61f45f259d4fbd9b74f79839543a2c37dec8fa396a1f9f9910" gracePeriod=30 Mar 18 12:34:37 crc kubenswrapper[4975]: I0318 12:34:37.110997 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 18 12:34:37 crc kubenswrapper[4975]: I0318 12:34:37.111032 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="028f5562-456a-4f96-a868-7c2a7aff3d3e" containerName="cinder-api" containerID="cri-o://da767122d672519a4fc2d9585ab82059079e90f847478320c934e6490fd50754" gracePeriod=30 Mar 18 12:34:37 crc kubenswrapper[4975]: I0318 12:34:37.115106 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6fc58d8444-6ln2h" event={"ID":"0d8217c7-38b9-4717-a965-4a408d31fdc6","Type":"ContainerStarted","Data":"d72762928ffa81e7ee4d6ef1b3efe6ef98171474f2d0ab56ef73d03d4ab30bc4"} Mar 18 12:34:37 crc kubenswrapper[4975]: I0318 12:34:37.115139 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6fc58d8444-6ln2h" event={"ID":"0d8217c7-38b9-4717-a965-4a408d31fdc6","Type":"ContainerStarted","Data":"6113c3b33f877d5e2b74229b73d39e481c4d5a561993daf2c20ccf8968890a52"} Mar 18 12:34:37 crc kubenswrapper[4975]: I0318 12:34:37.117649 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" event={"ID":"1d5e1038-c27a-4d75-bce4-997028e690eb","Type":"ContainerStarted","Data":"009864e170d37ee943f4531cd6bc15601a316a6769327113cb2eaced6d4816b6"} Mar 18 12:34:37 crc 
kubenswrapper[4975]: I0318 12:34:37.118182 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" Mar 18 12:34:37 crc kubenswrapper[4975]: I0318 12:34:37.129698 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6744587899-pzwjz" podStartSLOduration=3.461874388 podStartE2EDuration="7.129680227s" podCreationTimestamp="2026-03-18 12:34:30 +0000 UTC" firstStartedPulling="2026-03-18 12:34:32.241379953 +0000 UTC m=+1457.955780532" lastFinishedPulling="2026-03-18 12:34:35.909185792 +0000 UTC m=+1461.623586371" observedRunningTime="2026-03-18 12:34:37.12399924 +0000 UTC m=+1462.838399819" watchObservedRunningTime="2026-03-18 12:34:37.129680227 +0000 UTC m=+1462.844080806" Mar 18 12:34:37 crc kubenswrapper[4975]: I0318 12:34:37.168753 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" podStartSLOduration=6.168735332 podStartE2EDuration="6.168735332s" podCreationTimestamp="2026-03-18 12:34:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:37.167153229 +0000 UTC m=+1462.881553808" watchObservedRunningTime="2026-03-18 12:34:37.168735332 +0000 UTC m=+1462.883135911" Mar 18 12:34:37 crc kubenswrapper[4975]: I0318 12:34:37.236178 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6fc58d8444-6ln2h" podStartSLOduration=3.686153944 podStartE2EDuration="7.236153259s" podCreationTimestamp="2026-03-18 12:34:30 +0000 UTC" firstStartedPulling="2026-03-18 12:34:32.366324314 +0000 UTC m=+1458.080724893" lastFinishedPulling="2026-03-18 12:34:35.916323629 +0000 UTC m=+1461.630724208" observedRunningTime="2026-03-18 12:34:37.225165876 +0000 UTC m=+1462.939566465" watchObservedRunningTime="2026-03-18 12:34:37.236153259 +0000 UTC m=+1462.950553838" 
Mar 18 12:34:37 crc kubenswrapper[4975]: I0318 12:34:37.237703 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.237695832 podStartE2EDuration="6.237695832s" podCreationTimestamp="2026-03-18 12:34:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:37.205473604 +0000 UTC m=+1462.919874193" watchObservedRunningTime="2026-03-18 12:34:37.237695832 +0000 UTC m=+1462.952096421" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.130470 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d","Type":"ContainerStarted","Data":"6a838c075b2bea6c6be343c67c08d11d75024c0fc01e6722a50dff5e4aea1e35"} Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.130738 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d","Type":"ContainerStarted","Data":"5f8d5b3ce07a0b3916b649e9a7c24e65c83acc926b9cacb2a890136d6ee2ae68"} Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.132720 4975 generic.go:334] "Generic (PLEG): container finished" podID="028f5562-456a-4f96-a868-7c2a7aff3d3e" containerID="7b91981ea53efd61f45f259d4fbd9b74f79839543a2c37dec8fa396a1f9f9910" exitCode=143 Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.132771 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"028f5562-456a-4f96-a868-7c2a7aff3d3e","Type":"ContainerDied","Data":"7b91981ea53efd61f45f259d4fbd9b74f79839543a2c37dec8fa396a1f9f9910"} Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.134921 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd","Type":"ContainerStarted","Data":"f8b378d86645ee66d48fd1b006bf435578d9a300ec9db8a1a7f64c30f51792f3"} Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.156289 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.546457774 podStartE2EDuration="7.156268871s" podCreationTimestamp="2026-03-18 12:34:31 +0000 UTC" firstStartedPulling="2026-03-18 12:34:32.879831738 +0000 UTC m=+1458.594232317" lastFinishedPulling="2026-03-18 12:34:34.489642835 +0000 UTC m=+1460.204043414" observedRunningTime="2026-03-18 12:34:38.154529423 +0000 UTC m=+1463.868930002" watchObservedRunningTime="2026-03-18 12:34:38.156268871 +0000 UTC m=+1463.870669450" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.555818 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7b44bbdbf8-vkj8f"] Mar 18 12:34:38 crc kubenswrapper[4975]: E0318 12:34:38.556619 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ef5b42-ab65-428e-8278-408d868b0afa" containerName="init" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.556641 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ef5b42-ab65-428e-8278-408d868b0afa" containerName="init" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.556853 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="41ef5b42-ab65-428e-8278-408d868b0afa" containerName="init" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.558116 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.561476 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.563148 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.565521 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf-logs\") pod \"barbican-api-7b44bbdbf8-vkj8f\" (UID: \"ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf\") " pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.565585 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf-config-data\") pod \"barbican-api-7b44bbdbf8-vkj8f\" (UID: \"ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf\") " pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.565673 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf-public-tls-certs\") pod \"barbican-api-7b44bbdbf8-vkj8f\" (UID: \"ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf\") " pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.565726 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf-config-data-custom\") pod \"barbican-api-7b44bbdbf8-vkj8f\" (UID: \"ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf\") " 
pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.565850 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjmrs\" (UniqueName: \"kubernetes.io/projected/ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf-kube-api-access-sjmrs\") pod \"barbican-api-7b44bbdbf8-vkj8f\" (UID: \"ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf\") " pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.565930 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf-combined-ca-bundle\") pod \"barbican-api-7b44bbdbf8-vkj8f\" (UID: \"ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf\") " pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.565989 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf-internal-tls-certs\") pod \"barbican-api-7b44bbdbf8-vkj8f\" (UID: \"ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf\") " pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.605892 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b44bbdbf8-vkj8f"] Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.667409 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf-public-tls-certs\") pod \"barbican-api-7b44bbdbf8-vkj8f\" (UID: \"ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf\") " pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.667483 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf-config-data-custom\") pod \"barbican-api-7b44bbdbf8-vkj8f\" (UID: \"ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf\") " pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.667530 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjmrs\" (UniqueName: \"kubernetes.io/projected/ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf-kube-api-access-sjmrs\") pod \"barbican-api-7b44bbdbf8-vkj8f\" (UID: \"ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf\") " pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.667553 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf-combined-ca-bundle\") pod \"barbican-api-7b44bbdbf8-vkj8f\" (UID: \"ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf\") " pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.667590 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf-internal-tls-certs\") pod \"barbican-api-7b44bbdbf8-vkj8f\" (UID: \"ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf\") " pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.667642 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf-logs\") pod \"barbican-api-7b44bbdbf8-vkj8f\" (UID: \"ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf\") " pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.667688 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf-config-data\") pod \"barbican-api-7b44bbdbf8-vkj8f\" (UID: \"ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf\") " pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.668637 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf-logs\") pod \"barbican-api-7b44bbdbf8-vkj8f\" (UID: \"ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf\") " pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.673929 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf-combined-ca-bundle\") pod \"barbican-api-7b44bbdbf8-vkj8f\" (UID: \"ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf\") " pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.677191 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf-config-data\") pod \"barbican-api-7b44bbdbf8-vkj8f\" (UID: \"ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf\") " pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.678521 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf-config-data-custom\") pod \"barbican-api-7b44bbdbf8-vkj8f\" (UID: \"ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf\") " pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.686003 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf-internal-tls-certs\") pod 
\"barbican-api-7b44bbdbf8-vkj8f\" (UID: \"ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf\") " pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.686344 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf-public-tls-certs\") pod \"barbican-api-7b44bbdbf8-vkj8f\" (UID: \"ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf\") " pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.691010 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjmrs\" (UniqueName: \"kubernetes.io/projected/ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf-kube-api-access-sjmrs\") pod \"barbican-api-7b44bbdbf8-vkj8f\" (UID: \"ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf\") " pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.899947 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:38 crc kubenswrapper[4975]: I0318 12:34:38.996884 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5899b99ff6-cwt84" Mar 18 12:34:39 crc kubenswrapper[4975]: E0318 12:34:39.282974 4975 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/57043f6e6c0a1cc8af6914467df0dec08e1e6ef1c9d316f2b94dae71b2257351/diff" to get inode usage: stat /var/lib/containers/storage/overlay/57043f6e6c0a1cc8af6914467df0dec08e1e6ef1c9d316f2b94dae71b2257351/diff: no such file or directory, extraDiskErr: Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.426299 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-645d765cf7-6vwpp"] Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.426574 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-645d765cf7-6vwpp" podUID="136388c3-08f6-404b-9a43-d68687d762c8" containerName="neutron-api" containerID="cri-o://0b002b0ac8c9c3a8c4697ac79ac60aad1dddf8d8c6ba6033e81207d9607b7e11" gracePeriod=30 Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.426700 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-645d765cf7-6vwpp" podUID="136388c3-08f6-404b-9a43-d68687d762c8" containerName="neutron-httpd" containerID="cri-o://6f727425465df55d2fac6f8f433eef3e30b04a52f9113ea6c12842151987a9bf" gracePeriod=30 Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.450137 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-645d765cf7-6vwpp" podUID="136388c3-08f6-404b-9a43-d68687d762c8" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.161:9696/\": EOF" Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.482032 4975 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/neutron-67fdc8889-2cm4h"] Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.484027 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67fdc8889-2cm4h" Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.493085 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67fdc8889-2cm4h"] Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.593129 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6ab8f83-7ff0-46e4-9045-611e4b3b97c0-ovndb-tls-certs\") pod \"neutron-67fdc8889-2cm4h\" (UID: \"b6ab8f83-7ff0-46e4-9045-611e4b3b97c0\") " pod="openstack/neutron-67fdc8889-2cm4h" Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.593199 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvszx\" (UniqueName: \"kubernetes.io/projected/b6ab8f83-7ff0-46e4-9045-611e4b3b97c0-kube-api-access-dvszx\") pod \"neutron-67fdc8889-2cm4h\" (UID: \"b6ab8f83-7ff0-46e4-9045-611e4b3b97c0\") " pod="openstack/neutron-67fdc8889-2cm4h" Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.593239 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6ab8f83-7ff0-46e4-9045-611e4b3b97c0-public-tls-certs\") pod \"neutron-67fdc8889-2cm4h\" (UID: \"b6ab8f83-7ff0-46e4-9045-611e4b3b97c0\") " pod="openstack/neutron-67fdc8889-2cm4h" Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.593423 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b6ab8f83-7ff0-46e4-9045-611e4b3b97c0-config\") pod \"neutron-67fdc8889-2cm4h\" (UID: \"b6ab8f83-7ff0-46e4-9045-611e4b3b97c0\") " pod="openstack/neutron-67fdc8889-2cm4h" Mar 18 12:34:39 crc 
kubenswrapper[4975]: I0318 12:34:39.593538 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ab8f83-7ff0-46e4-9045-611e4b3b97c0-combined-ca-bundle\") pod \"neutron-67fdc8889-2cm4h\" (UID: \"b6ab8f83-7ff0-46e4-9045-611e4b3b97c0\") " pod="openstack/neutron-67fdc8889-2cm4h" Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.593599 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6ab8f83-7ff0-46e4-9045-611e4b3b97c0-internal-tls-certs\") pod \"neutron-67fdc8889-2cm4h\" (UID: \"b6ab8f83-7ff0-46e4-9045-611e4b3b97c0\") " pod="openstack/neutron-67fdc8889-2cm4h" Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.593665 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b6ab8f83-7ff0-46e4-9045-611e4b3b97c0-httpd-config\") pod \"neutron-67fdc8889-2cm4h\" (UID: \"b6ab8f83-7ff0-46e4-9045-611e4b3b97c0\") " pod="openstack/neutron-67fdc8889-2cm4h" Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.638139 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b44bbdbf8-vkj8f"] Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.695882 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6ab8f83-7ff0-46e4-9045-611e4b3b97c0-ovndb-tls-certs\") pod \"neutron-67fdc8889-2cm4h\" (UID: \"b6ab8f83-7ff0-46e4-9045-611e4b3b97c0\") " pod="openstack/neutron-67fdc8889-2cm4h" Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.695928 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvszx\" (UniqueName: \"kubernetes.io/projected/b6ab8f83-7ff0-46e4-9045-611e4b3b97c0-kube-api-access-dvszx\") pod 
\"neutron-67fdc8889-2cm4h\" (UID: \"b6ab8f83-7ff0-46e4-9045-611e4b3b97c0\") " pod="openstack/neutron-67fdc8889-2cm4h" Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.695953 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6ab8f83-7ff0-46e4-9045-611e4b3b97c0-public-tls-certs\") pod \"neutron-67fdc8889-2cm4h\" (UID: \"b6ab8f83-7ff0-46e4-9045-611e4b3b97c0\") " pod="openstack/neutron-67fdc8889-2cm4h" Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.696006 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b6ab8f83-7ff0-46e4-9045-611e4b3b97c0-config\") pod \"neutron-67fdc8889-2cm4h\" (UID: \"b6ab8f83-7ff0-46e4-9045-611e4b3b97c0\") " pod="openstack/neutron-67fdc8889-2cm4h" Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.696043 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ab8f83-7ff0-46e4-9045-611e4b3b97c0-combined-ca-bundle\") pod \"neutron-67fdc8889-2cm4h\" (UID: \"b6ab8f83-7ff0-46e4-9045-611e4b3b97c0\") " pod="openstack/neutron-67fdc8889-2cm4h" Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.696069 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6ab8f83-7ff0-46e4-9045-611e4b3b97c0-internal-tls-certs\") pod \"neutron-67fdc8889-2cm4h\" (UID: \"b6ab8f83-7ff0-46e4-9045-611e4b3b97c0\") " pod="openstack/neutron-67fdc8889-2cm4h" Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.696100 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b6ab8f83-7ff0-46e4-9045-611e4b3b97c0-httpd-config\") pod \"neutron-67fdc8889-2cm4h\" (UID: \"b6ab8f83-7ff0-46e4-9045-611e4b3b97c0\") " pod="openstack/neutron-67fdc8889-2cm4h" 
Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.702036 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b6ab8f83-7ff0-46e4-9045-611e4b3b97c0-config\") pod \"neutron-67fdc8889-2cm4h\" (UID: \"b6ab8f83-7ff0-46e4-9045-611e4b3b97c0\") " pod="openstack/neutron-67fdc8889-2cm4h" Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.703240 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6ab8f83-7ff0-46e4-9045-611e4b3b97c0-internal-tls-certs\") pod \"neutron-67fdc8889-2cm4h\" (UID: \"b6ab8f83-7ff0-46e4-9045-611e4b3b97c0\") " pod="openstack/neutron-67fdc8889-2cm4h" Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.703630 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6ab8f83-7ff0-46e4-9045-611e4b3b97c0-ovndb-tls-certs\") pod \"neutron-67fdc8889-2cm4h\" (UID: \"b6ab8f83-7ff0-46e4-9045-611e4b3b97c0\") " pod="openstack/neutron-67fdc8889-2cm4h" Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.703785 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ab8f83-7ff0-46e4-9045-611e4b3b97c0-combined-ca-bundle\") pod \"neutron-67fdc8889-2cm4h\" (UID: \"b6ab8f83-7ff0-46e4-9045-611e4b3b97c0\") " pod="openstack/neutron-67fdc8889-2cm4h" Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.703964 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6ab8f83-7ff0-46e4-9045-611e4b3b97c0-public-tls-certs\") pod \"neutron-67fdc8889-2cm4h\" (UID: \"b6ab8f83-7ff0-46e4-9045-611e4b3b97c0\") " pod="openstack/neutron-67fdc8889-2cm4h" Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.704038 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" 
(UniqueName: \"kubernetes.io/secret/b6ab8f83-7ff0-46e4-9045-611e4b3b97c0-httpd-config\") pod \"neutron-67fdc8889-2cm4h\" (UID: \"b6ab8f83-7ff0-46e4-9045-611e4b3b97c0\") " pod="openstack/neutron-67fdc8889-2cm4h" Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.715265 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvszx\" (UniqueName: \"kubernetes.io/projected/b6ab8f83-7ff0-46e4-9045-611e4b3b97c0-kube-api-access-dvszx\") pod \"neutron-67fdc8889-2cm4h\" (UID: \"b6ab8f83-7ff0-46e4-9045-611e4b3b97c0\") " pod="openstack/neutron-67fdc8889-2cm4h" Mar 18 12:34:39 crc kubenswrapper[4975]: W0318 12:34:39.780243 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8ef7f51_157e_4b7c_95ce_7d8655c5c78d.slice/crio-464c41bcddd5358fa1a0c9a1917fd725eb23ed0f655757dc0c40512e0b95419f.scope WatchSource:0}: Error finding container 464c41bcddd5358fa1a0c9a1917fd725eb23ed0f655757dc0c40512e0b95419f: Status 404 returned error can't find the container with id 464c41bcddd5358fa1a0c9a1917fd725eb23ed0f655757dc0c40512e0b95419f Mar 18 12:34:39 crc kubenswrapper[4975]: W0318 12:34:39.782142 4975 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94f94b61_6738_4ab8_a65f_0d6cf4d86be1.slice/crio-conmon-d7fd08ea9e8af2fce4a2d69ad6eefacff16e67cd19e08cf851e60b6cc4dbdb8f.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94f94b61_6738_4ab8_a65f_0d6cf4d86be1.slice/crio-conmon-d7fd08ea9e8af2fce4a2d69ad6eefacff16e67cd19e08cf851e60b6cc4dbdb8f.scope: no such file or directory Mar 18 12:34:39 crc kubenswrapper[4975]: W0318 12:34:39.782184 4975 watcher.go:93] Error while processing event 
("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94f94b61_6738_4ab8_a65f_0d6cf4d86be1.slice/crio-d7fd08ea9e8af2fce4a2d69ad6eefacff16e67cd19e08cf851e60b6cc4dbdb8f.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94f94b61_6738_4ab8_a65f_0d6cf4d86be1.slice/crio-d7fd08ea9e8af2fce4a2d69ad6eefacff16e67cd19e08cf851e60b6cc4dbdb8f.scope: no such file or directory Mar 18 12:34:39 crc kubenswrapper[4975]: W0318 12:34:39.782202 4975 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8ef7f51_157e_4b7c_95ce_7d8655c5c78d.slice/crio-conmon-4f26d7bc2d1f3977fcd48b5807ec9b5db821a4dc27c43c301cc23b4435cf1e22.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8ef7f51_157e_4b7c_95ce_7d8655c5c78d.slice/crio-conmon-4f26d7bc2d1f3977fcd48b5807ec9b5db821a4dc27c43c301cc23b4435cf1e22.scope: no such file or directory Mar 18 12:34:39 crc kubenswrapper[4975]: W0318 12:34:39.782215 4975 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8ef7f51_157e_4b7c_95ce_7d8655c5c78d.slice/crio-4f26d7bc2d1f3977fcd48b5807ec9b5db821a4dc27c43c301cc23b4435cf1e22.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8ef7f51_157e_4b7c_95ce_7d8655c5c78d.slice/crio-4f26d7bc2d1f3977fcd48b5807ec9b5db821a4dc27c43c301cc23b4435cf1e22.scope: no such file or directory Mar 18 12:34:39 crc kubenswrapper[4975]: W0318 12:34:39.799032 4975 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41ef5b42_ab65_428e_8278_408d868b0afa.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41ef5b42_ab65_428e_8278_408d868b0afa.slice: no such file or directory Mar 18 12:34:39 crc kubenswrapper[4975]: I0318 12:34:39.812724 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67fdc8889-2cm4h" Mar 18 12:34:39 crc kubenswrapper[4975]: E0318 12:34:39.985340 4975 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f054ebc_151c_4e89_8242_3837c9bee6b2.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f054ebc_151c_4e89_8242_3837c9bee6b2.slice/crio-conmon-77eec9a0c105490c29fb8bed80b81fecf8fa74501cf1bb4d42fcb5f966879a84.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94f94b61_6738_4ab8_a65f_0d6cf4d86be1.slice/crio-9ac6e2a53ea68ab3170f3fd8b946e7df4179f5d76279d1aa19efc51ee1c7d350\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f054ebc_151c_4e89_8242_3837c9bee6b2.slice/crio-c345efd1145a32559e8c7bd484810752d4faa0548998ff6bb04f08c959f06316\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8ef7f51_157e_4b7c_95ce_7d8655c5c78d.slice/crio-conmon-396b11744970cd6c2c092e95db41412f4b8ec80cdf0ce7bb8ef59f504a8da82f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f054ebc_151c_4e89_8242_3837c9bee6b2.slice/crio-77eec9a0c105490c29fb8bed80b81fecf8fa74501cf1bb4d42fcb5f966879a84.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8ef7f51_157e_4b7c_95ce_7d8655c5c78d.slice/crio-396b11744970cd6c2c092e95db41412f4b8ec80cdf0ce7bb8ef59f504a8da82f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94f94b61_6738_4ab8_a65f_0d6cf4d86be1.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8ef7f51_157e_4b7c_95ce_7d8655c5c78d.slice/crio-conmon-464c41bcddd5358fa1a0c9a1917fd725eb23ed0f655757dc0c40512e0b95419f.scope\": RecentStats: unable to find data in memory cache]" Mar 18 12:34:40 crc kubenswrapper[4975]: I0318 12:34:40.170173 4975 generic.go:334] "Generic (PLEG): container finished" podID="136388c3-08f6-404b-9a43-d68687d762c8" containerID="6f727425465df55d2fac6f8f433eef3e30b04a52f9113ea6c12842151987a9bf" exitCode=0 Mar 18 12:34:40 crc kubenswrapper[4975]: I0318 12:34:40.170248 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-645d765cf7-6vwpp" event={"ID":"136388c3-08f6-404b-9a43-d68687d762c8","Type":"ContainerDied","Data":"6f727425465df55d2fac6f8f433eef3e30b04a52f9113ea6c12842151987a9bf"} Mar 18 12:34:40 crc kubenswrapper[4975]: I0318 12:34:40.171679 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b44bbdbf8-vkj8f" event={"ID":"ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf","Type":"ContainerStarted","Data":"b257b6f334c363b0f4c3edabfbdc892256322d9444f8cf5d50da088d6c1f450e"} Mar 18 12:34:40 crc kubenswrapper[4975]: I0318 12:34:40.173822 4975 generic.go:334] "Generic (PLEG): container finished" podID="06e8f21f-8218-4a64-a302-0f0b8193b9c8" containerID="77f2d72d2d7935b0002206b97a232de84e6d33a8a1d5a348b2bef72b0329527d" exitCode=137 Mar 18 12:34:40 crc kubenswrapper[4975]: I0318 12:34:40.173849 4975 generic.go:334] "Generic (PLEG): container finished" podID="06e8f21f-8218-4a64-a302-0f0b8193b9c8" 
containerID="36b15be87c7997046d29b661b13f902f84ae0523ac0c792ceb21949c58be4356" exitCode=137 Mar 18 12:34:40 crc kubenswrapper[4975]: I0318 12:34:40.173886 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f78496f49-fklmw" event={"ID":"06e8f21f-8218-4a64-a302-0f0b8193b9c8","Type":"ContainerDied","Data":"77f2d72d2d7935b0002206b97a232de84e6d33a8a1d5a348b2bef72b0329527d"} Mar 18 12:34:40 crc kubenswrapper[4975]: I0318 12:34:40.174320 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f78496f49-fklmw" event={"ID":"06e8f21f-8218-4a64-a302-0f0b8193b9c8","Type":"ContainerDied","Data":"36b15be87c7997046d29b661b13f902f84ae0523ac0c792ceb21949c58be4356"} Mar 18 12:34:40 crc kubenswrapper[4975]: I0318 12:34:40.752734 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f78496f49-fklmw" Mar 18 12:34:40 crc kubenswrapper[4975]: I0318 12:34:40.848511 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/06e8f21f-8218-4a64-a302-0f0b8193b9c8-config-data\") pod \"06e8f21f-8218-4a64-a302-0f0b8193b9c8\" (UID: \"06e8f21f-8218-4a64-a302-0f0b8193b9c8\") " Mar 18 12:34:40 crc kubenswrapper[4975]: I0318 12:34:40.848609 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kx7z\" (UniqueName: \"kubernetes.io/projected/06e8f21f-8218-4a64-a302-0f0b8193b9c8-kube-api-access-8kx7z\") pod \"06e8f21f-8218-4a64-a302-0f0b8193b9c8\" (UID: \"06e8f21f-8218-4a64-a302-0f0b8193b9c8\") " Mar 18 12:34:40 crc kubenswrapper[4975]: I0318 12:34:40.848684 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/06e8f21f-8218-4a64-a302-0f0b8193b9c8-horizon-secret-key\") pod \"06e8f21f-8218-4a64-a302-0f0b8193b9c8\" (UID: \"06e8f21f-8218-4a64-a302-0f0b8193b9c8\") " Mar 18 12:34:40 crc 
kubenswrapper[4975]: I0318 12:34:40.848712 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06e8f21f-8218-4a64-a302-0f0b8193b9c8-logs\") pod \"06e8f21f-8218-4a64-a302-0f0b8193b9c8\" (UID: \"06e8f21f-8218-4a64-a302-0f0b8193b9c8\") " Mar 18 12:34:40 crc kubenswrapper[4975]: I0318 12:34:40.848768 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06e8f21f-8218-4a64-a302-0f0b8193b9c8-scripts\") pod \"06e8f21f-8218-4a64-a302-0f0b8193b9c8\" (UID: \"06e8f21f-8218-4a64-a302-0f0b8193b9c8\") " Mar 18 12:34:40 crc kubenswrapper[4975]: I0318 12:34:40.851083 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06e8f21f-8218-4a64-a302-0f0b8193b9c8-logs" (OuterVolumeSpecName: "logs") pod "06e8f21f-8218-4a64-a302-0f0b8193b9c8" (UID: "06e8f21f-8218-4a64-a302-0f0b8193b9c8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:34:40 crc kubenswrapper[4975]: I0318 12:34:40.861721 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06e8f21f-8218-4a64-a302-0f0b8193b9c8-kube-api-access-8kx7z" (OuterVolumeSpecName: "kube-api-access-8kx7z") pod "06e8f21f-8218-4a64-a302-0f0b8193b9c8" (UID: "06e8f21f-8218-4a64-a302-0f0b8193b9c8"). InnerVolumeSpecName "kube-api-access-8kx7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:40 crc kubenswrapper[4975]: I0318 12:34:40.872651 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e8f21f-8218-4a64-a302-0f0b8193b9c8-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "06e8f21f-8218-4a64-a302-0f0b8193b9c8" (UID: "06e8f21f-8218-4a64-a302-0f0b8193b9c8"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:40 crc kubenswrapper[4975]: I0318 12:34:40.874502 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06e8f21f-8218-4a64-a302-0f0b8193b9c8-scripts" (OuterVolumeSpecName: "scripts") pod "06e8f21f-8218-4a64-a302-0f0b8193b9c8" (UID: "06e8f21f-8218-4a64-a302-0f0b8193b9c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:34:40 crc kubenswrapper[4975]: I0318 12:34:40.892215 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06e8f21f-8218-4a64-a302-0f0b8193b9c8-config-data" (OuterVolumeSpecName: "config-data") pod "06e8f21f-8218-4a64-a302-0f0b8193b9c8" (UID: "06e8f21f-8218-4a64-a302-0f0b8193b9c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:34:40 crc kubenswrapper[4975]: I0318 12:34:40.951541 4975 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/06e8f21f-8218-4a64-a302-0f0b8193b9c8-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:40 crc kubenswrapper[4975]: I0318 12:34:40.951583 4975 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06e8f21f-8218-4a64-a302-0f0b8193b9c8-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:40 crc kubenswrapper[4975]: I0318 12:34:40.951595 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06e8f21f-8218-4a64-a302-0f0b8193b9c8-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:40 crc kubenswrapper[4975]: I0318 12:34:40.951606 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/06e8f21f-8218-4a64-a302-0f0b8193b9c8-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:40 crc kubenswrapper[4975]: I0318 
12:34:40.951617 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kx7z\" (UniqueName: \"kubernetes.io/projected/06e8f21f-8218-4a64-a302-0f0b8193b9c8-kube-api-access-8kx7z\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:40 crc kubenswrapper[4975]: I0318 12:34:40.994883 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67fdc8889-2cm4h"] Mar 18 12:34:41 crc kubenswrapper[4975]: I0318 12:34:41.068092 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-645d765cf7-6vwpp" podUID="136388c3-08f6-404b-9a43-d68687d762c8" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.161:9696/\": dial tcp 10.217.0.161:9696: connect: connection refused" Mar 18 12:34:41 crc kubenswrapper[4975]: I0318 12:34:41.185128 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67fdc8889-2cm4h" event={"ID":"b6ab8f83-7ff0-46e4-9045-611e4b3b97c0","Type":"ContainerStarted","Data":"17404a7b2d53a55bfa8f8c2c858c0ffc20ac01c2816d58731d812d3a7be9d450"} Mar 18 12:34:41 crc kubenswrapper[4975]: I0318 12:34:41.187376 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b44bbdbf8-vkj8f" event={"ID":"ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf","Type":"ContainerStarted","Data":"1ea67dda183f998e0bb928b0a527550c94ce2f91e32b343e462fc2fc88e15ba7"} Mar 18 12:34:41 crc kubenswrapper[4975]: I0318 12:34:41.187432 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b44bbdbf8-vkj8f" event={"ID":"ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf","Type":"ContainerStarted","Data":"a413edf12cf445d32d8f6869ed4cc338427c392d80e4f9e176283607a77fd559"} Mar 18 12:34:41 crc kubenswrapper[4975]: I0318 12:34:41.187455 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:41 crc kubenswrapper[4975]: I0318 12:34:41.187474 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:41 crc kubenswrapper[4975]: I0318 12:34:41.190672 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d","Type":"ContainerStarted","Data":"d676e583a72c57ce4890b890edfb4a46c6d795d8667e9a5c1cefaa2034825a63"} Mar 18 12:34:41 crc kubenswrapper[4975]: I0318 12:34:41.191611 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 12:34:41 crc kubenswrapper[4975]: I0318 12:34:41.194667 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f78496f49-fklmw" event={"ID":"06e8f21f-8218-4a64-a302-0f0b8193b9c8","Type":"ContainerDied","Data":"9f040c1062540180477332f0161a1cd77e0c073ae99c49ea94444d7fd81783c0"} Mar 18 12:34:41 crc kubenswrapper[4975]: I0318 12:34:41.194717 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f78496f49-fklmw" Mar 18 12:34:41 crc kubenswrapper[4975]: I0318 12:34:41.194723 4975 scope.go:117] "RemoveContainer" containerID="77f2d72d2d7935b0002206b97a232de84e6d33a8a1d5a348b2bef72b0329527d" Mar 18 12:34:41 crc kubenswrapper[4975]: I0318 12:34:41.209073 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7b44bbdbf8-vkj8f" podStartSLOduration=3.209053821 podStartE2EDuration="3.209053821s" podCreationTimestamp="2026-03-18 12:34:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:41.206050328 +0000 UTC m=+1466.920450917" watchObservedRunningTime="2026-03-18 12:34:41.209053821 +0000 UTC m=+1466.923454400" Mar 18 12:34:41 crc kubenswrapper[4975]: I0318 12:34:41.245803 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f78496f49-fklmw"] Mar 18 12:34:41 crc kubenswrapper[4975]: I0318 12:34:41.278313 4975 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.405635712 podStartE2EDuration="8.278292828s" podCreationTimestamp="2026-03-18 12:34:33 +0000 UTC" firstStartedPulling="2026-03-18 12:34:34.423571595 +0000 UTC m=+1460.137972174" lastFinishedPulling="2026-03-18 12:34:40.296228711 +0000 UTC m=+1466.010629290" observedRunningTime="2026-03-18 12:34:41.242554844 +0000 UTC m=+1466.956955423" watchObservedRunningTime="2026-03-18 12:34:41.278292828 +0000 UTC m=+1466.992693407" Mar 18 12:34:41 crc kubenswrapper[4975]: I0318 12:34:41.278932 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-f78496f49-fklmw"] Mar 18 12:34:41 crc kubenswrapper[4975]: I0318 12:34:41.421778 4975 scope.go:117] "RemoveContainer" containerID="36b15be87c7997046d29b661b13f902f84ae0523ac0c792ceb21949c58be4356" Mar 18 12:34:42 crc kubenswrapper[4975]: I0318 12:34:42.137919 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 12:34:42 crc kubenswrapper[4975]: I0318 12:34:42.206691 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67fdc8889-2cm4h" event={"ID":"b6ab8f83-7ff0-46e4-9045-611e4b3b97c0","Type":"ContainerStarted","Data":"a2c9a5800bdcb7a0cd5031a70371c7c728e2969a22af2d2669571b503cacf4e5"} Mar 18 12:34:42 crc kubenswrapper[4975]: I0318 12:34:42.206744 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67fdc8889-2cm4h" event={"ID":"b6ab8f83-7ff0-46e4-9045-611e4b3b97c0","Type":"ContainerStarted","Data":"1aff05a64641a4d3811c7f31ac9aecd1bd06f160ba9373e58e3ade2981d60c95"} Mar 18 12:34:42 crc kubenswrapper[4975]: I0318 12:34:42.207178 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-67fdc8889-2cm4h" Mar 18 12:34:42 crc kubenswrapper[4975]: I0318 12:34:42.233245 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-67fdc8889-2cm4h" podStartSLOduration=3.233222739 podStartE2EDuration="3.233222739s" podCreationTimestamp="2026-03-18 12:34:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:42.226380111 +0000 UTC m=+1467.940780690" watchObservedRunningTime="2026-03-18 12:34:42.233222739 +0000 UTC m=+1467.947623338" Mar 18 12:34:42 crc kubenswrapper[4975]: I0318 12:34:42.407766 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 12:34:42 crc kubenswrapper[4975]: I0318 12:34:42.424992 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" Mar 18 12:34:42 crc kubenswrapper[4975]: I0318 12:34:42.448308 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 12:34:42 crc kubenswrapper[4975]: I0318 12:34:42.503734 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-h4v54"] Mar 18 12:34:42 crc kubenswrapper[4975]: I0318 12:34:42.513893 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-h4v54" podUID="0b6f390f-605a-4717-afea-39913c57679d" containerName="dnsmasq-dns" containerID="cri-o://8dfb7cd29dd11dd05cf0855ac3831bd69ec45aadf4dfe3423be66679885c4aa8" gracePeriod=10 Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.028812 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06e8f21f-8218-4a64-a302-0f0b8193b9c8" path="/var/lib/kubelet/pods/06e8f21f-8218-4a64-a302-0f0b8193b9c8/volumes" Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.176187 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-h4v54" Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.199252 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-ovsdbserver-sb\") pod \"0b6f390f-605a-4717-afea-39913c57679d\" (UID: \"0b6f390f-605a-4717-afea-39913c57679d\") " Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.199401 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kgfs\" (UniqueName: \"kubernetes.io/projected/0b6f390f-605a-4717-afea-39913c57679d-kube-api-access-7kgfs\") pod \"0b6f390f-605a-4717-afea-39913c57679d\" (UID: \"0b6f390f-605a-4717-afea-39913c57679d\") " Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.199431 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-config\") pod \"0b6f390f-605a-4717-afea-39913c57679d\" (UID: \"0b6f390f-605a-4717-afea-39913c57679d\") " Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.199509 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-dns-swift-storage-0\") pod \"0b6f390f-605a-4717-afea-39913c57679d\" (UID: \"0b6f390f-605a-4717-afea-39913c57679d\") " Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.199552 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-dns-svc\") pod \"0b6f390f-605a-4717-afea-39913c57679d\" (UID: \"0b6f390f-605a-4717-afea-39913c57679d\") " Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.199620 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-ovsdbserver-nb\") pod \"0b6f390f-605a-4717-afea-39913c57679d\" (UID: \"0b6f390f-605a-4717-afea-39913c57679d\") " Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.273817 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b6f390f-605a-4717-afea-39913c57679d-kube-api-access-7kgfs" (OuterVolumeSpecName: "kube-api-access-7kgfs") pod "0b6f390f-605a-4717-afea-39913c57679d" (UID: "0b6f390f-605a-4717-afea-39913c57679d"). InnerVolumeSpecName "kube-api-access-7kgfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.309247 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kgfs\" (UniqueName: \"kubernetes.io/projected/0b6f390f-605a-4717-afea-39913c57679d-kube-api-access-7kgfs\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.340143 4975 generic.go:334] "Generic (PLEG): container finished" podID="0b6f390f-605a-4717-afea-39913c57679d" containerID="8dfb7cd29dd11dd05cf0855ac3831bd69ec45aadf4dfe3423be66679885c4aa8" exitCode=0 Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.341153 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-h4v54" Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.341643 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-h4v54" event={"ID":"0b6f390f-605a-4717-afea-39913c57679d","Type":"ContainerDied","Data":"8dfb7cd29dd11dd05cf0855ac3831bd69ec45aadf4dfe3423be66679885c4aa8"} Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.341669 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-h4v54" event={"ID":"0b6f390f-605a-4717-afea-39913c57679d","Type":"ContainerDied","Data":"0859eea3706a9df1edfc322cfd13aa96f3d26ddca921445a959f6b577f14621f"} Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.341685 4975 scope.go:117] "RemoveContainer" containerID="8dfb7cd29dd11dd05cf0855ac3831bd69ec45aadf4dfe3423be66679885c4aa8" Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.342358 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd" containerName="cinder-scheduler" containerID="cri-o://642e2852d16bd31017d771c002487e405e4cae85dfc9dd86dd5a68b60b072411" gracePeriod=30 Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.342566 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd" containerName="probe" containerID="cri-o://f8b378d86645ee66d48fd1b006bf435578d9a300ec9db8a1a7f64c30f51792f3" gracePeriod=30 Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.376644 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0b6f390f-605a-4717-afea-39913c57679d" (UID: "0b6f390f-605a-4717-afea-39913c57679d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.385647 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0b6f390f-605a-4717-afea-39913c57679d" (UID: "0b6f390f-605a-4717-afea-39913c57679d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.389534 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-config" (OuterVolumeSpecName: "config") pod "0b6f390f-605a-4717-afea-39913c57679d" (UID: "0b6f390f-605a-4717-afea-39913c57679d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.406610 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0b6f390f-605a-4717-afea-39913c57679d" (UID: "0b6f390f-605a-4717-afea-39913c57679d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.412487 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.412527 4975 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.412539 4975 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.412552 4975 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.421669 4975 scope.go:117] "RemoveContainer" containerID="945e825484de7f95654c5f7f2dd28c8c859c5ee1613c98acee7057aef6450c91" Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.459884 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0b6f390f-605a-4717-afea-39913c57679d" (UID: "0b6f390f-605a-4717-afea-39913c57679d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.460053 4975 scope.go:117] "RemoveContainer" containerID="8dfb7cd29dd11dd05cf0855ac3831bd69ec45aadf4dfe3423be66679885c4aa8" Mar 18 12:34:43 crc kubenswrapper[4975]: E0318 12:34:43.466363 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dfb7cd29dd11dd05cf0855ac3831bd69ec45aadf4dfe3423be66679885c4aa8\": container with ID starting with 8dfb7cd29dd11dd05cf0855ac3831bd69ec45aadf4dfe3423be66679885c4aa8 not found: ID does not exist" containerID="8dfb7cd29dd11dd05cf0855ac3831bd69ec45aadf4dfe3423be66679885c4aa8" Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.466521 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dfb7cd29dd11dd05cf0855ac3831bd69ec45aadf4dfe3423be66679885c4aa8"} err="failed to get container status \"8dfb7cd29dd11dd05cf0855ac3831bd69ec45aadf4dfe3423be66679885c4aa8\": rpc error: code = NotFound desc = could not find container \"8dfb7cd29dd11dd05cf0855ac3831bd69ec45aadf4dfe3423be66679885c4aa8\": container with ID starting with 8dfb7cd29dd11dd05cf0855ac3831bd69ec45aadf4dfe3423be66679885c4aa8 not found: ID does not exist" Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.466684 4975 scope.go:117] "RemoveContainer" containerID="945e825484de7f95654c5f7f2dd28c8c859c5ee1613c98acee7057aef6450c91" Mar 18 12:34:43 crc kubenswrapper[4975]: E0318 12:34:43.467263 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"945e825484de7f95654c5f7f2dd28c8c859c5ee1613c98acee7057aef6450c91\": container with ID starting with 945e825484de7f95654c5f7f2dd28c8c859c5ee1613c98acee7057aef6450c91 not found: ID does not exist" containerID="945e825484de7f95654c5f7f2dd28c8c859c5ee1613c98acee7057aef6450c91" Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.467369 
4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"945e825484de7f95654c5f7f2dd28c8c859c5ee1613c98acee7057aef6450c91"} err="failed to get container status \"945e825484de7f95654c5f7f2dd28c8c859c5ee1613c98acee7057aef6450c91\": rpc error: code = NotFound desc = could not find container \"945e825484de7f95654c5f7f2dd28c8c859c5ee1613c98acee7057aef6450c91\": container with ID starting with 945e825484de7f95654c5f7f2dd28c8c859c5ee1613c98acee7057aef6450c91 not found: ID does not exist" Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.518009 4975 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b6f390f-605a-4717-afea-39913c57679d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.590982 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6dbdc7b4b-htzr4" Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.679719 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-h4v54"] Mar 18 12:34:43 crc kubenswrapper[4975]: I0318 12:34:43.689467 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-h4v54"] Mar 18 12:34:44 crc kubenswrapper[4975]: I0318 12:34:44.087426 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6dbdc7b4b-htzr4" Mar 18 12:34:44 crc kubenswrapper[4975]: I0318 12:34:44.353849 4975 generic.go:334] "Generic (PLEG): container finished" podID="24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd" containerID="f8b378d86645ee66d48fd1b006bf435578d9a300ec9db8a1a7f64c30f51792f3" exitCode=0 Mar 18 12:34:44 crc kubenswrapper[4975]: I0318 12:34:44.353896 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd","Type":"ContainerDied","Data":"f8b378d86645ee66d48fd1b006bf435578d9a300ec9db8a1a7f64c30f51792f3"} Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.027806 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b6f390f-605a-4717-afea-39913c57679d" path="/var/lib/kubelet/pods/0b6f390f-605a-4717-afea-39913c57679d/volumes" Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.334714 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.421604 4975 generic.go:334] "Generic (PLEG): container finished" podID="24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd" containerID="642e2852d16bd31017d771c002487e405e4cae85dfc9dd86dd5a68b60b072411" exitCode=0 Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.421651 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd","Type":"ContainerDied","Data":"642e2852d16bd31017d771c002487e405e4cae85dfc9dd86dd5a68b60b072411"} Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.421678 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd","Type":"ContainerDied","Data":"0c61d8a7b950372a71dca37bef9dd12650e3a4beaea0431763220f75404b21b7"} Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.421733 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c61d8a7b950372a71dca37bef9dd12650e3a4beaea0431763220f75404b21b7" Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.574605 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.666431 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-scripts\") pod \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\" (UID: \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\") "
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.666564 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-combined-ca-bundle\") pod \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\" (UID: \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\") "
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.666600 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv9vv\" (UniqueName: \"kubernetes.io/projected/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-kube-api-access-vv9vv\") pod \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\" (UID: \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\") "
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.666626 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-config-data-custom\") pod \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\" (UID: \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\") "
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.666650 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-etc-machine-id\") pod \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\" (UID: \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\") "
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.666725 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-config-data\") pod \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\" (UID: \"24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd\") "
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.669226 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd" (UID: "24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.673827 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd" (UID: "24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.674300 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-kube-api-access-vv9vv" (OuterVolumeSpecName: "kube-api-access-vv9vv") pod "24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd" (UID: "24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd"). InnerVolumeSpecName "kube-api-access-vv9vv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.676734 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-scripts" (OuterVolumeSpecName: "scripts") pod "24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd" (UID: "24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.735142 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd" (UID: "24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.770253 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.770303 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv9vv\" (UniqueName: \"kubernetes.io/projected/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-kube-api-access-vv9vv\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.770321 4975 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.770362 4975 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.770374 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.808060 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-config-data" (OuterVolumeSpecName: "config-data") pod "24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd" (UID: "24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.871743 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.950582 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-645d765cf7-6vwpp"
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.972895 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-internal-tls-certs\") pod \"136388c3-08f6-404b-9a43-d68687d762c8\" (UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") "
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.972990 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-public-tls-certs\") pod \"136388c3-08f6-404b-9a43-d68687d762c8\" (UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") "
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.973113 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb7bf\" (UniqueName: \"kubernetes.io/projected/136388c3-08f6-404b-9a43-d68687d762c8-kube-api-access-bb7bf\") pod \"136388c3-08f6-404b-9a43-d68687d762c8\" (UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") "
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.973147 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-combined-ca-bundle\") pod \"136388c3-08f6-404b-9a43-d68687d762c8\" (UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") "
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.973171 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-config\") pod \"136388c3-08f6-404b-9a43-d68687d762c8\" (UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") "
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.973201 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-ovndb-tls-certs\") pod \"136388c3-08f6-404b-9a43-d68687d762c8\" (UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") "
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.973345 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-httpd-config\") pod \"136388c3-08f6-404b-9a43-d68687d762c8\" (UID: \"136388c3-08f6-404b-9a43-d68687d762c8\") "
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.978939 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "136388c3-08f6-404b-9a43-d68687d762c8" (UID: "136388c3-08f6-404b-9a43-d68687d762c8"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:34:45 crc kubenswrapper[4975]: I0318 12:34:45.994074 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/136388c3-08f6-404b-9a43-d68687d762c8-kube-api-access-bb7bf" (OuterVolumeSpecName: "kube-api-access-bb7bf") pod "136388c3-08f6-404b-9a43-d68687d762c8" (UID: "136388c3-08f6-404b-9a43-d68687d762c8"). InnerVolumeSpecName "kube-api-access-bb7bf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.031205 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-config" (OuterVolumeSpecName: "config") pod "136388c3-08f6-404b-9a43-d68687d762c8" (UID: "136388c3-08f6-404b-9a43-d68687d762c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.036815 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "136388c3-08f6-404b-9a43-d68687d762c8" (UID: "136388c3-08f6-404b-9a43-d68687d762c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.039888 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "136388c3-08f6-404b-9a43-d68687d762c8" (UID: "136388c3-08f6-404b-9a43-d68687d762c8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.040216 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "136388c3-08f6-404b-9a43-d68687d762c8" (UID: "136388c3-08f6-404b-9a43-d68687d762c8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.069676 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "136388c3-08f6-404b-9a43-d68687d762c8" (UID: "136388c3-08f6-404b-9a43-d68687d762c8"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.076134 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb7bf\" (UniqueName: \"kubernetes.io/projected/136388c3-08f6-404b-9a43-d68687d762c8-kube-api-access-bb7bf\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.076181 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.076190 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-config\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.076201 4975 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-ovndb-tls-certs\") on node \"crc\" DevicePath 
\"\""
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.076209 4975 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.076218 4975 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.076227 4975 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/136388c3-08f6-404b-9a43-d68687d762c8-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.431922 4975 generic.go:334] "Generic (PLEG): container finished" podID="136388c3-08f6-404b-9a43-d68687d762c8" containerID="0b002b0ac8c9c3a8c4697ac79ac60aad1dddf8d8c6ba6033e81207d9607b7e11" exitCode=0
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.432008 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-645d765cf7-6vwpp"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.432028 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-645d765cf7-6vwpp" event={"ID":"136388c3-08f6-404b-9a43-d68687d762c8","Type":"ContainerDied","Data":"0b002b0ac8c9c3a8c4697ac79ac60aad1dddf8d8c6ba6033e81207d9607b7e11"}
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.432879 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-645d765cf7-6vwpp" event={"ID":"136388c3-08f6-404b-9a43-d68687d762c8","Type":"ContainerDied","Data":"4209b872a372909b1960ee78ff42c2c22dbe23136ba4c6a905c11083485b648e"}
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.432904 4975 scope.go:117] "RemoveContainer" containerID="6f727425465df55d2fac6f8f433eef3e30b04a52f9113ea6c12842151987a9bf"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.450681 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.455014 4975 scope.go:117] "RemoveContainer" containerID="0b002b0ac8c9c3a8c4697ac79ac60aad1dddf8d8c6ba6033e81207d9607b7e11"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.506931 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-645d765cf7-6vwpp"]
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.511028 4975 scope.go:117] "RemoveContainer" containerID="6f727425465df55d2fac6f8f433eef3e30b04a52f9113ea6c12842151987a9bf"
Mar 18 12:34:46 crc kubenswrapper[4975]: E0318 12:34:46.515024 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f727425465df55d2fac6f8f433eef3e30b04a52f9113ea6c12842151987a9bf\": container with ID starting with 6f727425465df55d2fac6f8f433eef3e30b04a52f9113ea6c12842151987a9bf not found: ID does not exist" containerID="6f727425465df55d2fac6f8f433eef3e30b04a52f9113ea6c12842151987a9bf"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.515093 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f727425465df55d2fac6f8f433eef3e30b04a52f9113ea6c12842151987a9bf"} err="failed to get container status \"6f727425465df55d2fac6f8f433eef3e30b04a52f9113ea6c12842151987a9bf\": rpc error: code = NotFound desc = could not find container \"6f727425465df55d2fac6f8f433eef3e30b04a52f9113ea6c12842151987a9bf\": container with ID starting with 6f727425465df55d2fac6f8f433eef3e30b04a52f9113ea6c12842151987a9bf not found: ID does not exist"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.515129 4975 scope.go:117] "RemoveContainer" containerID="0b002b0ac8c9c3a8c4697ac79ac60aad1dddf8d8c6ba6033e81207d9607b7e11"
Mar 18 12:34:46 crc kubenswrapper[4975]: E0318 12:34:46.517160 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b002b0ac8c9c3a8c4697ac79ac60aad1dddf8d8c6ba6033e81207d9607b7e11\": container with ID starting with 0b002b0ac8c9c3a8c4697ac79ac60aad1dddf8d8c6ba6033e81207d9607b7e11 not found: ID does not exist" containerID="0b002b0ac8c9c3a8c4697ac79ac60aad1dddf8d8c6ba6033e81207d9607b7e11"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.517227 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b002b0ac8c9c3a8c4697ac79ac60aad1dddf8d8c6ba6033e81207d9607b7e11"} err="failed to get container status \"0b002b0ac8c9c3a8c4697ac79ac60aad1dddf8d8c6ba6033e81207d9607b7e11\": rpc error: code = NotFound desc = could not find container \"0b002b0ac8c9c3a8c4697ac79ac60aad1dddf8d8c6ba6033e81207d9607b7e11\": container with ID starting with 0b002b0ac8c9c3a8c4697ac79ac60aad1dddf8d8c6ba6033e81207d9607b7e11 not found: ID does not exist"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.523750 4975 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/neutron-645d765cf7-6vwpp"]
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.536028 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.543738 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.553690 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 18 12:34:46 crc kubenswrapper[4975]: E0318 12:34:46.554108 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b6f390f-605a-4717-afea-39913c57679d" containerName="dnsmasq-dns"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.554127 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b6f390f-605a-4717-afea-39913c57679d" containerName="dnsmasq-dns"
Mar 18 12:34:46 crc kubenswrapper[4975]: E0318 12:34:46.554149 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="136388c3-08f6-404b-9a43-d68687d762c8" containerName="neutron-api"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.554160 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="136388c3-08f6-404b-9a43-d68687d762c8" containerName="neutron-api"
Mar 18 12:34:46 crc kubenswrapper[4975]: E0318 12:34:46.554172 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd" containerName="cinder-scheduler"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.554180 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd" containerName="cinder-scheduler"
Mar 18 12:34:46 crc kubenswrapper[4975]: E0318 12:34:46.554192 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06e8f21f-8218-4a64-a302-0f0b8193b9c8" containerName="horizon-log"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.554199 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="06e8f21f-8218-4a64-a302-0f0b8193b9c8" containerName="horizon-log"
Mar 18 12:34:46 crc kubenswrapper[4975]: E0318 12:34:46.554218 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="136388c3-08f6-404b-9a43-d68687d762c8" containerName="neutron-httpd"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.554225 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="136388c3-08f6-404b-9a43-d68687d762c8" containerName="neutron-httpd"
Mar 18 12:34:46 crc kubenswrapper[4975]: E0318 12:34:46.554238 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd" containerName="probe"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.554247 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd" containerName="probe"
Mar 18 12:34:46 crc kubenswrapper[4975]: E0318 12:34:46.554259 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06e8f21f-8218-4a64-a302-0f0b8193b9c8" containerName="horizon"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.554265 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="06e8f21f-8218-4a64-a302-0f0b8193b9c8" containerName="horizon"
Mar 18 12:34:46 crc kubenswrapper[4975]: E0318 12:34:46.554295 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b6f390f-605a-4717-afea-39913c57679d" containerName="init"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.554302 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b6f390f-605a-4717-afea-39913c57679d" containerName="init"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.554515 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd" containerName="probe"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.554528 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="06e8f21f-8218-4a64-a302-0f0b8193b9c8" containerName="horizon-log"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.554555 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="136388c3-08f6-404b-9a43-d68687d762c8" containerName="neutron-api"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.554567 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="136388c3-08f6-404b-9a43-d68687d762c8" containerName="neutron-httpd"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.554583 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd" containerName="cinder-scheduler"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.554600 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="06e8f21f-8218-4a64-a302-0f0b8193b9c8" containerName="horizon"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.554613 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b6f390f-605a-4717-afea-39913c57679d" containerName="dnsmasq-dns"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.555738 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.558621 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.562557 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.586214 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d63c8853-95df-41fd-99e4-ff384da2ef6f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d63c8853-95df-41fd-99e4-ff384da2ef6f\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.586266 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d63c8853-95df-41fd-99e4-ff384da2ef6f-config-data\") pod \"cinder-scheduler-0\" (UID: \"d63c8853-95df-41fd-99e4-ff384da2ef6f\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.586316 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d63c8853-95df-41fd-99e4-ff384da2ef6f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d63c8853-95df-41fd-99e4-ff384da2ef6f\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.586387 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tbzv\" (UniqueName: \"kubernetes.io/projected/d63c8853-95df-41fd-99e4-ff384da2ef6f-kube-api-access-7tbzv\") pod \"cinder-scheduler-0\" (UID: \"d63c8853-95df-41fd-99e4-ff384da2ef6f\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.586416 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d63c8853-95df-41fd-99e4-ff384da2ef6f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d63c8853-95df-41fd-99e4-ff384da2ef6f\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.586639 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d63c8853-95df-41fd-99e4-ff384da2ef6f-scripts\") pod \"cinder-scheduler-0\" (UID: \"d63c8853-95df-41fd-99e4-ff384da2ef6f\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.688735 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tbzv\" (UniqueName: \"kubernetes.io/projected/d63c8853-95df-41fd-99e4-ff384da2ef6f-kube-api-access-7tbzv\") pod \"cinder-scheduler-0\" (UID: \"d63c8853-95df-41fd-99e4-ff384da2ef6f\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.688805 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d63c8853-95df-41fd-99e4-ff384da2ef6f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d63c8853-95df-41fd-99e4-ff384da2ef6f\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.688904 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d63c8853-95df-41fd-99e4-ff384da2ef6f-scripts\") pod \"cinder-scheduler-0\" (UID: \"d63c8853-95df-41fd-99e4-ff384da2ef6f\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.688971 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d63c8853-95df-41fd-99e4-ff384da2ef6f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d63c8853-95df-41fd-99e4-ff384da2ef6f\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.689001 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d63c8853-95df-41fd-99e4-ff384da2ef6f-config-data\") pod \"cinder-scheduler-0\" (UID: \"d63c8853-95df-41fd-99e4-ff384da2ef6f\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.689040 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d63c8853-95df-41fd-99e4-ff384da2ef6f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d63c8853-95df-41fd-99e4-ff384da2ef6f\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.689064 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d63c8853-95df-41fd-99e4-ff384da2ef6f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d63c8853-95df-41fd-99e4-ff384da2ef6f\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.693025 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d63c8853-95df-41fd-99e4-ff384da2ef6f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d63c8853-95df-41fd-99e4-ff384da2ef6f\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.693202 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d63c8853-95df-41fd-99e4-ff384da2ef6f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d63c8853-95df-41fd-99e4-ff384da2ef6f\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.693403 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d63c8853-95df-41fd-99e4-ff384da2ef6f-scripts\") pod \"cinder-scheduler-0\" (UID: \"d63c8853-95df-41fd-99e4-ff384da2ef6f\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.695111 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d63c8853-95df-41fd-99e4-ff384da2ef6f-config-data\") pod \"cinder-scheduler-0\" (UID: \"d63c8853-95df-41fd-99e4-ff384da2ef6f\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.707393 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tbzv\" (UniqueName: \"kubernetes.io/projected/d63c8853-95df-41fd-99e4-ff384da2ef6f-kube-api-access-7tbzv\") pod \"cinder-scheduler-0\" (UID: \"d63c8853-95df-41fd-99e4-ff384da2ef6f\") " pod="openstack/cinder-scheduler-0"
Mar 18 12:34:46 crc kubenswrapper[4975]: I0318 12:34:46.880665 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 18 12:34:47 crc kubenswrapper[4975]: I0318 12:34:47.031327 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="136388c3-08f6-404b-9a43-d68687d762c8" path="/var/lib/kubelet/pods/136388c3-08f6-404b-9a43-d68687d762c8/volumes"
Mar 18 12:34:47 crc kubenswrapper[4975]: I0318 12:34:47.031905 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd" path="/var/lib/kubelet/pods/24efd9c2-0e78-48b1-a3cf-f0ddfc691fbd/volumes"
Mar 18 12:34:47 crc kubenswrapper[4975]: I0318 12:34:47.372332 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 18 12:34:47 crc kubenswrapper[4975]: I0318 12:34:47.457669 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d63c8853-95df-41fd-99e4-ff384da2ef6f","Type":"ContainerStarted","Data":"f0fb6c0768aee7427124482aa9727b4e1214832dc911cce624f2343c91572be2"}
Mar 18 12:34:48 crc kubenswrapper[4975]: I0318 12:34:48.481076 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d63c8853-95df-41fd-99e4-ff384da2ef6f","Type":"ContainerStarted","Data":"0bdfc3fce28e3b7d04e66e48e513400860d7c04bc99a29a97124d5aa63149e1a"}
Mar 18 12:34:48 crc kubenswrapper[4975]: I0318 12:34:48.891711 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-56bcd48494-744wv"
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.279483 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-587979d76d-qg8cs"
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.337668 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-587979d76d-qg8cs"
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.490963 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d63c8853-95df-41fd-99e4-ff384da2ef6f","Type":"ContainerStarted","Data":"ddde94e70c50741b1a5329d2bed7ad137fd6f7afc8a38dca435ab6b85c73d224"}
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.510781 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.510758588 podStartE2EDuration="3.510758588s" podCreationTimestamp="2026-03-18 12:34:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:49.510234264 +0000 UTC m=+1475.224634843" watchObservedRunningTime="2026-03-18 12:34:49.510758588 +0000 UTC m=+1475.225159167"
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.582557 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-55d747f656-4fh7c"]
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.585434 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-55d747f656-4fh7c"
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.628394 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55d747f656-4fh7c"]
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.644980 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc4jf\" (UniqueName: \"kubernetes.io/projected/2615ee1a-a138-4ace-88dd-bda440399db9-kube-api-access-jc4jf\") pod \"placement-55d747f656-4fh7c\" (UID: \"2615ee1a-a138-4ace-88dd-bda440399db9\") " pod="openstack/placement-55d747f656-4fh7c"
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.645299 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2615ee1a-a138-4ace-88dd-bda440399db9-scripts\") pod \"placement-55d747f656-4fh7c\" (UID: \"2615ee1a-a138-4ace-88dd-bda440399db9\") " pod="openstack/placement-55d747f656-4fh7c"
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.645410 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2615ee1a-a138-4ace-88dd-bda440399db9-logs\") pod \"placement-55d747f656-4fh7c\" (UID: \"2615ee1a-a138-4ace-88dd-bda440399db9\") " pod="openstack/placement-55d747f656-4fh7c"
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.645497 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2615ee1a-a138-4ace-88dd-bda440399db9-internal-tls-certs\") pod \"placement-55d747f656-4fh7c\" (UID: \"2615ee1a-a138-4ace-88dd-bda440399db9\") " pod="openstack/placement-55d747f656-4fh7c"
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.650007 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2615ee1a-a138-4ace-88dd-bda440399db9-combined-ca-bundle\") pod \"placement-55d747f656-4fh7c\" (UID: \"2615ee1a-a138-4ace-88dd-bda440399db9\") " pod="openstack/placement-55d747f656-4fh7c"
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.650290 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2615ee1a-a138-4ace-88dd-bda440399db9-config-data\") pod \"placement-55d747f656-4fh7c\" (UID: \"2615ee1a-a138-4ace-88dd-bda440399db9\") " pod="openstack/placement-55d747f656-4fh7c"
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.650425 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2615ee1a-a138-4ace-88dd-bda440399db9-public-tls-certs\") pod \"placement-55d747f656-4fh7c\" (UID: \"2615ee1a-a138-4ace-88dd-bda440399db9\") " pod="openstack/placement-55d747f656-4fh7c"
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.767442 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2615ee1a-a138-4ace-88dd-bda440399db9-public-tls-certs\") pod \"placement-55d747f656-4fh7c\" (UID: \"2615ee1a-a138-4ace-88dd-bda440399db9\") " pod="openstack/placement-55d747f656-4fh7c"
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.768095 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc4jf\" (UniqueName: \"kubernetes.io/projected/2615ee1a-a138-4ace-88dd-bda440399db9-kube-api-access-jc4jf\") pod \"placement-55d747f656-4fh7c\" (UID: \"2615ee1a-a138-4ace-88dd-bda440399db9\") " pod="openstack/placement-55d747f656-4fh7c"
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.768295 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2615ee1a-a138-4ace-88dd-bda440399db9-scripts\") pod \"placement-55d747f656-4fh7c\" (UID: \"2615ee1a-a138-4ace-88dd-bda440399db9\") " pod="openstack/placement-55d747f656-4fh7c"
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.768450 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2615ee1a-a138-4ace-88dd-bda440399db9-logs\") pod \"placement-55d747f656-4fh7c\" (UID: \"2615ee1a-a138-4ace-88dd-bda440399db9\") " pod="openstack/placement-55d747f656-4fh7c"
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.768593 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2615ee1a-a138-4ace-88dd-bda440399db9-internal-tls-certs\") pod \"placement-55d747f656-4fh7c\" (UID: \"2615ee1a-a138-4ace-88dd-bda440399db9\") " pod="openstack/placement-55d747f656-4fh7c"
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.768727 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2615ee1a-a138-4ace-88dd-bda440399db9-combined-ca-bundle\") pod \"placement-55d747f656-4fh7c\" (UID: \"2615ee1a-a138-4ace-88dd-bda440399db9\") " pod="openstack/placement-55d747f656-4fh7c"
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.768841 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2615ee1a-a138-4ace-88dd-bda440399db9-config-data\") pod \"placement-55d747f656-4fh7c\" (UID: \"2615ee1a-a138-4ace-88dd-bda440399db9\") " pod="openstack/placement-55d747f656-4fh7c"
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.778392 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2615ee1a-a138-4ace-88dd-bda440399db9-logs\") pod \"placement-55d747f656-4fh7c\" (UID: \"2615ee1a-a138-4ace-88dd-bda440399db9\") " pod="openstack/placement-55d747f656-4fh7c"
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.783607 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2615ee1a-a138-4ace-88dd-bda440399db9-combined-ca-bundle\") pod \"placement-55d747f656-4fh7c\" (UID: \"2615ee1a-a138-4ace-88dd-bda440399db9\") " pod="openstack/placement-55d747f656-4fh7c"
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.800298 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2615ee1a-a138-4ace-88dd-bda440399db9-scripts\") pod \"placement-55d747f656-4fh7c\" (UID: \"2615ee1a-a138-4ace-88dd-bda440399db9\") " pod="openstack/placement-55d747f656-4fh7c"
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.800768 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2615ee1a-a138-4ace-88dd-bda440399db9-config-data\") pod \"placement-55d747f656-4fh7c\" (UID: \"2615ee1a-a138-4ace-88dd-bda440399db9\") " pod="openstack/placement-55d747f656-4fh7c"
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.801305 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc4jf\" (UniqueName: \"kubernetes.io/projected/2615ee1a-a138-4ace-88dd-bda440399db9-kube-api-access-jc4jf\") pod \"placement-55d747f656-4fh7c\" (UID: \"2615ee1a-a138-4ace-88dd-bda440399db9\") " pod="openstack/placement-55d747f656-4fh7c"
Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.804392 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2615ee1a-a138-4ace-88dd-bda440399db9-public-tls-certs\") pod \"placement-55d747f656-4fh7c\" (UID: \"2615ee1a-a138-4ace-88dd-bda440399db9\") " pod="openstack/placement-55d747f656-4fh7c"
Mar 18 12:34:49 crc 
kubenswrapper[4975]: I0318 12:34:49.814502 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2615ee1a-a138-4ace-88dd-bda440399db9-internal-tls-certs\") pod \"placement-55d747f656-4fh7c\" (UID: \"2615ee1a-a138-4ace-88dd-bda440399db9\") " pod="openstack/placement-55d747f656-4fh7c" Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.873928 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:34:49 crc kubenswrapper[4975]: I0318 12:34:49.903411 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-55d747f656-4fh7c" Mar 18 12:34:50 crc kubenswrapper[4975]: I0318 12:34:50.536371 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55d747f656-4fh7c"] Mar 18 12:34:51 crc kubenswrapper[4975]: I0318 12:34:51.220924 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:51 crc kubenswrapper[4975]: I0318 12:34:51.268713 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b44bbdbf8-vkj8f" Mar 18 12:34:51 crc kubenswrapper[4975]: I0318 12:34:51.335173 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6dbdc7b4b-htzr4"] Mar 18 12:34:51 crc kubenswrapper[4975]: I0318 12:34:51.335665 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6dbdc7b4b-htzr4" podUID="5bae225c-6f49-427e-b276-8b0ffdff525f" containerName="barbican-api-log" containerID="cri-o://478fa8c0f210ab3779ca88d517690908ac3aec64b8c063aa7f96a342a467df93" gracePeriod=30 Mar 18 12:34:51 crc kubenswrapper[4975]: I0318 12:34:51.336116 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6dbdc7b4b-htzr4" 
podUID="5bae225c-6f49-427e-b276-8b0ffdff525f" containerName="barbican-api" containerID="cri-o://990b71650fdbf831a485b7015b36af538d557ca29dfc832424d0a94919fe9601" gracePeriod=30 Mar 18 12:34:51 crc kubenswrapper[4975]: I0318 12:34:51.514163 4975 generic.go:334] "Generic (PLEG): container finished" podID="5bae225c-6f49-427e-b276-8b0ffdff525f" containerID="478fa8c0f210ab3779ca88d517690908ac3aec64b8c063aa7f96a342a467df93" exitCode=143 Mar 18 12:34:51 crc kubenswrapper[4975]: I0318 12:34:51.514236 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dbdc7b4b-htzr4" event={"ID":"5bae225c-6f49-427e-b276-8b0ffdff525f","Type":"ContainerDied","Data":"478fa8c0f210ab3779ca88d517690908ac3aec64b8c063aa7f96a342a467df93"} Mar 18 12:34:51 crc kubenswrapper[4975]: I0318 12:34:51.522357 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55d747f656-4fh7c" event={"ID":"2615ee1a-a138-4ace-88dd-bda440399db9","Type":"ContainerStarted","Data":"45e5c0df0fcfa284612e3159f559ee8c6484732e97de62598c01450d1f54d7ec"} Mar 18 12:34:51 crc kubenswrapper[4975]: I0318 12:34:51.522419 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55d747f656-4fh7c" event={"ID":"2615ee1a-a138-4ace-88dd-bda440399db9","Type":"ContainerStarted","Data":"518db8588c9a6fe5375d54f460dbc8f6adbafc6313dc29ca8b41872a01e35d84"} Mar 18 12:34:51 crc kubenswrapper[4975]: I0318 12:34:51.522431 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55d747f656-4fh7c" event={"ID":"2615ee1a-a138-4ace-88dd-bda440399db9","Type":"ContainerStarted","Data":"31749ca3889ffec4f9ccd80ae71c69f211080c89179a484ce659475d79ff1521"} Mar 18 12:34:51 crc kubenswrapper[4975]: I0318 12:34:51.562149 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-55d747f656-4fh7c" podStartSLOduration=2.562128867 podStartE2EDuration="2.562128867s" podCreationTimestamp="2026-03-18 12:34:49 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:34:51.544048159 +0000 UTC m=+1477.258448748" watchObservedRunningTime="2026-03-18 12:34:51.562128867 +0000 UTC m=+1477.276529446" Mar 18 12:34:51 crc kubenswrapper[4975]: I0318 12:34:51.687819 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-56bcd48494-744wv" Mar 18 12:34:51 crc kubenswrapper[4975]: I0318 12:34:51.786476 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-c478d4794-x2t7q"] Mar 18 12:34:51 crc kubenswrapper[4975]: I0318 12:34:51.786698 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-c478d4794-x2t7q" podUID="ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7" containerName="horizon-log" containerID="cri-o://6b28ba84cab9eb6dd833f8a2c43c58ed4371a9c34e96c677b896512ab2e431d3" gracePeriod=30 Mar 18 12:34:51 crc kubenswrapper[4975]: I0318 12:34:51.786838 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-c478d4794-x2t7q" podUID="ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7" containerName="horizon" containerID="cri-o://085c3bcf8fe1108084d797f0f69756fc886932b4a0540f844652adcd1deb2117" gracePeriod=30 Mar 18 12:34:51 crc kubenswrapper[4975]: I0318 12:34:51.810522 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-c478d4794-x2t7q" podUID="ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Mar 18 12:34:51 crc kubenswrapper[4975]: I0318 12:34:51.881649 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 12:34:52 crc kubenswrapper[4975]: I0318 12:34:52.531693 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55d747f656-4fh7c" 
Mar 18 12:34:52 crc kubenswrapper[4975]: I0318 12:34:52.532041 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55d747f656-4fh7c" Mar 18 12:34:54 crc kubenswrapper[4975]: I0318 12:34:54.447425 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-f8786c66f-vjkt4" Mar 18 12:34:54 crc kubenswrapper[4975]: I0318 12:34:54.773044 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6dbdc7b4b-htzr4" podUID="5bae225c-6f49-427e-b276-8b0ffdff525f" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.168:9311/healthcheck\": read tcp 10.217.0.2:46484->10.217.0.168:9311: read: connection reset by peer" Mar 18 12:34:54 crc kubenswrapper[4975]: I0318 12:34:54.773094 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6dbdc7b4b-htzr4" podUID="5bae225c-6f49-427e-b276-8b0ffdff525f" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.168:9311/healthcheck\": read tcp 10.217.0.2:46480->10.217.0.168:9311: read: connection reset by peer" Mar 18 12:34:54 crc kubenswrapper[4975]: I0318 12:34:54.989901 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 18 12:34:54 crc kubenswrapper[4975]: I0318 12:34:54.991193 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 12:34:54 crc kubenswrapper[4975]: I0318 12:34:54.993263 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-68dsp" Mar 18 12:34:54 crc kubenswrapper[4975]: I0318 12:34:54.993473 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 18 12:34:54 crc kubenswrapper[4975]: I0318 12:34:54.993610 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 18 12:34:54 crc kubenswrapper[4975]: I0318 12:34:54.999379 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.021228 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fsrb\" (UniqueName: \"kubernetes.io/projected/eedc82f6-6487-41cf-b618-db58be6f1eed-kube-api-access-8fsrb\") pod \"openstackclient\" (UID: \"eedc82f6-6487-41cf-b618-db58be6f1eed\") " pod="openstack/openstackclient" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.021287 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/eedc82f6-6487-41cf-b618-db58be6f1eed-openstack-config-secret\") pod \"openstackclient\" (UID: \"eedc82f6-6487-41cf-b618-db58be6f1eed\") " pod="openstack/openstackclient" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.021315 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/eedc82f6-6487-41cf-b618-db58be6f1eed-openstack-config\") pod \"openstackclient\" (UID: \"eedc82f6-6487-41cf-b618-db58be6f1eed\") " pod="openstack/openstackclient" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.021336 4975 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedc82f6-6487-41cf-b618-db58be6f1eed-combined-ca-bundle\") pod \"openstackclient\" (UID: \"eedc82f6-6487-41cf-b618-db58be6f1eed\") " pod="openstack/openstackclient" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.140130 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fsrb\" (UniqueName: \"kubernetes.io/projected/eedc82f6-6487-41cf-b618-db58be6f1eed-kube-api-access-8fsrb\") pod \"openstackclient\" (UID: \"eedc82f6-6487-41cf-b618-db58be6f1eed\") " pod="openstack/openstackclient" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.140201 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/eedc82f6-6487-41cf-b618-db58be6f1eed-openstack-config-secret\") pod \"openstackclient\" (UID: \"eedc82f6-6487-41cf-b618-db58be6f1eed\") " pod="openstack/openstackclient" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.140235 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/eedc82f6-6487-41cf-b618-db58be6f1eed-openstack-config\") pod \"openstackclient\" (UID: \"eedc82f6-6487-41cf-b618-db58be6f1eed\") " pod="openstack/openstackclient" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.140266 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedc82f6-6487-41cf-b618-db58be6f1eed-combined-ca-bundle\") pod \"openstackclient\" (UID: \"eedc82f6-6487-41cf-b618-db58be6f1eed\") " pod="openstack/openstackclient" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.142112 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/eedc82f6-6487-41cf-b618-db58be6f1eed-openstack-config\") pod \"openstackclient\" (UID: \"eedc82f6-6487-41cf-b618-db58be6f1eed\") " pod="openstack/openstackclient" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.159692 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/eedc82f6-6487-41cf-b618-db58be6f1eed-openstack-config-secret\") pod \"openstackclient\" (UID: \"eedc82f6-6487-41cf-b618-db58be6f1eed\") " pod="openstack/openstackclient" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.164358 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedc82f6-6487-41cf-b618-db58be6f1eed-combined-ca-bundle\") pod \"openstackclient\" (UID: \"eedc82f6-6487-41cf-b618-db58be6f1eed\") " pod="openstack/openstackclient" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.188593 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fsrb\" (UniqueName: \"kubernetes.io/projected/eedc82f6-6487-41cf-b618-db58be6f1eed-kube-api-access-8fsrb\") pod \"openstackclient\" (UID: \"eedc82f6-6487-41cf-b618-db58be6f1eed\") " pod="openstack/openstackclient" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.312201 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.403079 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6dbdc7b4b-htzr4" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.448409 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bae225c-6f49-427e-b276-8b0ffdff525f-config-data\") pod \"5bae225c-6f49-427e-b276-8b0ffdff525f\" (UID: \"5bae225c-6f49-427e-b276-8b0ffdff525f\") " Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.448460 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bae225c-6f49-427e-b276-8b0ffdff525f-combined-ca-bundle\") pod \"5bae225c-6f49-427e-b276-8b0ffdff525f\" (UID: \"5bae225c-6f49-427e-b276-8b0ffdff525f\") " Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.448504 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bae225c-6f49-427e-b276-8b0ffdff525f-config-data-custom\") pod \"5bae225c-6f49-427e-b276-8b0ffdff525f\" (UID: \"5bae225c-6f49-427e-b276-8b0ffdff525f\") " Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.448557 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdj4g\" (UniqueName: \"kubernetes.io/projected/5bae225c-6f49-427e-b276-8b0ffdff525f-kube-api-access-wdj4g\") pod \"5bae225c-6f49-427e-b276-8b0ffdff525f\" (UID: \"5bae225c-6f49-427e-b276-8b0ffdff525f\") " Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.448577 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bae225c-6f49-427e-b276-8b0ffdff525f-logs\") pod \"5bae225c-6f49-427e-b276-8b0ffdff525f\" (UID: \"5bae225c-6f49-427e-b276-8b0ffdff525f\") " Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.449721 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5bae225c-6f49-427e-b276-8b0ffdff525f-logs" (OuterVolumeSpecName: "logs") pod "5bae225c-6f49-427e-b276-8b0ffdff525f" (UID: "5bae225c-6f49-427e-b276-8b0ffdff525f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.463262 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bae225c-6f49-427e-b276-8b0ffdff525f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5bae225c-6f49-427e-b276-8b0ffdff525f" (UID: "5bae225c-6f49-427e-b276-8b0ffdff525f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.464706 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bae225c-6f49-427e-b276-8b0ffdff525f-kube-api-access-wdj4g" (OuterVolumeSpecName: "kube-api-access-wdj4g") pod "5bae225c-6f49-427e-b276-8b0ffdff525f" (UID: "5bae225c-6f49-427e-b276-8b0ffdff525f"). InnerVolumeSpecName "kube-api-access-wdj4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.505985 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bae225c-6f49-427e-b276-8b0ffdff525f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5bae225c-6f49-427e-b276-8b0ffdff525f" (UID: "5bae225c-6f49-427e-b276-8b0ffdff525f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.506438 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bae225c-6f49-427e-b276-8b0ffdff525f-config-data" (OuterVolumeSpecName: "config-data") pod "5bae225c-6f49-427e-b276-8b0ffdff525f" (UID: "5bae225c-6f49-427e-b276-8b0ffdff525f"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.550427 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bae225c-6f49-427e-b276-8b0ffdff525f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.550458 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bae225c-6f49-427e-b276-8b0ffdff525f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.550468 4975 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bae225c-6f49-427e-b276-8b0ffdff525f-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.550477 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdj4g\" (UniqueName: \"kubernetes.io/projected/5bae225c-6f49-427e-b276-8b0ffdff525f-kube-api-access-wdj4g\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.550488 4975 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bae225c-6f49-427e-b276-8b0ffdff525f-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.587621 4975 generic.go:334] "Generic (PLEG): container finished" podID="5bae225c-6f49-427e-b276-8b0ffdff525f" containerID="990b71650fdbf831a485b7015b36af538d557ca29dfc832424d0a94919fe9601" exitCode=0 Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.587704 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dbdc7b4b-htzr4" event={"ID":"5bae225c-6f49-427e-b276-8b0ffdff525f","Type":"ContainerDied","Data":"990b71650fdbf831a485b7015b36af538d557ca29dfc832424d0a94919fe9601"} Mar 18 
12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.587729 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dbdc7b4b-htzr4" event={"ID":"5bae225c-6f49-427e-b276-8b0ffdff525f","Type":"ContainerDied","Data":"f4e4efe5b629031fa517717abf21698944718c9d78185e72d9061e46578d7076"} Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.587745 4975 scope.go:117] "RemoveContainer" containerID="990b71650fdbf831a485b7015b36af538d557ca29dfc832424d0a94919fe9601" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.587920 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6dbdc7b4b-htzr4" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.621810 4975 scope.go:117] "RemoveContainer" containerID="478fa8c0f210ab3779ca88d517690908ac3aec64b8c063aa7f96a342a467df93" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.625134 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6dbdc7b4b-htzr4"] Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.637334 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6dbdc7b4b-htzr4"] Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.646631 4975 scope.go:117] "RemoveContainer" containerID="990b71650fdbf831a485b7015b36af538d557ca29dfc832424d0a94919fe9601" Mar 18 12:34:55 crc kubenswrapper[4975]: E0318 12:34:55.647141 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"990b71650fdbf831a485b7015b36af538d557ca29dfc832424d0a94919fe9601\": container with ID starting with 990b71650fdbf831a485b7015b36af538d557ca29dfc832424d0a94919fe9601 not found: ID does not exist" containerID="990b71650fdbf831a485b7015b36af538d557ca29dfc832424d0a94919fe9601" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.647187 4975 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"990b71650fdbf831a485b7015b36af538d557ca29dfc832424d0a94919fe9601"} err="failed to get container status \"990b71650fdbf831a485b7015b36af538d557ca29dfc832424d0a94919fe9601\": rpc error: code = NotFound desc = could not find container \"990b71650fdbf831a485b7015b36af538d557ca29dfc832424d0a94919fe9601\": container with ID starting with 990b71650fdbf831a485b7015b36af538d557ca29dfc832424d0a94919fe9601 not found: ID does not exist" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.647213 4975 scope.go:117] "RemoveContainer" containerID="478fa8c0f210ab3779ca88d517690908ac3aec64b8c063aa7f96a342a467df93" Mar 18 12:34:55 crc kubenswrapper[4975]: E0318 12:34:55.647507 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"478fa8c0f210ab3779ca88d517690908ac3aec64b8c063aa7f96a342a467df93\": container with ID starting with 478fa8c0f210ab3779ca88d517690908ac3aec64b8c063aa7f96a342a467df93 not found: ID does not exist" containerID="478fa8c0f210ab3779ca88d517690908ac3aec64b8c063aa7f96a342a467df93" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.647522 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"478fa8c0f210ab3779ca88d517690908ac3aec64b8c063aa7f96a342a467df93"} err="failed to get container status \"478fa8c0f210ab3779ca88d517690908ac3aec64b8c063aa7f96a342a467df93\": rpc error: code = NotFound desc = could not find container \"478fa8c0f210ab3779ca88d517690908ac3aec64b8c063aa7f96a342a467df93\": container with ID starting with 478fa8c0f210ab3779ca88d517690908ac3aec64b8c063aa7f96a342a467df93 not found: ID does not exist" Mar 18 12:34:55 crc kubenswrapper[4975]: I0318 12:34:55.826256 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 12:34:56 crc kubenswrapper[4975]: I0318 12:34:56.211043 4975 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/horizon-c478d4794-x2t7q" podUID="ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:35116->10.217.0.154:8443: read: connection reset by peer" Mar 18 12:34:56 crc kubenswrapper[4975]: I0318 12:34:56.599971 4975 generic.go:334] "Generic (PLEG): container finished" podID="ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7" containerID="085c3bcf8fe1108084d797f0f69756fc886932b4a0540f844652adcd1deb2117" exitCode=0 Mar 18 12:34:56 crc kubenswrapper[4975]: I0318 12:34:56.600028 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c478d4794-x2t7q" event={"ID":"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7","Type":"ContainerDied","Data":"085c3bcf8fe1108084d797f0f69756fc886932b4a0540f844652adcd1deb2117"} Mar 18 12:34:56 crc kubenswrapper[4975]: I0318 12:34:56.601764 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"eedc82f6-6487-41cf-b618-db58be6f1eed","Type":"ContainerStarted","Data":"d88591db68f1d2f8a52454270e68256811850333fd0d2ec89bd371b7a60da039"} Mar 18 12:34:56 crc kubenswrapper[4975]: I0318 12:34:56.680480 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-c478d4794-x2t7q" podUID="ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused" Mar 18 12:34:57 crc kubenswrapper[4975]: I0318 12:34:57.031640 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bae225c-6f49-427e-b276-8b0ffdff525f" path="/var/lib/kubelet/pods/5bae225c-6f49-427e-b276-8b0ffdff525f/volumes" Mar 18 12:34:57 crc kubenswrapper[4975]: I0318 12:34:57.205024 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 12:35:00 crc 
kubenswrapper[4975]: I0318 12:35:00.107485 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-749bf8fbcf-mc9c6"] Mar 18 12:35:00 crc kubenswrapper[4975]: E0318 12:35:00.108396 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bae225c-6f49-427e-b276-8b0ffdff525f" containerName="barbican-api" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.108410 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bae225c-6f49-427e-b276-8b0ffdff525f" containerName="barbican-api" Mar 18 12:35:00 crc kubenswrapper[4975]: E0318 12:35:00.108435 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bae225c-6f49-427e-b276-8b0ffdff525f" containerName="barbican-api-log" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.108443 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bae225c-6f49-427e-b276-8b0ffdff525f" containerName="barbican-api-log" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.108643 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bae225c-6f49-427e-b276-8b0ffdff525f" containerName="barbican-api" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.108671 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bae225c-6f49-427e-b276-8b0ffdff525f" containerName="barbican-api-log" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.109611 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.113285 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.113559 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.113715 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.122044 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-749bf8fbcf-mc9c6"] Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.239249 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f1ae896-bd35-40e2-bd0f-35cf15db5e2d-internal-tls-certs\") pod \"swift-proxy-749bf8fbcf-mc9c6\" (UID: \"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d\") " pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.239333 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0f1ae896-bd35-40e2-bd0f-35cf15db5e2d-etc-swift\") pod \"swift-proxy-749bf8fbcf-mc9c6\" (UID: \"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d\") " pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.239376 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f1ae896-bd35-40e2-bd0f-35cf15db5e2d-public-tls-certs\") pod \"swift-proxy-749bf8fbcf-mc9c6\" (UID: \"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d\") " pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:00 crc kubenswrapper[4975]: 
I0318 12:35:00.240004 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f1ae896-bd35-40e2-bd0f-35cf15db5e2d-log-httpd\") pod \"swift-proxy-749bf8fbcf-mc9c6\" (UID: \"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d\") " pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.240045 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f1ae896-bd35-40e2-bd0f-35cf15db5e2d-run-httpd\") pod \"swift-proxy-749bf8fbcf-mc9c6\" (UID: \"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d\") " pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.240115 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28j8l\" (UniqueName: \"kubernetes.io/projected/0f1ae896-bd35-40e2-bd0f-35cf15db5e2d-kube-api-access-28j8l\") pod \"swift-proxy-749bf8fbcf-mc9c6\" (UID: \"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d\") " pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.240142 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f1ae896-bd35-40e2-bd0f-35cf15db5e2d-config-data\") pod \"swift-proxy-749bf8fbcf-mc9c6\" (UID: \"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d\") " pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.240231 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f1ae896-bd35-40e2-bd0f-35cf15db5e2d-combined-ca-bundle\") pod \"swift-proxy-749bf8fbcf-mc9c6\" (UID: \"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d\") " pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 
12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.342025 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f1ae896-bd35-40e2-bd0f-35cf15db5e2d-internal-tls-certs\") pod \"swift-proxy-749bf8fbcf-mc9c6\" (UID: \"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d\") " pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.342083 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0f1ae896-bd35-40e2-bd0f-35cf15db5e2d-etc-swift\") pod \"swift-proxy-749bf8fbcf-mc9c6\" (UID: \"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d\") " pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.342114 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f1ae896-bd35-40e2-bd0f-35cf15db5e2d-public-tls-certs\") pod \"swift-proxy-749bf8fbcf-mc9c6\" (UID: \"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d\") " pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.342151 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f1ae896-bd35-40e2-bd0f-35cf15db5e2d-log-httpd\") pod \"swift-proxy-749bf8fbcf-mc9c6\" (UID: \"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d\") " pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.342171 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f1ae896-bd35-40e2-bd0f-35cf15db5e2d-run-httpd\") pod \"swift-proxy-749bf8fbcf-mc9c6\" (UID: \"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d\") " pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.342215 4975 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28j8l\" (UniqueName: \"kubernetes.io/projected/0f1ae896-bd35-40e2-bd0f-35cf15db5e2d-kube-api-access-28j8l\") pod \"swift-proxy-749bf8fbcf-mc9c6\" (UID: \"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d\") " pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.342238 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f1ae896-bd35-40e2-bd0f-35cf15db5e2d-config-data\") pod \"swift-proxy-749bf8fbcf-mc9c6\" (UID: \"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d\") " pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.342293 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f1ae896-bd35-40e2-bd0f-35cf15db5e2d-combined-ca-bundle\") pod \"swift-proxy-749bf8fbcf-mc9c6\" (UID: \"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d\") " pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.342595 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f1ae896-bd35-40e2-bd0f-35cf15db5e2d-log-httpd\") pod \"swift-proxy-749bf8fbcf-mc9c6\" (UID: \"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d\") " pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.342723 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f1ae896-bd35-40e2-bd0f-35cf15db5e2d-run-httpd\") pod \"swift-proxy-749bf8fbcf-mc9c6\" (UID: \"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d\") " pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.347424 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f1ae896-bd35-40e2-bd0f-35cf15db5e2d-combined-ca-bundle\") pod \"swift-proxy-749bf8fbcf-mc9c6\" (UID: \"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d\") " pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.348226 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f1ae896-bd35-40e2-bd0f-35cf15db5e2d-internal-tls-certs\") pod \"swift-proxy-749bf8fbcf-mc9c6\" (UID: \"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d\") " pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.348357 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f1ae896-bd35-40e2-bd0f-35cf15db5e2d-config-data\") pod \"swift-proxy-749bf8fbcf-mc9c6\" (UID: \"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d\") " pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.349015 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f1ae896-bd35-40e2-bd0f-35cf15db5e2d-public-tls-certs\") pod \"swift-proxy-749bf8fbcf-mc9c6\" (UID: \"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d\") " pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.351562 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0f1ae896-bd35-40e2-bd0f-35cf15db5e2d-etc-swift\") pod \"swift-proxy-749bf8fbcf-mc9c6\" (UID: \"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d\") " pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.357560 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28j8l\" (UniqueName: 
\"kubernetes.io/projected/0f1ae896-bd35-40e2-bd0f-35cf15db5e2d-kube-api-access-28j8l\") pod \"swift-proxy-749bf8fbcf-mc9c6\" (UID: \"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d\") " pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:00 crc kubenswrapper[4975]: I0318 12:35:00.440838 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:01 crc kubenswrapper[4975]: I0318 12:35:01.039836 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-749bf8fbcf-mc9c6"] Mar 18 12:35:01 crc kubenswrapper[4975]: I0318 12:35:01.447516 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:01 crc kubenswrapper[4975]: I0318 12:35:01.449401 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" containerName="ceilometer-central-agent" containerID="cri-o://7ffc4eeaee0df340d88f01c5618567ead919fe55902f919405a028af2b1fd5f5" gracePeriod=30 Mar 18 12:35:01 crc kubenswrapper[4975]: I0318 12:35:01.449553 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" containerName="proxy-httpd" containerID="cri-o://d676e583a72c57ce4890b890edfb4a46c6d795d8667e9a5c1cefaa2034825a63" gracePeriod=30 Mar 18 12:35:01 crc kubenswrapper[4975]: I0318 12:35:01.449620 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" containerName="sg-core" containerID="cri-o://6a838c075b2bea6c6be343c67c08d11d75024c0fc01e6722a50dff5e4aea1e35" gracePeriod=30 Mar 18 12:35:01 crc kubenswrapper[4975]: I0318 12:35:01.449652 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" 
containerName="ceilometer-notification-agent" containerID="cri-o://5f8d5b3ce07a0b3916b649e9a7c24e65c83acc926b9cacb2a890136d6ee2ae68" gracePeriod=30 Mar 18 12:35:01 crc kubenswrapper[4975]: I0318 12:35:01.462632 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 12:35:01 crc kubenswrapper[4975]: I0318 12:35:01.677085 4975 generic.go:334] "Generic (PLEG): container finished" podID="1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" containerID="6a838c075b2bea6c6be343c67c08d11d75024c0fc01e6722a50dff5e4aea1e35" exitCode=2 Mar 18 12:35:01 crc kubenswrapper[4975]: I0318 12:35:01.677126 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d","Type":"ContainerDied","Data":"6a838c075b2bea6c6be343c67c08d11d75024c0fc01e6722a50dff5e4aea1e35"} Mar 18 12:35:02 crc kubenswrapper[4975]: I0318 12:35:02.693472 4975 generic.go:334] "Generic (PLEG): container finished" podID="1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" containerID="d676e583a72c57ce4890b890edfb4a46c6d795d8667e9a5c1cefaa2034825a63" exitCode=0 Mar 18 12:35:02 crc kubenswrapper[4975]: I0318 12:35:02.693825 4975 generic.go:334] "Generic (PLEG): container finished" podID="1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" containerID="5f8d5b3ce07a0b3916b649e9a7c24e65c83acc926b9cacb2a890136d6ee2ae68" exitCode=0 Mar 18 12:35:02 crc kubenswrapper[4975]: I0318 12:35:02.693836 4975 generic.go:334] "Generic (PLEG): container finished" podID="1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" containerID="7ffc4eeaee0df340d88f01c5618567ead919fe55902f919405a028af2b1fd5f5" exitCode=0 Mar 18 12:35:02 crc kubenswrapper[4975]: I0318 12:35:02.693873 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d","Type":"ContainerDied","Data":"d676e583a72c57ce4890b890edfb4a46c6d795d8667e9a5c1cefaa2034825a63"} Mar 18 12:35:02 crc kubenswrapper[4975]: I0318 12:35:02.693905 4975 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d","Type":"ContainerDied","Data":"5f8d5b3ce07a0b3916b649e9a7c24e65c83acc926b9cacb2a890136d6ee2ae68"} Mar 18 12:35:02 crc kubenswrapper[4975]: I0318 12:35:02.693919 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d","Type":"ContainerDied","Data":"7ffc4eeaee0df340d88f01c5618567ead919fe55902f919405a028af2b1fd5f5"} Mar 18 12:35:03 crc kubenswrapper[4975]: I0318 12:35:03.452931 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.172:3000/\": dial tcp 10.217.0.172:3000: connect: connection refused" Mar 18 12:35:06 crc kubenswrapper[4975]: I0318 12:35:06.680209 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-c478d4794-x2t7q" podUID="ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused" Mar 18 12:35:06 crc kubenswrapper[4975]: I0318 12:35:06.765030 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-749bf8fbcf-mc9c6" event={"ID":"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d","Type":"ContainerStarted","Data":"0b0288712b705e617723328241c9ba9fb3175eeeb0208a5308365ebb0f5c8196"} Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.243958 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-hs8wx"] Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.245642 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-hs8wx" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.261843 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hs8wx"] Mar 18 12:35:07 crc kubenswrapper[4975]: W0318 12:35:07.262837 4975 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1563020d_e0f9_4a33_bb8d_98dfb0ed9f0d.slice/crio-conmon-d676e583a72c57ce4890b890edfb4a46c6d795d8667e9a5c1cefaa2034825a63.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1563020d_e0f9_4a33_bb8d_98dfb0ed9f0d.slice/crio-conmon-d676e583a72c57ce4890b890edfb4a46c6d795d8667e9a5c1cefaa2034825a63.scope: no such file or directory Mar 18 12:35:07 crc kubenswrapper[4975]: W0318 12:35:07.263257 4975 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1563020d_e0f9_4a33_bb8d_98dfb0ed9f0d.slice/crio-d676e583a72c57ce4890b890edfb4a46c6d795d8667e9a5c1cefaa2034825a63.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1563020d_e0f9_4a33_bb8d_98dfb0ed9f0d.slice/crio-d676e583a72c57ce4890b890edfb4a46c6d795d8667e9a5c1cefaa2034825a63.scope: no such file or directory Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.360246 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fnll\" (UniqueName: \"kubernetes.io/projected/3f7b0dc3-44d1-4932-bca8-f4ade944ecbe-kube-api-access-5fnll\") pod \"nova-api-db-create-hs8wx\" (UID: \"3f7b0dc3-44d1-4932-bca8-f4ade944ecbe\") " pod="openstack/nova-api-db-create-hs8wx" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.360314 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f7b0dc3-44d1-4932-bca8-f4ade944ecbe-operator-scripts\") pod \"nova-api-db-create-hs8wx\" (UID: \"3f7b0dc3-44d1-4932-bca8-f4ade944ecbe\") " pod="openstack/nova-api-db-create-hs8wx" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.439299 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="028f5562-456a-4f96-a868-7c2a7aff3d3e" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.171:8776/healthcheck\": dial tcp 10.217.0.171:8776: connect: connection refused" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.453259 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-76hpr"] Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.456278 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-76hpr" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.462008 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fnll\" (UniqueName: \"kubernetes.io/projected/3f7b0dc3-44d1-4932-bca8-f4ade944ecbe-kube-api-access-5fnll\") pod \"nova-api-db-create-hs8wx\" (UID: \"3f7b0dc3-44d1-4932-bca8-f4ade944ecbe\") " pod="openstack/nova-api-db-create-hs8wx" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.462326 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f7b0dc3-44d1-4932-bca8-f4ade944ecbe-operator-scripts\") pod \"nova-api-db-create-hs8wx\" (UID: \"3f7b0dc3-44d1-4932-bca8-f4ade944ecbe\") " pod="openstack/nova-api-db-create-hs8wx" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.463161 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f7b0dc3-44d1-4932-bca8-f4ade944ecbe-operator-scripts\") pod 
\"nova-api-db-create-hs8wx\" (UID: \"3f7b0dc3-44d1-4932-bca8-f4ade944ecbe\") " pod="openstack/nova-api-db-create-hs8wx" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.473914 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-76hpr"] Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.482451 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db13-account-create-update-cmgxb"] Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.483877 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db13-account-create-update-cmgxb" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.487049 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.494743 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fnll\" (UniqueName: \"kubernetes.io/projected/3f7b0dc3-44d1-4932-bca8-f4ade944ecbe-kube-api-access-5fnll\") pod \"nova-api-db-create-hs8wx\" (UID: \"3f7b0dc3-44d1-4932-bca8-f4ade944ecbe\") " pod="openstack/nova-api-db-create-hs8wx" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.508548 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db13-account-create-update-cmgxb"] Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.550140 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-w9tlx"] Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.551830 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-w9tlx" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.557313 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.558909 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-w9tlx"] Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.568294 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb70bc5f-531b-4c58-b84b-e8eb61d81340-operator-scripts\") pod \"nova-cell0-db-create-76hpr\" (UID: \"eb70bc5f-531b-4c58-b84b-e8eb61d81340\") " pod="openstack/nova-cell0-db-create-76hpr" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.568999 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hs8wx" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.569318 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmjp6\" (UniqueName: \"kubernetes.io/projected/eb70bc5f-531b-4c58-b84b-e8eb61d81340-kube-api-access-wmjp6\") pod \"nova-cell0-db-create-76hpr\" (UID: \"eb70bc5f-531b-4c58-b84b-e8eb61d81340\") " pod="openstack/nova-cell0-db-create-76hpr" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.668840 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-c3a7-account-create-update-djb44"] Mar 18 12:35:07 crc kubenswrapper[4975]: E0318 12:35:07.669426 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" containerName="ceilometer-central-agent" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.669451 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" containerName="ceilometer-central-agent" Mar 18 12:35:07 crc kubenswrapper[4975]: E0318 12:35:07.669472 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" 
containerName="sg-core" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.669481 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" containerName="sg-core" Mar 18 12:35:07 crc kubenswrapper[4975]: E0318 12:35:07.669499 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" containerName="proxy-httpd" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.669508 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" containerName="proxy-httpd" Mar 18 12:35:07 crc kubenswrapper[4975]: E0318 12:35:07.669526 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" containerName="ceilometer-notification-agent" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.669533 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" containerName="ceilometer-notification-agent" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.669788 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" containerName="sg-core" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.669811 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" containerName="ceilometer-notification-agent" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.669823 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" containerName="proxy-httpd" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.669843 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" containerName="ceilometer-central-agent" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.670727 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-run-httpd\") pod \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.670907 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-sg-core-conf-yaml\") pod \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.670956 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-combined-ca-bundle\") pod \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.671014 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ktqj\" (UniqueName: \"kubernetes.io/projected/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-kube-api-access-2ktqj\") pod \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.671054 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-config-data\") pod \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.671110 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-log-httpd\") pod \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " Mar 18 12:35:07 crc 
kubenswrapper[4975]: I0318 12:35:07.671161 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-scripts\") pod \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\" (UID: \"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d\") " Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.671411 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7-operator-scripts\") pod \"nova-cell1-db-create-w9tlx\" (UID: \"110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7\") " pod="openstack/nova-cell1-db-create-w9tlx" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.671458 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85qjg\" (UniqueName: \"kubernetes.io/projected/110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7-kube-api-access-85qjg\") pod \"nova-cell1-db-create-w9tlx\" (UID: \"110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7\") " pod="openstack/nova-cell1-db-create-w9tlx" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.671523 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmjp6\" (UniqueName: \"kubernetes.io/projected/eb70bc5f-531b-4c58-b84b-e8eb61d81340-kube-api-access-wmjp6\") pod \"nova-cell0-db-create-76hpr\" (UID: \"eb70bc5f-531b-4c58-b84b-e8eb61d81340\") " pod="openstack/nova-cell0-db-create-76hpr" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.671560 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb70bc5f-531b-4c58-b84b-e8eb61d81340-operator-scripts\") pod \"nova-cell0-db-create-76hpr\" (UID: \"eb70bc5f-531b-4c58-b84b-e8eb61d81340\") " pod="openstack/nova-cell0-db-create-76hpr" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.671688 4975 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfn98\" (UniqueName: \"kubernetes.io/projected/481acd0e-80e8-416a-aec5-361567fd5bc6-kube-api-access-vfn98\") pod \"nova-api-db13-account-create-update-cmgxb\" (UID: \"481acd0e-80e8-416a-aec5-361567fd5bc6\") " pod="openstack/nova-api-db13-account-create-update-cmgxb" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.671722 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/481acd0e-80e8-416a-aec5-361567fd5bc6-operator-scripts\") pod \"nova-api-db13-account-create-update-cmgxb\" (UID: \"481acd0e-80e8-416a-aec5-361567fd5bc6\") " pod="openstack/nova-api-db13-account-create-update-cmgxb" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.672098 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c3a7-account-create-update-djb44" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.675259 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb70bc5f-531b-4c58-b84b-e8eb61d81340-operator-scripts\") pod \"nova-cell0-db-create-76hpr\" (UID: \"eb70bc5f-531b-4c58-b84b-e8eb61d81340\") " pod="openstack/nova-cell0-db-create-76hpr" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.677401 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" (UID: "1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.679040 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.682952 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" (UID: "1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.702416 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c3a7-account-create-update-djb44"] Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.708104 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmjp6\" (UniqueName: \"kubernetes.io/projected/eb70bc5f-531b-4c58-b84b-e8eb61d81340-kube-api-access-wmjp6\") pod \"nova-cell0-db-create-76hpr\" (UID: \"eb70bc5f-531b-4c58-b84b-e8eb61d81340\") " pod="openstack/nova-cell0-db-create-76hpr" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.708314 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-scripts" (OuterVolumeSpecName: "scripts") pod "1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" (UID: "1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.708583 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-kube-api-access-2ktqj" (OuterVolumeSpecName: "kube-api-access-2ktqj") pod "1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" (UID: "1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d"). 
InnerVolumeSpecName "kube-api-access-2ktqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.773466 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85qjg\" (UniqueName: \"kubernetes.io/projected/110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7-kube-api-access-85qjg\") pod \"nova-cell1-db-create-w9tlx\" (UID: \"110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7\") " pod="openstack/nova-cell1-db-create-w9tlx" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.773835 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfn98\" (UniqueName: \"kubernetes.io/projected/481acd0e-80e8-416a-aec5-361567fd5bc6-kube-api-access-vfn98\") pod \"nova-api-db13-account-create-update-cmgxb\" (UID: \"481acd0e-80e8-416a-aec5-361567fd5bc6\") " pod="openstack/nova-api-db13-account-create-update-cmgxb" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.773959 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/481acd0e-80e8-416a-aec5-361567fd5bc6-operator-scripts\") pod \"nova-api-db13-account-create-update-cmgxb\" (UID: \"481acd0e-80e8-416a-aec5-361567fd5bc6\") " pod="openstack/nova-api-db13-account-create-update-cmgxb" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.774068 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rs8l\" (UniqueName: \"kubernetes.io/projected/f3524634-0ee4-4add-a35b-a6ffdee6f1f6-kube-api-access-5rs8l\") pod \"nova-cell0-c3a7-account-create-update-djb44\" (UID: \"f3524634-0ee4-4add-a35b-a6ffdee6f1f6\") " pod="openstack/nova-cell0-c3a7-account-create-update-djb44" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.774158 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f3524634-0ee4-4add-a35b-a6ffdee6f1f6-operator-scripts\") pod \"nova-cell0-c3a7-account-create-update-djb44\" (UID: \"f3524634-0ee4-4add-a35b-a6ffdee6f1f6\") " pod="openstack/nova-cell0-c3a7-account-create-update-djb44" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.774277 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7-operator-scripts\") pod \"nova-cell1-db-create-w9tlx\" (UID: \"110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7\") " pod="openstack/nova-cell1-db-create-w9tlx" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.774448 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.778549 4975 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.778764 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ktqj\" (UniqueName: \"kubernetes.io/projected/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-kube-api-access-2ktqj\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.776628 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/481acd0e-80e8-416a-aec5-361567fd5bc6-operator-scripts\") pod \"nova-api-db13-account-create-update-cmgxb\" (UID: \"481acd0e-80e8-416a-aec5-361567fd5bc6\") " pod="openstack/nova-api-db13-account-create-update-cmgxb" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.778474 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7-operator-scripts\") pod \"nova-cell1-db-create-w9tlx\" (UID: \"110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7\") " pod="openstack/nova-cell1-db-create-w9tlx" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.780414 4975 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.792451 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85qjg\" (UniqueName: \"kubernetes.io/projected/110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7-kube-api-access-85qjg\") pod \"nova-cell1-db-create-w9tlx\" (UID: \"110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7\") " pod="openstack/nova-cell1-db-create-w9tlx" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.798371 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfn98\" (UniqueName: \"kubernetes.io/projected/481acd0e-80e8-416a-aec5-361567fd5bc6-kube-api-access-vfn98\") pod \"nova-api-db13-account-create-update-cmgxb\" (UID: \"481acd0e-80e8-416a-aec5-361567fd5bc6\") " pod="openstack/nova-api-db13-account-create-update-cmgxb" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.798674 4975 generic.go:334] "Generic (PLEG): container finished" podID="028f5562-456a-4f96-a868-7c2a7aff3d3e" containerID="da767122d672519a4fc2d9585ab82059079e90f847478320c934e6490fd50754" exitCode=137 Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.798806 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"028f5562-456a-4f96-a868-7c2a7aff3d3e","Type":"ContainerDied","Data":"da767122d672519a4fc2d9585ab82059079e90f847478320c934e6490fd50754"} Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.819197 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" (UID: "1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.821338 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d","Type":"ContainerDied","Data":"5c075acc505640b0956ed5e4eaa147d08aa7d44358fb35ac701c8e5ec58e3b93"} Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.821823 4975 scope.go:117] "RemoveContainer" containerID="d676e583a72c57ce4890b890edfb4a46c6d795d8667e9a5c1cefaa2034825a63" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.825644 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.831239 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-749bf8fbcf-mc9c6" event={"ID":"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d","Type":"ContainerStarted","Data":"f03af34bd6a1e096048218e0ac3953a49f72aa9f792a08c8f42d319e30ad8d36"} Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.864316 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-d4d4-account-create-update-bzbn7"] Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.866358 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-d4d4-account-create-update-bzbn7" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.874964 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.884587 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-config-data" (OuterVolumeSpecName: "config-data") pod "1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" (UID: "1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.886105 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rs8l\" (UniqueName: \"kubernetes.io/projected/f3524634-0ee4-4add-a35b-a6ffdee6f1f6-kube-api-access-5rs8l\") pod \"nova-cell0-c3a7-account-create-update-djb44\" (UID: \"f3524634-0ee4-4add-a35b-a6ffdee6f1f6\") " pod="openstack/nova-cell0-c3a7-account-create-update-djb44" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.886148 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3524634-0ee4-4add-a35b-a6ffdee6f1f6-operator-scripts\") pod \"nova-cell0-c3a7-account-create-update-djb44\" (UID: \"f3524634-0ee4-4add-a35b-a6ffdee6f1f6\") " pod="openstack/nova-cell0-c3a7-account-create-update-djb44" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.886425 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.886447 4975 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.887313 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3524634-0ee4-4add-a35b-a6ffdee6f1f6-operator-scripts\") pod \"nova-cell0-c3a7-account-create-update-djb44\" (UID: \"f3524634-0ee4-4add-a35b-a6ffdee6f1f6\") " pod="openstack/nova-cell0-c3a7-account-create-update-djb44" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.896897 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-d4d4-account-create-update-bzbn7"] Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.917122 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rs8l\" (UniqueName: \"kubernetes.io/projected/f3524634-0ee4-4add-a35b-a6ffdee6f1f6-kube-api-access-5rs8l\") pod \"nova-cell0-c3a7-account-create-update-djb44\" (UID: \"f3524634-0ee4-4add-a35b-a6ffdee6f1f6\") " pod="openstack/nova-cell0-c3a7-account-create-update-djb44" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.922802 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" (UID: "1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.932622 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db13-account-create-update-cmgxb" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.945163 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-76hpr" Mar 18 12:35:07 crc kubenswrapper[4975]: I0318 12:35:07.969324 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-w9tlx" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.012678 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22cd6088-9800-4d88-b130-fdc6a3dd4e90-operator-scripts\") pod \"nova-cell1-d4d4-account-create-update-bzbn7\" (UID: \"22cd6088-9800-4d88-b130-fdc6a3dd4e90\") " pod="openstack/nova-cell1-d4d4-account-create-update-bzbn7" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.013630 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c3a7-account-create-update-djb44" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.015158 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvzgf\" (UniqueName: \"kubernetes.io/projected/22cd6088-9800-4d88-b130-fdc6a3dd4e90-kube-api-access-zvzgf\") pod \"nova-cell1-d4d4-account-create-update-bzbn7\" (UID: \"22cd6088-9800-4d88-b130-fdc6a3dd4e90\") " pod="openstack/nova-cell1-d4d4-account-create-update-bzbn7" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.038559 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.048213 4975 scope.go:117] "RemoveContainer" containerID="6a838c075b2bea6c6be343c67c08d11d75024c0fc01e6722a50dff5e4aea1e35" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.115126 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.144130 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/028f5562-456a-4f96-a868-7c2a7aff3d3e-config-data\") pod \"028f5562-456a-4f96-a868-7c2a7aff3d3e\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.144244 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/028f5562-456a-4f96-a868-7c2a7aff3d3e-logs\") pod \"028f5562-456a-4f96-a868-7c2a7aff3d3e\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.144302 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzvbl\" (UniqueName: \"kubernetes.io/projected/028f5562-456a-4f96-a868-7c2a7aff3d3e-kube-api-access-vzvbl\") pod \"028f5562-456a-4f96-a868-7c2a7aff3d3e\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.144370 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/028f5562-456a-4f96-a868-7c2a7aff3d3e-config-data-custom\") pod \"028f5562-456a-4f96-a868-7c2a7aff3d3e\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.144493 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/028f5562-456a-4f96-a868-7c2a7aff3d3e-scripts\") pod \"028f5562-456a-4f96-a868-7c2a7aff3d3e\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.144573 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/028f5562-456a-4f96-a868-7c2a7aff3d3e-etc-machine-id\") pod \"028f5562-456a-4f96-a868-7c2a7aff3d3e\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.144629 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/028f5562-456a-4f96-a868-7c2a7aff3d3e-combined-ca-bundle\") pod \"028f5562-456a-4f96-a868-7c2a7aff3d3e\" (UID: \"028f5562-456a-4f96-a868-7c2a7aff3d3e\") " Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.148052 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22cd6088-9800-4d88-b130-fdc6a3dd4e90-operator-scripts\") pod \"nova-cell1-d4d4-account-create-update-bzbn7\" (UID: \"22cd6088-9800-4d88-b130-fdc6a3dd4e90\") " pod="openstack/nova-cell1-d4d4-account-create-update-bzbn7" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.148469 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvzgf\" (UniqueName: \"kubernetes.io/projected/22cd6088-9800-4d88-b130-fdc6a3dd4e90-kube-api-access-zvzgf\") pod \"nova-cell1-d4d4-account-create-update-bzbn7\" (UID: \"22cd6088-9800-4d88-b130-fdc6a3dd4e90\") " pod="openstack/nova-cell1-d4d4-account-create-update-bzbn7" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.150462 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22cd6088-9800-4d88-b130-fdc6a3dd4e90-operator-scripts\") pod \"nova-cell1-d4d4-account-create-update-bzbn7\" (UID: \"22cd6088-9800-4d88-b130-fdc6a3dd4e90\") " pod="openstack/nova-cell1-d4d4-account-create-update-bzbn7" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.150641 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/028f5562-456a-4f96-a868-7c2a7aff3d3e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "028f5562-456a-4f96-a868-7c2a7aff3d3e" (UID: "028f5562-456a-4f96-a868-7c2a7aff3d3e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.150961 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/028f5562-456a-4f96-a868-7c2a7aff3d3e-logs" (OuterVolumeSpecName: "logs") pod "028f5562-456a-4f96-a868-7c2a7aff3d3e" (UID: "028f5562-456a-4f96-a868-7c2a7aff3d3e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.166829 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/028f5562-456a-4f96-a868-7c2a7aff3d3e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "028f5562-456a-4f96-a868-7c2a7aff3d3e" (UID: "028f5562-456a-4f96-a868-7c2a7aff3d3e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.172163 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/028f5562-456a-4f96-a868-7c2a7aff3d3e-kube-api-access-vzvbl" (OuterVolumeSpecName: "kube-api-access-vzvbl") pod "028f5562-456a-4f96-a868-7c2a7aff3d3e" (UID: "028f5562-456a-4f96-a868-7c2a7aff3d3e"). InnerVolumeSpecName "kube-api-access-vzvbl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.177398 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvzgf\" (UniqueName: \"kubernetes.io/projected/22cd6088-9800-4d88-b130-fdc6a3dd4e90-kube-api-access-zvzgf\") pod \"nova-cell1-d4d4-account-create-update-bzbn7\" (UID: \"22cd6088-9800-4d88-b130-fdc6a3dd4e90\") " pod="openstack/nova-cell1-d4d4-account-create-update-bzbn7" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.178031 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/028f5562-456a-4f96-a868-7c2a7aff3d3e-scripts" (OuterVolumeSpecName: "scripts") pod "028f5562-456a-4f96-a868-7c2a7aff3d3e" (UID: "028f5562-456a-4f96-a868-7c2a7aff3d3e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.180822 4975 scope.go:117] "RemoveContainer" containerID="5f8d5b3ce07a0b3916b649e9a7c24e65c83acc926b9cacb2a890136d6ee2ae68" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.200548 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.217100 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/028f5562-456a-4f96-a868-7c2a7aff3d3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "028f5562-456a-4f96-a868-7c2a7aff3d3e" (UID: "028f5562-456a-4f96-a868-7c2a7aff3d3e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.230796 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.250547 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.250686 4975 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/028f5562-456a-4f96-a868-7c2a7aff3d3e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.250719 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/028f5562-456a-4f96-a868-7c2a7aff3d3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.250732 4975 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/028f5562-456a-4f96-a868-7c2a7aff3d3e-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.250743 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzvbl\" (UniqueName: \"kubernetes.io/projected/028f5562-456a-4f96-a868-7c2a7aff3d3e-kube-api-access-vzvbl\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.250758 4975 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/028f5562-456a-4f96-a868-7c2a7aff3d3e-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.250768 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/028f5562-456a-4f96-a868-7c2a7aff3d3e-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:08 crc kubenswrapper[4975]: E0318 
12:35:08.251031 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="028f5562-456a-4f96-a868-7c2a7aff3d3e" containerName="cinder-api" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.251047 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="028f5562-456a-4f96-a868-7c2a7aff3d3e" containerName="cinder-api" Mar 18 12:35:08 crc kubenswrapper[4975]: E0318 12:35:08.251058 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="028f5562-456a-4f96-a868-7c2a7aff3d3e" containerName="cinder-api-log" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.251064 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="028f5562-456a-4f96-a868-7c2a7aff3d3e" containerName="cinder-api-log" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.252101 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/028f5562-456a-4f96-a868-7c2a7aff3d3e-config-data" (OuterVolumeSpecName: "config-data") pod "028f5562-456a-4f96-a868-7c2a7aff3d3e" (UID: "028f5562-456a-4f96-a868-7c2a7aff3d3e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.252279 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="028f5562-456a-4f96-a868-7c2a7aff3d3e" containerName="cinder-api-log" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.252358 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="028f5562-456a-4f96-a868-7c2a7aff3d3e" containerName="cinder-api" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.255542 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.258069 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.261475 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.287147 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.298331 4975 scope.go:117] "RemoveContainer" containerID="7ffc4eeaee0df340d88f01c5618567ead919fe55902f919405a028af2b1fd5f5" Mar 18 12:35:08 crc kubenswrapper[4975]: W0318 12:35:08.315212 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f7b0dc3_44d1_4932_bca8_f4ade944ecbe.slice/crio-1f4928bde6ccd56a3ed24f90252729b1ea94811406f43bae062cb2350a87070e WatchSource:0}: Error finding container 1f4928bde6ccd56a3ed24f90252729b1ea94811406f43bae062cb2350a87070e: Status 404 returned error can't find the container with id 1f4928bde6ccd56a3ed24f90252729b1ea94811406f43bae062cb2350a87070e Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.329282 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hs8wx"] Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.335854 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-d4d4-account-create-update-bzbn7" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.352198 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f53e0412-9b5d-4f27-b90b-5b44d02d419b-scripts\") pod \"ceilometer-0\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " pod="openstack/ceilometer-0" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.352491 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f53e0412-9b5d-4f27-b90b-5b44d02d419b-log-httpd\") pod \"ceilometer-0\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " pod="openstack/ceilometer-0" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.352587 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmtfj\" (UniqueName: \"kubernetes.io/projected/f53e0412-9b5d-4f27-b90b-5b44d02d419b-kube-api-access-xmtfj\") pod \"ceilometer-0\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " pod="openstack/ceilometer-0" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.352713 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f53e0412-9b5d-4f27-b90b-5b44d02d419b-config-data\") pod \"ceilometer-0\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " pod="openstack/ceilometer-0" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.352793 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f53e0412-9b5d-4f27-b90b-5b44d02d419b-run-httpd\") pod \"ceilometer-0\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " pod="openstack/ceilometer-0" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.352899 
4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f53e0412-9b5d-4f27-b90b-5b44d02d419b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " pod="openstack/ceilometer-0" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.353838 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f53e0412-9b5d-4f27-b90b-5b44d02d419b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " pod="openstack/ceilometer-0" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.354243 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/028f5562-456a-4f96-a868-7c2a7aff3d3e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.456263 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f53e0412-9b5d-4f27-b90b-5b44d02d419b-log-httpd\") pod \"ceilometer-0\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " pod="openstack/ceilometer-0" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.456314 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmtfj\" (UniqueName: \"kubernetes.io/projected/f53e0412-9b5d-4f27-b90b-5b44d02d419b-kube-api-access-xmtfj\") pod \"ceilometer-0\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " pod="openstack/ceilometer-0" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.456359 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f53e0412-9b5d-4f27-b90b-5b44d02d419b-config-data\") pod \"ceilometer-0\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " 
pod="openstack/ceilometer-0" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.456390 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f53e0412-9b5d-4f27-b90b-5b44d02d419b-run-httpd\") pod \"ceilometer-0\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " pod="openstack/ceilometer-0" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.456425 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f53e0412-9b5d-4f27-b90b-5b44d02d419b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " pod="openstack/ceilometer-0" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.456451 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f53e0412-9b5d-4f27-b90b-5b44d02d419b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " pod="openstack/ceilometer-0" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.456518 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f53e0412-9b5d-4f27-b90b-5b44d02d419b-scripts\") pod \"ceilometer-0\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " pod="openstack/ceilometer-0" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.457413 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f53e0412-9b5d-4f27-b90b-5b44d02d419b-run-httpd\") pod \"ceilometer-0\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " pod="openstack/ceilometer-0" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.458486 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f53e0412-9b5d-4f27-b90b-5b44d02d419b-log-httpd\") pod \"ceilometer-0\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " pod="openstack/ceilometer-0" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.462165 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f53e0412-9b5d-4f27-b90b-5b44d02d419b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " pod="openstack/ceilometer-0" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.462885 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f53e0412-9b5d-4f27-b90b-5b44d02d419b-scripts\") pod \"ceilometer-0\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " pod="openstack/ceilometer-0" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.463421 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f53e0412-9b5d-4f27-b90b-5b44d02d419b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " pod="openstack/ceilometer-0" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.468034 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f53e0412-9b5d-4f27-b90b-5b44d02d419b-config-data\") pod \"ceilometer-0\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " pod="openstack/ceilometer-0" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.478802 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmtfj\" (UniqueName: \"kubernetes.io/projected/f53e0412-9b5d-4f27-b90b-5b44d02d419b-kube-api-access-xmtfj\") pod \"ceilometer-0\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " pod="openstack/ceilometer-0" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.605630 4975 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.780545 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-76hpr"] Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.802799 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c3a7-account-create-update-djb44"] Mar 18 12:35:08 crc kubenswrapper[4975]: W0318 12:35:08.825311 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3524634_0ee4_4add_a35b_a6ffdee6f1f6.slice/crio-217ec70c8b0b4cc913c2e64b2653c9f52a11ecbc5a6c56228d6c41eed542b924 WatchSource:0}: Error finding container 217ec70c8b0b4cc913c2e64b2653c9f52a11ecbc5a6c56228d6c41eed542b924: Status 404 returned error can't find the container with id 217ec70c8b0b4cc913c2e64b2653c9f52a11ecbc5a6c56228d6c41eed542b924 Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.853898 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hs8wx" event={"ID":"3f7b0dc3-44d1-4932-bca8-f4ade944ecbe","Type":"ContainerStarted","Data":"1f4928bde6ccd56a3ed24f90252729b1ea94811406f43bae062cb2350a87070e"} Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.855392 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c3a7-account-create-update-djb44" event={"ID":"f3524634-0ee4-4add-a35b-a6ffdee6f1f6","Type":"ContainerStarted","Data":"217ec70c8b0b4cc913c2e64b2653c9f52a11ecbc5a6c56228d6c41eed542b924"} Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.857322 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"eedc82f6-6487-41cf-b618-db58be6f1eed","Type":"ContainerStarted","Data":"948525e98c2661c1dcf6c455b8fcdf68d82276aa2320ff8b82ed1b1993509119"} Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.860860 4975 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-76hpr" event={"ID":"eb70bc5f-531b-4c58-b84b-e8eb61d81340","Type":"ContainerStarted","Data":"a728a50790a0d46f7100de5b5b1ff28ac0c4bf7b3bacc5a4cc70dd9922c513ef"} Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.881913 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"028f5562-456a-4f96-a868-7c2a7aff3d3e","Type":"ContainerDied","Data":"ab0f6b4129e8c91eb64ce36bf2976c287373c5e18d69d50c2af409a47bbbbb3c"} Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.881958 4975 scope.go:117] "RemoveContainer" containerID="da767122d672519a4fc2d9585ab82059079e90f847478320c934e6490fd50754" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.882058 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.928147 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db13-account-create-update-cmgxb"] Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.967549 4975 scope.go:117] "RemoveContainer" containerID="7b91981ea53efd61f45f259d4fbd9b74f79839543a2c37dec8fa396a1f9f9910" Mar 18 12:35:08 crc kubenswrapper[4975]: I0318 12:35:08.984902 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.003789 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.043168 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="028f5562-456a-4f96-a868-7c2a7aff3d3e" path="/var/lib/kubelet/pods/028f5562-456a-4f96-a868-7c2a7aff3d3e/volumes" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.044138 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d" 
path="/var/lib/kubelet/pods/1563020d-e0f9-4a33-bb8d-98dfb0ed9f0d/volumes" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.045308 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-w9tlx"] Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.062135 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.063824 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.066536 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.066890 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.078784 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.083035 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.097094 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-d4d4-account-create-update-bzbn7"] Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.178000 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.178042 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.178074 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.178132 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.178151 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-logs\") pod \"cinder-api-0\" (UID: \"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.178204 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-scripts\") pod \"cinder-api-0\" (UID: \"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.178228 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppl6j\" (UniqueName: \"kubernetes.io/projected/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-kube-api-access-ppl6j\") pod \"cinder-api-0\" (UID: 
\"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.178422 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-config-data\") pod \"cinder-api-0\" (UID: \"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.178487 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-config-data-custom\") pod \"cinder-api-0\" (UID: \"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.253271 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.280483 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.280530 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.280563 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.280599 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.280620 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-logs\") pod \"cinder-api-0\" (UID: \"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.280702 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-scripts\") pod \"cinder-api-0\" (UID: \"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.280736 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppl6j\" (UniqueName: \"kubernetes.io/projected/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-kube-api-access-ppl6j\") pod \"cinder-api-0\" (UID: \"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.280798 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-config-data\") pod \"cinder-api-0\" (UID: \"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.280820 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-config-data-custom\") pod \"cinder-api-0\" (UID: \"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.281429 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.282041 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-logs\") pod \"cinder-api-0\" (UID: \"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.290794 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.291019 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-config-data\") pod \"cinder-api-0\" (UID: \"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.293436 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-scripts\") pod \"cinder-api-0\" (UID: \"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.293448 4975 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.293960 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-config-data-custom\") pod \"cinder-api-0\" (UID: \"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.297273 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.301338 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppl6j\" (UniqueName: \"kubernetes.io/projected/9f32d1b3-ee20-4ad3-a943-a83c87014cd0-kube-api-access-ppl6j\") pod \"cinder-api-0\" (UID: \"9f32d1b3-ee20-4ad3-a943-a83c87014cd0\") " pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: W0318 12:35:09.357355 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf53e0412_9b5d_4f27_b90b_5b44d02d419b.slice/crio-fbec5e2d22a9bc37f9d2b32114f1de9aee694031b37f787d39e0d97efc709237 WatchSource:0}: Error finding container fbec5e2d22a9bc37f9d2b32114f1de9aee694031b37f787d39e0d97efc709237: Status 404 returned error can't find the container with id fbec5e2d22a9bc37f9d2b32114f1de9aee694031b37f787d39e0d97efc709237 Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.358996 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.400700 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.834756 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-67fdc8889-2cm4h" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.900269 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.934289 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-749bf8fbcf-mc9c6" event={"ID":"0f1ae896-bd35-40e2-bd0f-35cf15db5e2d","Type":"ContainerStarted","Data":"83d90e2626ce9f2c79f3d8bdbabeb94e23e5ea53d55d3ae2741146e4aa7d82e2"} Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.951967 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.952094 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.952109 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w9tlx" event={"ID":"110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7","Type":"ContainerStarted","Data":"34ebf7676cccf02b56a0c82eb244d1f089a7da2bb1ba8030bd159c48fc00ebde"} Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.952140 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5899b99ff6-cwt84"] Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.952165 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w9tlx" event={"ID":"110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7","Type":"ContainerStarted","Data":"9ede004d37e6be0c5fe5adf8305086a90b5c43ed82ad30ef92d763f1aa12506f"} Mar 18 12:35:09 crc 
kubenswrapper[4975]: I0318 12:35:09.952178 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db13-account-create-update-cmgxb" event={"ID":"481acd0e-80e8-416a-aec5-361567fd5bc6","Type":"ContainerStarted","Data":"09198912bf9136533c9e0e45a2bf948ca2e18c60bff7f983503edb4275f2f94e"} Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.952190 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db13-account-create-update-cmgxb" event={"ID":"481acd0e-80e8-416a-aec5-361567fd5bc6","Type":"ContainerStarted","Data":"3ccd85d8bdcd1a3b5a42a441f947551a3f85e0acef25e04c7e9a021df14dd12e"} Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.952571 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5899b99ff6-cwt84" podUID="48f1daf8-3604-40e4-9e41-e9025c083c7d" containerName="neutron-httpd" containerID="cri-o://54d62d06308503ce21e7991ebc2c1733e6936b83aaf6e8787b677c240f317b21" gracePeriod=30 Mar 18 12:35:09 crc kubenswrapper[4975]: I0318 12:35:09.952733 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5899b99ff6-cwt84" podUID="48f1daf8-3604-40e4-9e41-e9025c083c7d" containerName="neutron-api" containerID="cri-o://589ca94554a2c2a06045888f589afc811ca3abd6286fee87ebd2dc6e376739cf" gracePeriod=30 Mar 18 12:35:10 crc kubenswrapper[4975]: I0318 12:35:10.001608 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-76hpr" event={"ID":"eb70bc5f-531b-4c58-b84b-e8eb61d81340","Type":"ContainerStarted","Data":"abcded235c136b47c567608e1cfb19a7e24f9654c55e9aae98d0df6fe1079853"} Mar 18 12:35:10 crc kubenswrapper[4975]: I0318 12:35:10.027939 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d4d4-account-create-update-bzbn7" event={"ID":"22cd6088-9800-4d88-b130-fdc6a3dd4e90","Type":"ContainerStarted","Data":"c24bc886fe76fee861d9e3784ac38bf79719db5373039449e0204aa012969f6a"} Mar 18 
12:35:10 crc kubenswrapper[4975]: I0318 12:35:10.027978 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d4d4-account-create-update-bzbn7" event={"ID":"22cd6088-9800-4d88-b130-fdc6a3dd4e90","Type":"ContainerStarted","Data":"8c15543eb2ebc55e4ddbea7b3373a8b92a9dbbf543bdff5d3cfd5590d553db82"} Mar 18 12:35:10 crc kubenswrapper[4975]: I0318 12:35:10.034560 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f53e0412-9b5d-4f27-b90b-5b44d02d419b","Type":"ContainerStarted","Data":"fbec5e2d22a9bc37f9d2b32114f1de9aee694031b37f787d39e0d97efc709237"} Mar 18 12:35:10 crc kubenswrapper[4975]: I0318 12:35:10.054229 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-749bf8fbcf-mc9c6" podStartSLOduration=10.054199767 podStartE2EDuration="10.054199767s" podCreationTimestamp="2026-03-18 12:35:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:35:10.02453934 +0000 UTC m=+1495.738939929" watchObservedRunningTime="2026-03-18 12:35:10.054199767 +0000 UTC m=+1495.768600346" Mar 18 12:35:10 crc kubenswrapper[4975]: I0318 12:35:10.055957 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hs8wx" event={"ID":"3f7b0dc3-44d1-4932-bca8-f4ade944ecbe","Type":"ContainerStarted","Data":"359ca5ac16a74f6c2cba2b24070a22d712744b29ede7976f9ebfac0912a6a95b"} Mar 18 12:35:10 crc kubenswrapper[4975]: I0318 12:35:10.061355 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c3a7-account-create-update-djb44" event={"ID":"f3524634-0ee4-4add-a35b-a6ffdee6f1f6","Type":"ContainerStarted","Data":"d862336df0d2add7d6b7168aa9378d05c115f93f4389fe2f6e696f0b7b82996c"} Mar 18 12:35:10 crc kubenswrapper[4975]: I0318 12:35:10.066105 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell0-db-create-76hpr" podStartSLOduration=3.066082004 podStartE2EDuration="3.066082004s" podCreationTimestamp="2026-03-18 12:35:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:35:10.054971148 +0000 UTC m=+1495.769371727" watchObservedRunningTime="2026-03-18 12:35:10.066082004 +0000 UTC m=+1495.780482583" Mar 18 12:35:10 crc kubenswrapper[4975]: I0318 12:35:10.105591 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db13-account-create-update-cmgxb" podStartSLOduration=3.105571622 podStartE2EDuration="3.105571622s" podCreationTimestamp="2026-03-18 12:35:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:35:10.092281206 +0000 UTC m=+1495.806681785" watchObservedRunningTime="2026-03-18 12:35:10.105571622 +0000 UTC m=+1495.819972201" Mar 18 12:35:10 crc kubenswrapper[4975]: I0318 12:35:10.124259 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-w9tlx" podStartSLOduration=3.124233626 podStartE2EDuration="3.124233626s" podCreationTimestamp="2026-03-18 12:35:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:35:10.117590493 +0000 UTC m=+1495.831991072" watchObservedRunningTime="2026-03-18 12:35:10.124233626 +0000 UTC m=+1495.838634205" Mar 18 12:35:10 crc kubenswrapper[4975]: I0318 12:35:10.162720 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=4.298813659 podStartE2EDuration="16.162698525s" podCreationTimestamp="2026-03-18 12:34:54 +0000 UTC" firstStartedPulling="2026-03-18 12:34:55.834675542 +0000 UTC m=+1481.549076121" lastFinishedPulling="2026-03-18 
12:35:07.698560398 +0000 UTC m=+1493.412960987" observedRunningTime="2026-03-18 12:35:10.137657916 +0000 UTC m=+1495.852058505" watchObservedRunningTime="2026-03-18 12:35:10.162698525 +0000 UTC m=+1495.877099104" Mar 18 12:35:10 crc kubenswrapper[4975]: I0318 12:35:10.185646 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-hs8wx" podStartSLOduration=3.185630967 podStartE2EDuration="3.185630967s" podCreationTimestamp="2026-03-18 12:35:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:35:10.155782745 +0000 UTC m=+1495.870183334" watchObservedRunningTime="2026-03-18 12:35:10.185630967 +0000 UTC m=+1495.900031546" Mar 18 12:35:10 crc kubenswrapper[4975]: I0318 12:35:10.205821 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-c3a7-account-create-update-djb44" podStartSLOduration=3.205798592 podStartE2EDuration="3.205798592s" podCreationTimestamp="2026-03-18 12:35:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:35:10.184858916 +0000 UTC m=+1495.899259495" watchObservedRunningTime="2026-03-18 12:35:10.205798592 +0000 UTC m=+1495.920199181" Mar 18 12:35:10 crc kubenswrapper[4975]: I0318 12:35:10.218051 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-d4d4-account-create-update-bzbn7" podStartSLOduration=3.218011379 podStartE2EDuration="3.218011379s" podCreationTimestamp="2026-03-18 12:35:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:35:10.200720713 +0000 UTC m=+1495.915121292" watchObservedRunningTime="2026-03-18 12:35:10.218011379 +0000 UTC m=+1495.932411958" Mar 18 12:35:11 crc kubenswrapper[4975]: I0318 
12:35:11.071148 4975 generic.go:334] "Generic (PLEG): container finished" podID="eb70bc5f-531b-4c58-b84b-e8eb61d81340" containerID="abcded235c136b47c567608e1cfb19a7e24f9654c55e9aae98d0df6fe1079853" exitCode=0 Mar 18 12:35:11 crc kubenswrapper[4975]: I0318 12:35:11.071499 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-76hpr" event={"ID":"eb70bc5f-531b-4c58-b84b-e8eb61d81340","Type":"ContainerDied","Data":"abcded235c136b47c567608e1cfb19a7e24f9654c55e9aae98d0df6fe1079853"} Mar 18 12:35:11 crc kubenswrapper[4975]: I0318 12:35:11.079668 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9f32d1b3-ee20-4ad3-a943-a83c87014cd0","Type":"ContainerStarted","Data":"287fdffa75cbfd308639d0a6fd31a668bf57362c1ec78d4dea532a7b2ccaaedc"} Mar 18 12:35:11 crc kubenswrapper[4975]: I0318 12:35:11.085490 4975 generic.go:334] "Generic (PLEG): container finished" podID="110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7" containerID="34ebf7676cccf02b56a0c82eb244d1f089a7da2bb1ba8030bd159c48fc00ebde" exitCode=0 Mar 18 12:35:11 crc kubenswrapper[4975]: I0318 12:35:11.085672 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w9tlx" event={"ID":"110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7","Type":"ContainerDied","Data":"34ebf7676cccf02b56a0c82eb244d1f089a7da2bb1ba8030bd159c48fc00ebde"} Mar 18 12:35:11 crc kubenswrapper[4975]: I0318 12:35:11.089999 4975 generic.go:334] "Generic (PLEG): container finished" podID="48f1daf8-3604-40e4-9e41-e9025c083c7d" containerID="54d62d06308503ce21e7991ebc2c1733e6936b83aaf6e8787b677c240f317b21" exitCode=0 Mar 18 12:35:11 crc kubenswrapper[4975]: I0318 12:35:11.090121 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5899b99ff6-cwt84" event={"ID":"48f1daf8-3604-40e4-9e41-e9025c083c7d","Type":"ContainerDied","Data":"54d62d06308503ce21e7991ebc2c1733e6936b83aaf6e8787b677c240f317b21"} Mar 18 12:35:11 crc kubenswrapper[4975]: I0318 
12:35:11.100083 4975 generic.go:334] "Generic (PLEG): container finished" podID="3f7b0dc3-44d1-4932-bca8-f4ade944ecbe" containerID="359ca5ac16a74f6c2cba2b24070a22d712744b29ede7976f9ebfac0912a6a95b" exitCode=0 Mar 18 12:35:11 crc kubenswrapper[4975]: I0318 12:35:11.100407 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hs8wx" event={"ID":"3f7b0dc3-44d1-4932-bca8-f4ade944ecbe","Type":"ContainerDied","Data":"359ca5ac16a74f6c2cba2b24070a22d712744b29ede7976f9ebfac0912a6a95b"} Mar 18 12:35:11 crc kubenswrapper[4975]: I0318 12:35:11.127133 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-749bf8fbcf-mc9c6" podUID="0f1ae896-bd35-40e2-bd0f-35cf15db5e2d" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 12:35:12 crc kubenswrapper[4975]: I0318 12:35:12.126298 4975 generic.go:334] "Generic (PLEG): container finished" podID="481acd0e-80e8-416a-aec5-361567fd5bc6" containerID="09198912bf9136533c9e0e45a2bf948ca2e18c60bff7f983503edb4275f2f94e" exitCode=0 Mar 18 12:35:12 crc kubenswrapper[4975]: I0318 12:35:12.126709 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db13-account-create-update-cmgxb" event={"ID":"481acd0e-80e8-416a-aec5-361567fd5bc6","Type":"ContainerDied","Data":"09198912bf9136533c9e0e45a2bf948ca2e18c60bff7f983503edb4275f2f94e"} Mar 18 12:35:12 crc kubenswrapper[4975]: I0318 12:35:12.132678 4975 generic.go:334] "Generic (PLEG): container finished" podID="f3524634-0ee4-4add-a35b-a6ffdee6f1f6" containerID="d862336df0d2add7d6b7168aa9378d05c115f93f4389fe2f6e696f0b7b82996c" exitCode=0 Mar 18 12:35:12 crc kubenswrapper[4975]: I0318 12:35:12.132760 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c3a7-account-create-update-djb44" event={"ID":"f3524634-0ee4-4add-a35b-a6ffdee6f1f6","Type":"ContainerDied","Data":"d862336df0d2add7d6b7168aa9378d05c115f93f4389fe2f6e696f0b7b82996c"} Mar 18 
12:35:12 crc kubenswrapper[4975]: I0318 12:35:12.135524 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9f32d1b3-ee20-4ad3-a943-a83c87014cd0","Type":"ContainerStarted","Data":"9a8507bf0c2b3d4b4832ccf01b8e568005c2f0e38e6eda3cc801e19cc753dd56"} Mar 18 12:35:12 crc kubenswrapper[4975]: I0318 12:35:12.135615 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9f32d1b3-ee20-4ad3-a943-a83c87014cd0","Type":"ContainerStarted","Data":"38752c886c5d5753dfd2070a125ec53f34e9ab915b34a98155bbd81fdb1fd3ea"} Mar 18 12:35:12 crc kubenswrapper[4975]: I0318 12:35:12.139574 4975 generic.go:334] "Generic (PLEG): container finished" podID="22cd6088-9800-4d88-b130-fdc6a3dd4e90" containerID="c24bc886fe76fee861d9e3784ac38bf79719db5373039449e0204aa012969f6a" exitCode=0 Mar 18 12:35:12 crc kubenswrapper[4975]: I0318 12:35:12.139644 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d4d4-account-create-update-bzbn7" event={"ID":"22cd6088-9800-4d88-b130-fdc6a3dd4e90","Type":"ContainerDied","Data":"c24bc886fe76fee861d9e3784ac38bf79719db5373039449e0204aa012969f6a"} Mar 18 12:35:12 crc kubenswrapper[4975]: I0318 12:35:12.148537 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f53e0412-9b5d-4f27-b90b-5b44d02d419b","Type":"ContainerStarted","Data":"af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0"} Mar 18 12:35:12 crc kubenswrapper[4975]: I0318 12:35:12.165223 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:12 crc kubenswrapper[4975]: I0318 12:35:12.619694 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-w9tlx" Mar 18 12:35:12 crc kubenswrapper[4975]: I0318 12:35:12.733995 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85qjg\" (UniqueName: \"kubernetes.io/projected/110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7-kube-api-access-85qjg\") pod \"110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7\" (UID: \"110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7\") " Mar 18 12:35:12 crc kubenswrapper[4975]: I0318 12:35:12.734163 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7-operator-scripts\") pod \"110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7\" (UID: \"110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7\") " Mar 18 12:35:12 crc kubenswrapper[4975]: I0318 12:35:12.735618 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7" (UID: "110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:12 crc kubenswrapper[4975]: I0318 12:35:12.744091 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7-kube-api-access-85qjg" (OuterVolumeSpecName: "kube-api-access-85qjg") pod "110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7" (UID: "110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7"). InnerVolumeSpecName "kube-api-access-85qjg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:12 crc kubenswrapper[4975]: I0318 12:35:12.836697 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85qjg\" (UniqueName: \"kubernetes.io/projected/110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7-kube-api-access-85qjg\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:12 crc kubenswrapper[4975]: I0318 12:35:12.836734 4975 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:12 crc kubenswrapper[4975]: I0318 12:35:12.969288 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hs8wx" Mar 18 12:35:12 crc kubenswrapper[4975]: I0318 12:35:12.976154 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-76hpr" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.040664 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmjp6\" (UniqueName: \"kubernetes.io/projected/eb70bc5f-531b-4c58-b84b-e8eb61d81340-kube-api-access-wmjp6\") pod \"eb70bc5f-531b-4c58-b84b-e8eb61d81340\" (UID: \"eb70bc5f-531b-4c58-b84b-e8eb61d81340\") " Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.040807 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fnll\" (UniqueName: \"kubernetes.io/projected/3f7b0dc3-44d1-4932-bca8-f4ade944ecbe-kube-api-access-5fnll\") pod \"3f7b0dc3-44d1-4932-bca8-f4ade944ecbe\" (UID: \"3f7b0dc3-44d1-4932-bca8-f4ade944ecbe\") " Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.040936 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb70bc5f-531b-4c58-b84b-e8eb61d81340-operator-scripts\") pod 
\"eb70bc5f-531b-4c58-b84b-e8eb61d81340\" (UID: \"eb70bc5f-531b-4c58-b84b-e8eb61d81340\") " Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.041011 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f7b0dc3-44d1-4932-bca8-f4ade944ecbe-operator-scripts\") pod \"3f7b0dc3-44d1-4932-bca8-f4ade944ecbe\" (UID: \"3f7b0dc3-44d1-4932-bca8-f4ade944ecbe\") " Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.041819 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f7b0dc3-44d1-4932-bca8-f4ade944ecbe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3f7b0dc3-44d1-4932-bca8-f4ade944ecbe" (UID: "3f7b0dc3-44d1-4932-bca8-f4ade944ecbe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.042394 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb70bc5f-531b-4c58-b84b-e8eb61d81340-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eb70bc5f-531b-4c58-b84b-e8eb61d81340" (UID: "eb70bc5f-531b-4c58-b84b-e8eb61d81340"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.048037 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb70bc5f-531b-4c58-b84b-e8eb61d81340-kube-api-access-wmjp6" (OuterVolumeSpecName: "kube-api-access-wmjp6") pod "eb70bc5f-531b-4c58-b84b-e8eb61d81340" (UID: "eb70bc5f-531b-4c58-b84b-e8eb61d81340"). InnerVolumeSpecName "kube-api-access-wmjp6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.048085 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f7b0dc3-44d1-4932-bca8-f4ade944ecbe-kube-api-access-5fnll" (OuterVolumeSpecName: "kube-api-access-5fnll") pod "3f7b0dc3-44d1-4932-bca8-f4ade944ecbe" (UID: "3f7b0dc3-44d1-4932-bca8-f4ade944ecbe"). InnerVolumeSpecName "kube-api-access-5fnll". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.145833 4975 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb70bc5f-531b-4c58-b84b-e8eb61d81340-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.145907 4975 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f7b0dc3-44d1-4932-bca8-f4ade944ecbe-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.145922 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmjp6\" (UniqueName: \"kubernetes.io/projected/eb70bc5f-531b-4c58-b84b-e8eb61d81340-kube-api-access-wmjp6\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.145937 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fnll\" (UniqueName: \"kubernetes.io/projected/3f7b0dc3-44d1-4932-bca8-f4ade944ecbe-kube-api-access-5fnll\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.162190 4975 generic.go:334] "Generic (PLEG): container finished" podID="48f1daf8-3604-40e4-9e41-e9025c083c7d" containerID="589ca94554a2c2a06045888f589afc811ca3abd6286fee87ebd2dc6e376739cf" exitCode=0 Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.162245 4975 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/neutron-5899b99ff6-cwt84" event={"ID":"48f1daf8-3604-40e4-9e41-e9025c083c7d","Type":"ContainerDied","Data":"589ca94554a2c2a06045888f589afc811ca3abd6286fee87ebd2dc6e376739cf"} Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.164023 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hs8wx" event={"ID":"3f7b0dc3-44d1-4932-bca8-f4ade944ecbe","Type":"ContainerDied","Data":"1f4928bde6ccd56a3ed24f90252729b1ea94811406f43bae062cb2350a87070e"} Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.164050 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f4928bde6ccd56a3ed24f90252729b1ea94811406f43bae062cb2350a87070e" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.164112 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hs8wx" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.177056 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-76hpr" event={"ID":"eb70bc5f-531b-4c58-b84b-e8eb61d81340","Type":"ContainerDied","Data":"a728a50790a0d46f7100de5b5b1ff28ac0c4bf7b3bacc5a4cc70dd9922c513ef"} Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.177114 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a728a50790a0d46f7100de5b5b1ff28ac0c4bf7b3bacc5a4cc70dd9922c513ef" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.177277 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-76hpr" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.185614 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f53e0412-9b5d-4f27-b90b-5b44d02d419b","Type":"ContainerStarted","Data":"98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0"} Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.185666 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f53e0412-9b5d-4f27-b90b-5b44d02d419b","Type":"ContainerStarted","Data":"dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a"} Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.193830 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w9tlx" event={"ID":"110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7","Type":"ContainerDied","Data":"9ede004d37e6be0c5fe5adf8305086a90b5c43ed82ad30ef92d763f1aa12506f"} Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.193876 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ede004d37e6be0c5fe5adf8305086a90b5c43ed82ad30ef92d763f1aa12506f" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.193925 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-w9tlx" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.195371 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.216533 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-749bf8fbcf-mc9c6" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.240407 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.240385521 podStartE2EDuration="5.240385521s" podCreationTimestamp="2026-03-18 12:35:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:35:13.23454951 +0000 UTC m=+1498.948950099" watchObservedRunningTime="2026-03-18 12:35:13.240385521 +0000 UTC m=+1498.954786100" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.569253 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5899b99ff6-cwt84" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.656897 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-d4d4-account-create-update-bzbn7" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.664560 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9n6h\" (UniqueName: \"kubernetes.io/projected/48f1daf8-3604-40e4-9e41-e9025c083c7d-kube-api-access-t9n6h\") pod \"48f1daf8-3604-40e4-9e41-e9025c083c7d\" (UID: \"48f1daf8-3604-40e4-9e41-e9025c083c7d\") " Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.664604 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f1daf8-3604-40e4-9e41-e9025c083c7d-ovndb-tls-certs\") pod \"48f1daf8-3604-40e4-9e41-e9025c083c7d\" (UID: \"48f1daf8-3604-40e4-9e41-e9025c083c7d\") " Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.664653 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/48f1daf8-3604-40e4-9e41-e9025c083c7d-httpd-config\") pod \"48f1daf8-3604-40e4-9e41-e9025c083c7d\" (UID: \"48f1daf8-3604-40e4-9e41-e9025c083c7d\") " Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.664712 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/48f1daf8-3604-40e4-9e41-e9025c083c7d-config\") pod \"48f1daf8-3604-40e4-9e41-e9025c083c7d\" (UID: \"48f1daf8-3604-40e4-9e41-e9025c083c7d\") " Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.664783 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f1daf8-3604-40e4-9e41-e9025c083c7d-combined-ca-bundle\") pod \"48f1daf8-3604-40e4-9e41-e9025c083c7d\" (UID: \"48f1daf8-3604-40e4-9e41-e9025c083c7d\") " Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.669420 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/48f1daf8-3604-40e4-9e41-e9025c083c7d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "48f1daf8-3604-40e4-9e41-e9025c083c7d" (UID: "48f1daf8-3604-40e4-9e41-e9025c083c7d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.676097 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48f1daf8-3604-40e4-9e41-e9025c083c7d-kube-api-access-t9n6h" (OuterVolumeSpecName: "kube-api-access-t9n6h") pod "48f1daf8-3604-40e4-9e41-e9025c083c7d" (UID: "48f1daf8-3604-40e4-9e41-e9025c083c7d"). InnerVolumeSpecName "kube-api-access-t9n6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.747063 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48f1daf8-3604-40e4-9e41-e9025c083c7d-config" (OuterVolumeSpecName: "config") pod "48f1daf8-3604-40e4-9e41-e9025c083c7d" (UID: "48f1daf8-3604-40e4-9e41-e9025c083c7d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.754768 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48f1daf8-3604-40e4-9e41-e9025c083c7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48f1daf8-3604-40e4-9e41-e9025c083c7d" (UID: "48f1daf8-3604-40e4-9e41-e9025c083c7d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.771215 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvzgf\" (UniqueName: \"kubernetes.io/projected/22cd6088-9800-4d88-b130-fdc6a3dd4e90-kube-api-access-zvzgf\") pod \"22cd6088-9800-4d88-b130-fdc6a3dd4e90\" (UID: \"22cd6088-9800-4d88-b130-fdc6a3dd4e90\") " Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.771446 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22cd6088-9800-4d88-b130-fdc6a3dd4e90-operator-scripts\") pod \"22cd6088-9800-4d88-b130-fdc6a3dd4e90\" (UID: \"22cd6088-9800-4d88-b130-fdc6a3dd4e90\") " Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.771854 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9n6h\" (UniqueName: \"kubernetes.io/projected/48f1daf8-3604-40e4-9e41-e9025c083c7d-kube-api-access-t9n6h\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.771887 4975 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/48f1daf8-3604-40e4-9e41-e9025c083c7d-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.771898 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/48f1daf8-3604-40e4-9e41-e9025c083c7d-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.771907 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f1daf8-3604-40e4-9e41-e9025c083c7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.772270 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/22cd6088-9800-4d88-b130-fdc6a3dd4e90-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "22cd6088-9800-4d88-b130-fdc6a3dd4e90" (UID: "22cd6088-9800-4d88-b130-fdc6a3dd4e90"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.777053 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22cd6088-9800-4d88-b130-fdc6a3dd4e90-kube-api-access-zvzgf" (OuterVolumeSpecName: "kube-api-access-zvzgf") pod "22cd6088-9800-4d88-b130-fdc6a3dd4e90" (UID: "22cd6088-9800-4d88-b130-fdc6a3dd4e90"). InnerVolumeSpecName "kube-api-access-zvzgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.814078 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48f1daf8-3604-40e4-9e41-e9025c083c7d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "48f1daf8-3604-40e4-9e41-e9025c083c7d" (UID: "48f1daf8-3604-40e4-9e41-e9025c083c7d"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.874973 4975 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f1daf8-3604-40e4-9e41-e9025c083c7d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.875009 4975 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22cd6088-9800-4d88-b130-fdc6a3dd4e90-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.875023 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvzgf\" (UniqueName: \"kubernetes.io/projected/22cd6088-9800-4d88-b130-fdc6a3dd4e90-kube-api-access-zvzgf\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.937081 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db13-account-create-update-cmgxb" Mar 18 12:35:13 crc kubenswrapper[4975]: I0318 12:35:13.940957 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c3a7-account-create-update-djb44" Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.077580 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rs8l\" (UniqueName: \"kubernetes.io/projected/f3524634-0ee4-4add-a35b-a6ffdee6f1f6-kube-api-access-5rs8l\") pod \"f3524634-0ee4-4add-a35b-a6ffdee6f1f6\" (UID: \"f3524634-0ee4-4add-a35b-a6ffdee6f1f6\") " Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.077634 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3524634-0ee4-4add-a35b-a6ffdee6f1f6-operator-scripts\") pod \"f3524634-0ee4-4add-a35b-a6ffdee6f1f6\" (UID: \"f3524634-0ee4-4add-a35b-a6ffdee6f1f6\") " Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.077828 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/481acd0e-80e8-416a-aec5-361567fd5bc6-operator-scripts\") pod \"481acd0e-80e8-416a-aec5-361567fd5bc6\" (UID: \"481acd0e-80e8-416a-aec5-361567fd5bc6\") " Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.077914 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfn98\" (UniqueName: \"kubernetes.io/projected/481acd0e-80e8-416a-aec5-361567fd5bc6-kube-api-access-vfn98\") pod \"481acd0e-80e8-416a-aec5-361567fd5bc6\" (UID: \"481acd0e-80e8-416a-aec5-361567fd5bc6\") " Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.078367 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3524634-0ee4-4add-a35b-a6ffdee6f1f6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f3524634-0ee4-4add-a35b-a6ffdee6f1f6" (UID: "f3524634-0ee4-4add-a35b-a6ffdee6f1f6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.078398 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/481acd0e-80e8-416a-aec5-361567fd5bc6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "481acd0e-80e8-416a-aec5-361567fd5bc6" (UID: "481acd0e-80e8-416a-aec5-361567fd5bc6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.083149 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/481acd0e-80e8-416a-aec5-361567fd5bc6-kube-api-access-vfn98" (OuterVolumeSpecName: "kube-api-access-vfn98") pod "481acd0e-80e8-416a-aec5-361567fd5bc6" (UID: "481acd0e-80e8-416a-aec5-361567fd5bc6"). InnerVolumeSpecName "kube-api-access-vfn98". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.090948 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3524634-0ee4-4add-a35b-a6ffdee6f1f6-kube-api-access-5rs8l" (OuterVolumeSpecName: "kube-api-access-5rs8l") pod "f3524634-0ee4-4add-a35b-a6ffdee6f1f6" (UID: "f3524634-0ee4-4add-a35b-a6ffdee6f1f6"). InnerVolumeSpecName "kube-api-access-5rs8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.180674 4975 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/481acd0e-80e8-416a-aec5-361567fd5bc6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.180716 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfn98\" (UniqueName: \"kubernetes.io/projected/481acd0e-80e8-416a-aec5-361567fd5bc6-kube-api-access-vfn98\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.180732 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rs8l\" (UniqueName: \"kubernetes.io/projected/f3524634-0ee4-4add-a35b-a6ffdee6f1f6-kube-api-access-5rs8l\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.180746 4975 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3524634-0ee4-4add-a35b-a6ffdee6f1f6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.202927 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d4d4-account-create-update-bzbn7" event={"ID":"22cd6088-9800-4d88-b130-fdc6a3dd4e90","Type":"ContainerDied","Data":"8c15543eb2ebc55e4ddbea7b3373a8b92a9dbbf543bdff5d3cfd5590d553db82"} Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.202952 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-d4d4-account-create-update-bzbn7" Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.202966 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c15543eb2ebc55e4ddbea7b3373a8b92a9dbbf543bdff5d3cfd5590d553db82" Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.204668 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5899b99ff6-cwt84" Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.204665 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5899b99ff6-cwt84" event={"ID":"48f1daf8-3604-40e4-9e41-e9025c083c7d","Type":"ContainerDied","Data":"120cebf7665e1c95d7ad66a6bc91fbfc2be309382735de46895f2b5631a2b1dc"} Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.204829 4975 scope.go:117] "RemoveContainer" containerID="54d62d06308503ce21e7991ebc2c1733e6936b83aaf6e8787b677c240f317b21" Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.206092 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db13-account-create-update-cmgxb" Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.206097 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db13-account-create-update-cmgxb" event={"ID":"481acd0e-80e8-416a-aec5-361567fd5bc6","Type":"ContainerDied","Data":"3ccd85d8bdcd1a3b5a42a441f947551a3f85e0acef25e04c7e9a021df14dd12e"} Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.206156 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ccd85d8bdcd1a3b5a42a441f947551a3f85e0acef25e04c7e9a021df14dd12e" Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.231415 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c3a7-account-create-update-djb44" event={"ID":"f3524634-0ee4-4add-a35b-a6ffdee6f1f6","Type":"ContainerDied","Data":"217ec70c8b0b4cc913c2e64b2653c9f52a11ecbc5a6c56228d6c41eed542b924"} Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.231467 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="217ec70c8b0b4cc913c2e64b2653c9f52a11ecbc5a6c56228d6c41eed542b924" Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.231489 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c3a7-account-create-update-djb44" Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.247932 4975 scope.go:117] "RemoveContainer" containerID="589ca94554a2c2a06045888f589afc811ca3abd6286fee87ebd2dc6e376739cf" Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.263744 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5899b99ff6-cwt84"] Mar 18 12:35:14 crc kubenswrapper[4975]: I0318 12:35:14.270952 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5899b99ff6-cwt84"] Mar 18 12:35:15 crc kubenswrapper[4975]: I0318 12:35:15.026066 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48f1daf8-3604-40e4-9e41-e9025c083c7d" path="/var/lib/kubelet/pods/48f1daf8-3604-40e4-9e41-e9025c083c7d/volumes" Mar 18 12:35:16 crc kubenswrapper[4975]: I0318 12:35:16.680801 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-c478d4794-x2t7q" podUID="ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused" Mar 18 12:35:17 crc kubenswrapper[4975]: I0318 12:35:17.270610 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f53e0412-9b5d-4f27-b90b-5b44d02d419b","Type":"ContainerStarted","Data":"16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b"} Mar 18 12:35:17 crc kubenswrapper[4975]: I0318 12:35:17.270943 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f53e0412-9b5d-4f27-b90b-5b44d02d419b" containerName="ceilometer-central-agent" containerID="cri-o://af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0" gracePeriod=30 Mar 18 12:35:17 crc kubenswrapper[4975]: I0318 12:35:17.271048 4975 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 12:35:17 crc kubenswrapper[4975]: I0318 12:35:17.271108 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f53e0412-9b5d-4f27-b90b-5b44d02d419b" containerName="proxy-httpd" containerID="cri-o://16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b" gracePeriod=30 Mar 18 12:35:17 crc kubenswrapper[4975]: I0318 12:35:17.271185 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f53e0412-9b5d-4f27-b90b-5b44d02d419b" containerName="ceilometer-notification-agent" containerID="cri-o://dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a" gracePeriod=30 Mar 18 12:35:17 crc kubenswrapper[4975]: I0318 12:35:17.271261 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f53e0412-9b5d-4f27-b90b-5b44d02d419b" containerName="sg-core" containerID="cri-o://98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0" gracePeriod=30 Mar 18 12:35:17 crc kubenswrapper[4975]: I0318 12:35:17.305152 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.825440348 podStartE2EDuration="9.305135613s" podCreationTimestamp="2026-03-18 12:35:08 +0000 UTC" firstStartedPulling="2026-03-18 12:35:09.358716922 +0000 UTC m=+1495.073117501" lastFinishedPulling="2026-03-18 12:35:15.838412187 +0000 UTC m=+1501.552812766" observedRunningTime="2026-03-18 12:35:17.299795716 +0000 UTC m=+1503.014196305" watchObservedRunningTime="2026-03-18 12:35:17.305135613 +0000 UTC m=+1503.019536192" Mar 18 12:35:17 crc kubenswrapper[4975]: E0318 12:35:17.492548 4975 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf53e0412_9b5d_4f27_b90b_5b44d02d419b.slice/crio-conmon-16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf53e0412_9b5d_4f27_b90b_5b44d02d419b.slice/crio-conmon-98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0.scope\": RecentStats: unable to find data in memory cache]" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.021800 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x74s8"] Mar 18 12:35:18 crc kubenswrapper[4975]: E0318 12:35:18.022580 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481acd0e-80e8-416a-aec5-361567fd5bc6" containerName="mariadb-account-create-update" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.022598 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="481acd0e-80e8-416a-aec5-361567fd5bc6" containerName="mariadb-account-create-update" Mar 18 12:35:18 crc kubenswrapper[4975]: E0318 12:35:18.022614 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7" containerName="mariadb-database-create" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.022620 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7" containerName="mariadb-database-create" Mar 18 12:35:18 crc kubenswrapper[4975]: E0318 12:35:18.022631 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f1daf8-3604-40e4-9e41-e9025c083c7d" containerName="neutron-api" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.022637 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f1daf8-3604-40e4-9e41-e9025c083c7d" containerName="neutron-api" Mar 18 12:35:18 crc kubenswrapper[4975]: E0318 12:35:18.022647 4975 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="eb70bc5f-531b-4c58-b84b-e8eb61d81340" containerName="mariadb-database-create" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.022656 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb70bc5f-531b-4c58-b84b-e8eb61d81340" containerName="mariadb-database-create" Mar 18 12:35:18 crc kubenswrapper[4975]: E0318 12:35:18.022678 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3524634-0ee4-4add-a35b-a6ffdee6f1f6" containerName="mariadb-account-create-update" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.022688 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3524634-0ee4-4add-a35b-a6ffdee6f1f6" containerName="mariadb-account-create-update" Mar 18 12:35:18 crc kubenswrapper[4975]: E0318 12:35:18.022709 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f1daf8-3604-40e4-9e41-e9025c083c7d" containerName="neutron-httpd" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.022716 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f1daf8-3604-40e4-9e41-e9025c083c7d" containerName="neutron-httpd" Mar 18 12:35:18 crc kubenswrapper[4975]: E0318 12:35:18.022731 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22cd6088-9800-4d88-b130-fdc6a3dd4e90" containerName="mariadb-account-create-update" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.022738 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="22cd6088-9800-4d88-b130-fdc6a3dd4e90" containerName="mariadb-account-create-update" Mar 18 12:35:18 crc kubenswrapper[4975]: E0318 12:35:18.022764 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f7b0dc3-44d1-4932-bca8-f4ade944ecbe" containerName="mariadb-database-create" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.022772 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f7b0dc3-44d1-4932-bca8-f4ade944ecbe" containerName="mariadb-database-create" Mar 18 12:35:18 crc 
kubenswrapper[4975]: I0318 12:35:18.022962 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="48f1daf8-3604-40e4-9e41-e9025c083c7d" containerName="neutron-httpd" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.022973 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb70bc5f-531b-4c58-b84b-e8eb61d81340" containerName="mariadb-database-create" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.022981 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="48f1daf8-3604-40e4-9e41-e9025c083c7d" containerName="neutron-api" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.022987 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f7b0dc3-44d1-4932-bca8-f4ade944ecbe" containerName="mariadb-database-create" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.022996 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="481acd0e-80e8-416a-aec5-361567fd5bc6" containerName="mariadb-account-create-update" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.023006 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7" containerName="mariadb-database-create" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.023018 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="22cd6088-9800-4d88-b130-fdc6a3dd4e90" containerName="mariadb-account-create-update" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.023031 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3524634-0ee4-4add-a35b-a6ffdee6f1f6" containerName="mariadb-account-create-update" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.023696 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x74s8" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.025447 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.027216 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-w7sgq" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.027380 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.030673 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x74s8"] Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.115394 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.153555 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67cbe265-b297-4ad8-af53-703b8549d8e6-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-x74s8\" (UID: \"67cbe265-b297-4ad8-af53-703b8549d8e6\") " pod="openstack/nova-cell0-conductor-db-sync-x74s8" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.153613 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcbdq\" (UniqueName: \"kubernetes.io/projected/67cbe265-b297-4ad8-af53-703b8549d8e6-kube-api-access-zcbdq\") pod \"nova-cell0-conductor-db-sync-x74s8\" (UID: \"67cbe265-b297-4ad8-af53-703b8549d8e6\") " pod="openstack/nova-cell0-conductor-db-sync-x74s8" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.153661 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/67cbe265-b297-4ad8-af53-703b8549d8e6-scripts\") pod \"nova-cell0-conductor-db-sync-x74s8\" (UID: \"67cbe265-b297-4ad8-af53-703b8549d8e6\") " pod="openstack/nova-cell0-conductor-db-sync-x74s8" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.153761 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67cbe265-b297-4ad8-af53-703b8549d8e6-config-data\") pod \"nova-cell0-conductor-db-sync-x74s8\" (UID: \"67cbe265-b297-4ad8-af53-703b8549d8e6\") " pod="openstack/nova-cell0-conductor-db-sync-x74s8" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.255323 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f53e0412-9b5d-4f27-b90b-5b44d02d419b-scripts\") pod \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.255387 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmtfj\" (UniqueName: \"kubernetes.io/projected/f53e0412-9b5d-4f27-b90b-5b44d02d419b-kube-api-access-xmtfj\") pod \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.255437 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f53e0412-9b5d-4f27-b90b-5b44d02d419b-log-httpd\") pod \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.255500 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f53e0412-9b5d-4f27-b90b-5b44d02d419b-config-data\") pod \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\" (UID: 
\"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.255557 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f53e0412-9b5d-4f27-b90b-5b44d02d419b-sg-core-conf-yaml\") pod \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.255601 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f53e0412-9b5d-4f27-b90b-5b44d02d419b-combined-ca-bundle\") pod \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.255646 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f53e0412-9b5d-4f27-b90b-5b44d02d419b-run-httpd\") pod \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\" (UID: \"f53e0412-9b5d-4f27-b90b-5b44d02d419b\") " Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.255919 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67cbe265-b297-4ad8-af53-703b8549d8e6-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-x74s8\" (UID: \"67cbe265-b297-4ad8-af53-703b8549d8e6\") " pod="openstack/nova-cell0-conductor-db-sync-x74s8" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.255965 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcbdq\" (UniqueName: \"kubernetes.io/projected/67cbe265-b297-4ad8-af53-703b8549d8e6-kube-api-access-zcbdq\") pod \"nova-cell0-conductor-db-sync-x74s8\" (UID: \"67cbe265-b297-4ad8-af53-703b8549d8e6\") " pod="openstack/nova-cell0-conductor-db-sync-x74s8" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.255998 4975 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67cbe265-b297-4ad8-af53-703b8549d8e6-scripts\") pod \"nova-cell0-conductor-db-sync-x74s8\" (UID: \"67cbe265-b297-4ad8-af53-703b8549d8e6\") " pod="openstack/nova-cell0-conductor-db-sync-x74s8" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.256073 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67cbe265-b297-4ad8-af53-703b8549d8e6-config-data\") pod \"nova-cell0-conductor-db-sync-x74s8\" (UID: \"67cbe265-b297-4ad8-af53-703b8549d8e6\") " pod="openstack/nova-cell0-conductor-db-sync-x74s8" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.257458 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f53e0412-9b5d-4f27-b90b-5b44d02d419b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f53e0412-9b5d-4f27-b90b-5b44d02d419b" (UID: "f53e0412-9b5d-4f27-b90b-5b44d02d419b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.257599 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f53e0412-9b5d-4f27-b90b-5b44d02d419b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f53e0412-9b5d-4f27-b90b-5b44d02d419b" (UID: "f53e0412-9b5d-4f27-b90b-5b44d02d419b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.263070 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f53e0412-9b5d-4f27-b90b-5b44d02d419b-scripts" (OuterVolumeSpecName: "scripts") pod "f53e0412-9b5d-4f27-b90b-5b44d02d419b" (UID: "f53e0412-9b5d-4f27-b90b-5b44d02d419b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.263111 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f53e0412-9b5d-4f27-b90b-5b44d02d419b-kube-api-access-xmtfj" (OuterVolumeSpecName: "kube-api-access-xmtfj") pod "f53e0412-9b5d-4f27-b90b-5b44d02d419b" (UID: "f53e0412-9b5d-4f27-b90b-5b44d02d419b"). InnerVolumeSpecName "kube-api-access-xmtfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.263597 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67cbe265-b297-4ad8-af53-703b8549d8e6-scripts\") pod \"nova-cell0-conductor-db-sync-x74s8\" (UID: \"67cbe265-b297-4ad8-af53-703b8549d8e6\") " pod="openstack/nova-cell0-conductor-db-sync-x74s8" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.263849 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67cbe265-b297-4ad8-af53-703b8549d8e6-config-data\") pod \"nova-cell0-conductor-db-sync-x74s8\" (UID: \"67cbe265-b297-4ad8-af53-703b8549d8e6\") " pod="openstack/nova-cell0-conductor-db-sync-x74s8" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.264097 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67cbe265-b297-4ad8-af53-703b8549d8e6-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-x74s8\" (UID: \"67cbe265-b297-4ad8-af53-703b8549d8e6\") " pod="openstack/nova-cell0-conductor-db-sync-x74s8" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.276177 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcbdq\" (UniqueName: \"kubernetes.io/projected/67cbe265-b297-4ad8-af53-703b8549d8e6-kube-api-access-zcbdq\") pod \"nova-cell0-conductor-db-sync-x74s8\" (UID: 
\"67cbe265-b297-4ad8-af53-703b8549d8e6\") " pod="openstack/nova-cell0-conductor-db-sync-x74s8" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.294472 4975 generic.go:334] "Generic (PLEG): container finished" podID="f53e0412-9b5d-4f27-b90b-5b44d02d419b" containerID="16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b" exitCode=0 Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.294512 4975 generic.go:334] "Generic (PLEG): container finished" podID="f53e0412-9b5d-4f27-b90b-5b44d02d419b" containerID="98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0" exitCode=2 Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.294522 4975 generic.go:334] "Generic (PLEG): container finished" podID="f53e0412-9b5d-4f27-b90b-5b44d02d419b" containerID="dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a" exitCode=0 Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.294535 4975 generic.go:334] "Generic (PLEG): container finished" podID="f53e0412-9b5d-4f27-b90b-5b44d02d419b" containerID="af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0" exitCode=0 Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.294561 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f53e0412-9b5d-4f27-b90b-5b44d02d419b","Type":"ContainerDied","Data":"16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b"} Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.294597 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f53e0412-9b5d-4f27-b90b-5b44d02d419b","Type":"ContainerDied","Data":"98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0"} Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.294616 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f53e0412-9b5d-4f27-b90b-5b44d02d419b","Type":"ContainerDied","Data":"dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a"} Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.294629 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f53e0412-9b5d-4f27-b90b-5b44d02d419b","Type":"ContainerDied","Data":"af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0"} Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.294643 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f53e0412-9b5d-4f27-b90b-5b44d02d419b","Type":"ContainerDied","Data":"fbec5e2d22a9bc37f9d2b32114f1de9aee694031b37f787d39e0d97efc709237"} Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.294663 4975 scope.go:117] "RemoveContainer" containerID="16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.294986 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.295537 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f53e0412-9b5d-4f27-b90b-5b44d02d419b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f53e0412-9b5d-4f27-b90b-5b44d02d419b" (UID: "f53e0412-9b5d-4f27-b90b-5b44d02d419b"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.325282 4975 scope.go:117] "RemoveContainer" containerID="98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.345625 4975 scope.go:117] "RemoveContainer" containerID="dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.346058 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x74s8" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.351438 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f53e0412-9b5d-4f27-b90b-5b44d02d419b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f53e0412-9b5d-4f27-b90b-5b44d02d419b" (UID: "f53e0412-9b5d-4f27-b90b-5b44d02d419b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.357809 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f53e0412-9b5d-4f27-b90b-5b44d02d419b-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.357850 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmtfj\" (UniqueName: \"kubernetes.io/projected/f53e0412-9b5d-4f27-b90b-5b44d02d419b-kube-api-access-xmtfj\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.357881 4975 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f53e0412-9b5d-4f27-b90b-5b44d02d419b-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.357894 4975 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/f53e0412-9b5d-4f27-b90b-5b44d02d419b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.357905 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f53e0412-9b5d-4f27-b90b-5b44d02d419b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.357915 4975 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f53e0412-9b5d-4f27-b90b-5b44d02d419b-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.387041 4975 scope.go:117] "RemoveContainer" containerID="af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.387824 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f53e0412-9b5d-4f27-b90b-5b44d02d419b-config-data" (OuterVolumeSpecName: "config-data") pod "f53e0412-9b5d-4f27-b90b-5b44d02d419b" (UID: "f53e0412-9b5d-4f27-b90b-5b44d02d419b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.459818 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f53e0412-9b5d-4f27-b90b-5b44d02d419b-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.534436 4975 scope.go:117] "RemoveContainer" containerID="16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b" Mar 18 12:35:18 crc kubenswrapper[4975]: E0318 12:35:18.538780 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b\": container with ID starting with 16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b not found: ID does not exist" containerID="16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.538831 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b"} err="failed to get container status \"16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b\": rpc error: code = NotFound desc = could not find container \"16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b\": container with ID starting with 16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b not found: ID does not exist" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.538879 4975 scope.go:117] "RemoveContainer" containerID="98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0" Mar 18 12:35:18 crc kubenswrapper[4975]: E0318 12:35:18.539417 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0\": container 
with ID starting with 98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0 not found: ID does not exist" containerID="98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.539450 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0"} err="failed to get container status \"98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0\": rpc error: code = NotFound desc = could not find container \"98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0\": container with ID starting with 98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0 not found: ID does not exist" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.539470 4975 scope.go:117] "RemoveContainer" containerID="dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a" Mar 18 12:35:18 crc kubenswrapper[4975]: E0318 12:35:18.539798 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a\": container with ID starting with dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a not found: ID does not exist" containerID="dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.539843 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a"} err="failed to get container status \"dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a\": rpc error: code = NotFound desc = could not find container \"dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a\": container with ID starting with dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a not 
found: ID does not exist" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.539877 4975 scope.go:117] "RemoveContainer" containerID="af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0" Mar 18 12:35:18 crc kubenswrapper[4975]: E0318 12:35:18.540307 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0\": container with ID starting with af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0 not found: ID does not exist" containerID="af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.540362 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0"} err="failed to get container status \"af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0\": rpc error: code = NotFound desc = could not find container \"af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0\": container with ID starting with af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0 not found: ID does not exist" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.540399 4975 scope.go:117] "RemoveContainer" containerID="16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.540784 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b"} err="failed to get container status \"16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b\": rpc error: code = NotFound desc = could not find container \"16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b\": container with ID starting with 
16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b not found: ID does not exist" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.540813 4975 scope.go:117] "RemoveContainer" containerID="98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.541948 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0"} err="failed to get container status \"98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0\": rpc error: code = NotFound desc = could not find container \"98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0\": container with ID starting with 98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0 not found: ID does not exist" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.541983 4975 scope.go:117] "RemoveContainer" containerID="dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.544354 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a"} err="failed to get container status \"dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a\": rpc error: code = NotFound desc = could not find container \"dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a\": container with ID starting with dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a not found: ID does not exist" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.544403 4975 scope.go:117] "RemoveContainer" containerID="af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.544757 4975 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0"} err="failed to get container status \"af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0\": rpc error: code = NotFound desc = could not find container \"af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0\": container with ID starting with af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0 not found: ID does not exist" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.544798 4975 scope.go:117] "RemoveContainer" containerID="16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.545047 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b"} err="failed to get container status \"16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b\": rpc error: code = NotFound desc = could not find container \"16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b\": container with ID starting with 16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b not found: ID does not exist" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.545126 4975 scope.go:117] "RemoveContainer" containerID="98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.545759 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0"} err="failed to get container status \"98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0\": rpc error: code = NotFound desc = could not find container \"98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0\": container with ID starting with 98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0 not found: ID does not 
exist" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.545785 4975 scope.go:117] "RemoveContainer" containerID="dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.546168 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a"} err="failed to get container status \"dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a\": rpc error: code = NotFound desc = could not find container \"dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a\": container with ID starting with dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a not found: ID does not exist" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.546184 4975 scope.go:117] "RemoveContainer" containerID="af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.546559 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0"} err="failed to get container status \"af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0\": rpc error: code = NotFound desc = could not find container \"af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0\": container with ID starting with af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0 not found: ID does not exist" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.546576 4975 scope.go:117] "RemoveContainer" containerID="16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.547607 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b"} err="failed to get container status 
\"16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b\": rpc error: code = NotFound desc = could not find container \"16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b\": container with ID starting with 16b944505fbd5026b89b199df101457b7c10af09f13eb34634c49d0f4edae27b not found: ID does not exist" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.547635 4975 scope.go:117] "RemoveContainer" containerID="98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.548123 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0"} err="failed to get container status \"98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0\": rpc error: code = NotFound desc = could not find container \"98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0\": container with ID starting with 98c69c8f47ffa7b026055792eb4f618bd0b7ea258e9bf9cd1c7a6e2f0fe9f7d0 not found: ID does not exist" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.548163 4975 scope.go:117] "RemoveContainer" containerID="dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.548514 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a"} err="failed to get container status \"dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a\": rpc error: code = NotFound desc = could not find container \"dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a\": container with ID starting with dff8a334dd8356755353c874bfaf961f48cf7dc16afb98cbb93aa0fde8761f8a not found: ID does not exist" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.548568 4975 scope.go:117] "RemoveContainer" 
containerID="af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.548931 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0"} err="failed to get container status \"af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0\": rpc error: code = NotFound desc = could not find container \"af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0\": container with ID starting with af0b3780d7a4c7500ebe25a96f9d91df47ae09d5d6516d444732507912ef58d0 not found: ID does not exist" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.639577 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.650575 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.671613 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:18 crc kubenswrapper[4975]: E0318 12:35:18.672101 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f53e0412-9b5d-4f27-b90b-5b44d02d419b" containerName="ceilometer-notification-agent" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.672117 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f53e0412-9b5d-4f27-b90b-5b44d02d419b" containerName="ceilometer-notification-agent" Mar 18 12:35:18 crc kubenswrapper[4975]: E0318 12:35:18.672132 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f53e0412-9b5d-4f27-b90b-5b44d02d419b" containerName="ceilometer-central-agent" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.672141 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f53e0412-9b5d-4f27-b90b-5b44d02d419b" containerName="ceilometer-central-agent" Mar 18 12:35:18 crc kubenswrapper[4975]: E0318 
12:35:18.672164 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f53e0412-9b5d-4f27-b90b-5b44d02d419b" containerName="proxy-httpd" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.672173 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f53e0412-9b5d-4f27-b90b-5b44d02d419b" containerName="proxy-httpd" Mar 18 12:35:18 crc kubenswrapper[4975]: E0318 12:35:18.672195 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f53e0412-9b5d-4f27-b90b-5b44d02d419b" containerName="sg-core" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.672202 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f53e0412-9b5d-4f27-b90b-5b44d02d419b" containerName="sg-core" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.672429 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="f53e0412-9b5d-4f27-b90b-5b44d02d419b" containerName="sg-core" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.672452 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="f53e0412-9b5d-4f27-b90b-5b44d02d419b" containerName="ceilometer-notification-agent" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.672470 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="f53e0412-9b5d-4f27-b90b-5b44d02d419b" containerName="proxy-httpd" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.672480 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="f53e0412-9b5d-4f27-b90b-5b44d02d419b" containerName="ceilometer-central-agent" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.675787 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.678538 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.678822 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.686545 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.764638 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51cbbbd2-2995-436e-bc62-da9a867a2db4-config-data\") pod \"ceilometer-0\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " pod="openstack/ceilometer-0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.764742 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51cbbbd2-2995-436e-bc62-da9a867a2db4-log-httpd\") pod \"ceilometer-0\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " pod="openstack/ceilometer-0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.764816 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51cbbbd2-2995-436e-bc62-da9a867a2db4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " pod="openstack/ceilometer-0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.765061 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mcmr\" (UniqueName: \"kubernetes.io/projected/51cbbbd2-2995-436e-bc62-da9a867a2db4-kube-api-access-4mcmr\") pod \"ceilometer-0\" (UID: 
\"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " pod="openstack/ceilometer-0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.765155 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51cbbbd2-2995-436e-bc62-da9a867a2db4-run-httpd\") pod \"ceilometer-0\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " pod="openstack/ceilometer-0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.765208 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51cbbbd2-2995-436e-bc62-da9a867a2db4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " pod="openstack/ceilometer-0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.765264 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51cbbbd2-2995-436e-bc62-da9a867a2db4-scripts\") pod \"ceilometer-0\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " pod="openstack/ceilometer-0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.867197 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51cbbbd2-2995-436e-bc62-da9a867a2db4-log-httpd\") pod \"ceilometer-0\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " pod="openstack/ceilometer-0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.867268 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51cbbbd2-2995-436e-bc62-da9a867a2db4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " pod="openstack/ceilometer-0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.867322 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4mcmr\" (UniqueName: \"kubernetes.io/projected/51cbbbd2-2995-436e-bc62-da9a867a2db4-kube-api-access-4mcmr\") pod \"ceilometer-0\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " pod="openstack/ceilometer-0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.867348 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51cbbbd2-2995-436e-bc62-da9a867a2db4-run-httpd\") pod \"ceilometer-0\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " pod="openstack/ceilometer-0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.867374 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51cbbbd2-2995-436e-bc62-da9a867a2db4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " pod="openstack/ceilometer-0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.867404 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51cbbbd2-2995-436e-bc62-da9a867a2db4-scripts\") pod \"ceilometer-0\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " pod="openstack/ceilometer-0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.867433 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51cbbbd2-2995-436e-bc62-da9a867a2db4-config-data\") pod \"ceilometer-0\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " pod="openstack/ceilometer-0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.868702 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51cbbbd2-2995-436e-bc62-da9a867a2db4-run-httpd\") pod \"ceilometer-0\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " 
pod="openstack/ceilometer-0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.868740 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51cbbbd2-2995-436e-bc62-da9a867a2db4-log-httpd\") pod \"ceilometer-0\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " pod="openstack/ceilometer-0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.873203 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51cbbbd2-2995-436e-bc62-da9a867a2db4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " pod="openstack/ceilometer-0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.873272 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51cbbbd2-2995-436e-bc62-da9a867a2db4-config-data\") pod \"ceilometer-0\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " pod="openstack/ceilometer-0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.877732 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51cbbbd2-2995-436e-bc62-da9a867a2db4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " pod="openstack/ceilometer-0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.877898 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51cbbbd2-2995-436e-bc62-da9a867a2db4-scripts\") pod \"ceilometer-0\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " pod="openstack/ceilometer-0" Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.892096 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x74s8"] Mar 18 12:35:18 crc kubenswrapper[4975]: I0318 12:35:18.897604 4975 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mcmr\" (UniqueName: \"kubernetes.io/projected/51cbbbd2-2995-436e-bc62-da9a867a2db4-kube-api-access-4mcmr\") pod \"ceilometer-0\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " pod="openstack/ceilometer-0" Mar 18 12:35:19 crc kubenswrapper[4975]: I0318 12:35:19.002368 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:35:19 crc kubenswrapper[4975]: I0318 12:35:19.029052 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f53e0412-9b5d-4f27-b90b-5b44d02d419b" path="/var/lib/kubelet/pods/f53e0412-9b5d-4f27-b90b-5b44d02d419b/volumes" Mar 18 12:35:19 crc kubenswrapper[4975]: I0318 12:35:19.304167 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x74s8" event={"ID":"67cbe265-b297-4ad8-af53-703b8549d8e6","Type":"ContainerStarted","Data":"ab3f4063895d5968d765012b823cd6187efffb41ed9054da02757c8d68321910"} Mar 18 12:35:19 crc kubenswrapper[4975]: I0318 12:35:19.437161 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:20 crc kubenswrapper[4975]: I0318 12:35:20.330722 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51cbbbd2-2995-436e-bc62-da9a867a2db4","Type":"ContainerStarted","Data":"3ba05def4f11d46a0bcd40dccd7d60ecff260d1234ab9b338a3e8ca74c12a5c5"} Mar 18 12:35:21 crc kubenswrapper[4975]: I0318 12:35:21.044895 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55d747f656-4fh7c" Mar 18 12:35:21 crc kubenswrapper[4975]: I0318 12:35:21.108368 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55d747f656-4fh7c" Mar 18 12:35:21 crc kubenswrapper[4975]: I0318 12:35:21.206463 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-587979d76d-qg8cs"] Mar 18 12:35:21 crc kubenswrapper[4975]: I0318 12:35:21.206750 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-587979d76d-qg8cs" podUID="f967c09a-49f5-4a4c-a1a9-7bb2da157132" containerName="placement-log" containerID="cri-o://458781d0a4bccfcc7f57ea793253b6177a97e2b88c696641ce09fe6e28923fc2" gracePeriod=30 Mar 18 12:35:21 crc kubenswrapper[4975]: I0318 12:35:21.206851 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-587979d76d-qg8cs" podUID="f967c09a-49f5-4a4c-a1a9-7bb2da157132" containerName="placement-api" containerID="cri-o://4b03eed16205c65408f7dc224e2fd5897758a55b75be64f724d52b8b56190a6d" gracePeriod=30 Mar 18 12:35:21 crc kubenswrapper[4975]: I0318 12:35:21.349395 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51cbbbd2-2995-436e-bc62-da9a867a2db4","Type":"ContainerStarted","Data":"d05da0446a9ba507831161d869ce384dbcbd26e02423ca54916694bf90eca97b"} Mar 18 12:35:21 crc kubenswrapper[4975]: I0318 12:35:21.352140 4975 generic.go:334] "Generic (PLEG): container finished" podID="f967c09a-49f5-4a4c-a1a9-7bb2da157132" containerID="458781d0a4bccfcc7f57ea793253b6177a97e2b88c696641ce09fe6e28923fc2" exitCode=143 Mar 18 12:35:21 crc kubenswrapper[4975]: I0318 12:35:21.353398 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-587979d76d-qg8cs" event={"ID":"f967c09a-49f5-4a4c-a1a9-7bb2da157132","Type":"ContainerDied","Data":"458781d0a4bccfcc7f57ea793253b6177a97e2b88c696641ce09fe6e28923fc2"} Mar 18 12:35:21 crc kubenswrapper[4975]: I0318 12:35:21.879569 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 18 12:35:21 crc kubenswrapper[4975]: I0318 12:35:21.944708 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 
12:35:22.371400 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51cbbbd2-2995-436e-bc62-da9a867a2db4","Type":"ContainerStarted","Data":"9e42adc1d7526260d3e1f5be111a614b71c9b47d9bbf36e86d0cdb8283513e0f"} Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 12:35:22.374921 4975 generic.go:334] "Generic (PLEG): container finished" podID="ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7" containerID="6b28ba84cab9eb6dd833f8a2c43c58ed4371a9c34e96c677b896512ab2e431d3" exitCode=137 Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 12:35:22.374965 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c478d4794-x2t7q" event={"ID":"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7","Type":"ContainerDied","Data":"6b28ba84cab9eb6dd833f8a2c43c58ed4371a9c34e96c677b896512ab2e431d3"} Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 12:35:22.375001 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c478d4794-x2t7q" event={"ID":"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7","Type":"ContainerDied","Data":"d6be8c15f83546a47280b98be94796f59aef29b9de4dc0a10bfdb7458543f3a5"} Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 12:35:22.375014 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6be8c15f83546a47280b98be94796f59aef29b9de4dc0a10bfdb7458543f3a5" Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 12:35:22.393239 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 12:35:22.545760 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-combined-ca-bundle\") pod \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 12:35:22.545832 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-horizon-tls-certs\") pod \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 12:35:22.545998 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sngk5\" (UniqueName: \"kubernetes.io/projected/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-kube-api-access-sngk5\") pod \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 12:35:22.546458 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-horizon-secret-key\") pod \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 12:35:22.546983 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-config-data\") pod \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 12:35:22.547104 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-scripts\") pod \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 12:35:22.547163 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-logs\") pod \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\" (UID: \"ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7\") " Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 12:35:22.552799 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-logs" (OuterVolumeSpecName: "logs") pod "ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7" (UID: "ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 12:35:22.553551 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7" (UID: "ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 12:35:22.566163 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-kube-api-access-sngk5" (OuterVolumeSpecName: "kube-api-access-sngk5") pod "ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7" (UID: "ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7"). InnerVolumeSpecName "kube-api-access-sngk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 12:35:22.596978 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7" (UID: "ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 12:35:22.599858 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-config-data" (OuterVolumeSpecName: "config-data") pod "ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7" (UID: "ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 12:35:22.600353 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-scripts" (OuterVolumeSpecName: "scripts") pod "ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7" (UID: "ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 12:35:22.620002 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7" (UID: "ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 12:35:22.652051 4975 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 12:35:22.652091 4975 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 12:35:22.652104 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 12:35:22.652114 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sngk5\" (UniqueName: \"kubernetes.io/projected/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-kube-api-access-sngk5\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 12:35:22.652125 4975 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 12:35:22.652134 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:22 crc kubenswrapper[4975]: I0318 12:35:22.652144 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:23 crc kubenswrapper[4975]: I0318 12:35:23.386160 4975 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-c478d4794-x2t7q" Mar 18 12:35:23 crc kubenswrapper[4975]: I0318 12:35:23.386352 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51cbbbd2-2995-436e-bc62-da9a867a2db4","Type":"ContainerStarted","Data":"d1f030d91c85444f92a522a15d28c5a82eb0dcac918ab12f7da501804d6e1e18"} Mar 18 12:35:23 crc kubenswrapper[4975]: I0318 12:35:23.410707 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-c478d4794-x2t7q"] Mar 18 12:35:23 crc kubenswrapper[4975]: I0318 12:35:23.418085 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-c478d4794-x2t7q"] Mar 18 12:35:25 crc kubenswrapper[4975]: I0318 12:35:25.028660 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7" path="/var/lib/kubelet/pods/ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7/volumes" Mar 18 12:35:25 crc kubenswrapper[4975]: I0318 12:35:25.410604 4975 generic.go:334] "Generic (PLEG): container finished" podID="f967c09a-49f5-4a4c-a1a9-7bb2da157132" containerID="4b03eed16205c65408f7dc224e2fd5897758a55b75be64f724d52b8b56190a6d" exitCode=0 Mar 18 12:35:25 crc kubenswrapper[4975]: I0318 12:35:25.410666 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-587979d76d-qg8cs" event={"ID":"f967c09a-49f5-4a4c-a1a9-7bb2da157132","Type":"ContainerDied","Data":"4b03eed16205c65408f7dc224e2fd5897758a55b75be64f724d52b8b56190a6d"} Mar 18 12:35:25 crc kubenswrapper[4975]: I0318 12:35:25.539144 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:35:25 crc kubenswrapper[4975]: I0318 12:35:25.539197 4975 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.185500 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.277957 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-config-data\") pod \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.278086 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-public-tls-certs\") pod \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.278117 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-combined-ca-bundle\") pod \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.278150 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-internal-tls-certs\") pod \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.278178 4975 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-scripts\") pod \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.278295 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f967c09a-49f5-4a4c-a1a9-7bb2da157132-logs\") pod \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.278343 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qxc9\" (UniqueName: \"kubernetes.io/projected/f967c09a-49f5-4a4c-a1a9-7bb2da157132-kube-api-access-9qxc9\") pod \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\" (UID: \"f967c09a-49f5-4a4c-a1a9-7bb2da157132\") " Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.279221 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f967c09a-49f5-4a4c-a1a9-7bb2da157132-logs" (OuterVolumeSpecName: "logs") pod "f967c09a-49f5-4a4c-a1a9-7bb2da157132" (UID: "f967c09a-49f5-4a4c-a1a9-7bb2da157132"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.282903 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f967c09a-49f5-4a4c-a1a9-7bb2da157132-kube-api-access-9qxc9" (OuterVolumeSpecName: "kube-api-access-9qxc9") pod "f967c09a-49f5-4a4c-a1a9-7bb2da157132" (UID: "f967c09a-49f5-4a4c-a1a9-7bb2da157132"). InnerVolumeSpecName "kube-api-access-9qxc9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.283062 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-scripts" (OuterVolumeSpecName: "scripts") pod "f967c09a-49f5-4a4c-a1a9-7bb2da157132" (UID: "f967c09a-49f5-4a4c-a1a9-7bb2da157132"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.327310 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f967c09a-49f5-4a4c-a1a9-7bb2da157132" (UID: "f967c09a-49f5-4a4c-a1a9-7bb2da157132"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.330409 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-config-data" (OuterVolumeSpecName: "config-data") pod "f967c09a-49f5-4a4c-a1a9-7bb2da157132" (UID: "f967c09a-49f5-4a4c-a1a9-7bb2da157132"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.377482 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f967c09a-49f5-4a4c-a1a9-7bb2da157132" (UID: "f967c09a-49f5-4a4c-a1a9-7bb2da157132"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.380214 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.380246 4975 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.380255 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.380284 4975 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f967c09a-49f5-4a4c-a1a9-7bb2da157132-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.380293 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qxc9\" (UniqueName: \"kubernetes.io/projected/f967c09a-49f5-4a4c-a1a9-7bb2da157132-kube-api-access-9qxc9\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.380306 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.382081 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f967c09a-49f5-4a4c-a1a9-7bb2da157132" (UID: "f967c09a-49f5-4a4c-a1a9-7bb2da157132"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.448189 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-587979d76d-qg8cs" event={"ID":"f967c09a-49f5-4a4c-a1a9-7bb2da157132","Type":"ContainerDied","Data":"b7989a097f0537c61ad811f0ed4617544ebc5ab98ef3691949caa1c18fb9812d"} Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.448241 4975 scope.go:117] "RemoveContainer" containerID="4b03eed16205c65408f7dc224e2fd5897758a55b75be64f724d52b8b56190a6d" Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.448197 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-587979d76d-qg8cs" Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.450953 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51cbbbd2-2995-436e-bc62-da9a867a2db4","Type":"ContainerStarted","Data":"b2a95c4619c0a6f8d66ab0a94214574b9ba89530c8e4a569a274fbe5be88e1eb"} Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.451035 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51cbbbd2-2995-436e-bc62-da9a867a2db4" containerName="ceilometer-central-agent" containerID="cri-o://d05da0446a9ba507831161d869ce384dbcbd26e02423ca54916694bf90eca97b" gracePeriod=30 Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.451057 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.451084 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51cbbbd2-2995-436e-bc62-da9a867a2db4" containerName="sg-core" containerID="cri-o://d1f030d91c85444f92a522a15d28c5a82eb0dcac918ab12f7da501804d6e1e18" gracePeriod=30 Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.451109 4975 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51cbbbd2-2995-436e-bc62-da9a867a2db4" containerName="proxy-httpd" containerID="cri-o://b2a95c4619c0a6f8d66ab0a94214574b9ba89530c8e4a569a274fbe5be88e1eb" gracePeriod=30 Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.451098 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51cbbbd2-2995-436e-bc62-da9a867a2db4" containerName="ceilometer-notification-agent" containerID="cri-o://9e42adc1d7526260d3e1f5be111a614b71c9b47d9bbf36e86d0cdb8283513e0f" gracePeriod=30 Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.456770 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x74s8" event={"ID":"67cbe265-b297-4ad8-af53-703b8549d8e6","Type":"ContainerStarted","Data":"2027ac0f879d6bc658160a364d83c9822493eb86828b9ea3300924d8c2837379"} Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.484810 4975 scope.go:117] "RemoveContainer" containerID="458781d0a4bccfcc7f57ea793253b6177a97e2b88c696641ce09fe6e28923fc2" Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.486693 4975 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f967c09a-49f5-4a4c-a1a9-7bb2da157132-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.492057 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.945142715 podStartE2EDuration="11.492009415s" podCreationTimestamp="2026-03-18 12:35:18 +0000 UTC" firstStartedPulling="2026-03-18 12:35:19.439049616 +0000 UTC m=+1505.153450195" lastFinishedPulling="2026-03-18 12:35:28.985916316 +0000 UTC m=+1514.700316895" observedRunningTime="2026-03-18 12:35:29.472768875 +0000 UTC m=+1515.187169454" watchObservedRunningTime="2026-03-18 12:35:29.492009415 +0000 UTC 
m=+1515.206409994" Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.514797 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-x74s8" podStartSLOduration=2.4216489660000002 podStartE2EDuration="12.514747711s" podCreationTimestamp="2026-03-18 12:35:17 +0000 UTC" firstStartedPulling="2026-03-18 12:35:18.894013434 +0000 UTC m=+1504.608414013" lastFinishedPulling="2026-03-18 12:35:28.987112179 +0000 UTC m=+1514.701512758" observedRunningTime="2026-03-18 12:35:29.494651188 +0000 UTC m=+1515.209051767" watchObservedRunningTime="2026-03-18 12:35:29.514747711 +0000 UTC m=+1515.229148290" Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.532262 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-587979d76d-qg8cs"] Mar 18 12:35:29 crc kubenswrapper[4975]: I0318 12:35:29.540694 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-587979d76d-qg8cs"] Mar 18 12:35:30 crc kubenswrapper[4975]: I0318 12:35:30.470056 4975 generic.go:334] "Generic (PLEG): container finished" podID="51cbbbd2-2995-436e-bc62-da9a867a2db4" containerID="b2a95c4619c0a6f8d66ab0a94214574b9ba89530c8e4a569a274fbe5be88e1eb" exitCode=0 Mar 18 12:35:30 crc kubenswrapper[4975]: I0318 12:35:30.470346 4975 generic.go:334] "Generic (PLEG): container finished" podID="51cbbbd2-2995-436e-bc62-da9a867a2db4" containerID="d1f030d91c85444f92a522a15d28c5a82eb0dcac918ab12f7da501804d6e1e18" exitCode=2 Mar 18 12:35:30 crc kubenswrapper[4975]: I0318 12:35:30.470360 4975 generic.go:334] "Generic (PLEG): container finished" podID="51cbbbd2-2995-436e-bc62-da9a867a2db4" containerID="9e42adc1d7526260d3e1f5be111a614b71c9b47d9bbf36e86d0cdb8283513e0f" exitCode=0 Mar 18 12:35:30 crc kubenswrapper[4975]: I0318 12:35:30.470207 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"51cbbbd2-2995-436e-bc62-da9a867a2db4","Type":"ContainerDied","Data":"b2a95c4619c0a6f8d66ab0a94214574b9ba89530c8e4a569a274fbe5be88e1eb"} Mar 18 12:35:30 crc kubenswrapper[4975]: I0318 12:35:30.470452 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51cbbbd2-2995-436e-bc62-da9a867a2db4","Type":"ContainerDied","Data":"d1f030d91c85444f92a522a15d28c5a82eb0dcac918ab12f7da501804d6e1e18"} Mar 18 12:35:30 crc kubenswrapper[4975]: I0318 12:35:30.470468 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51cbbbd2-2995-436e-bc62-da9a867a2db4","Type":"ContainerDied","Data":"9e42adc1d7526260d3e1f5be111a614b71c9b47d9bbf36e86d0cdb8283513e0f"} Mar 18 12:35:31 crc kubenswrapper[4975]: I0318 12:35:31.028404 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f967c09a-49f5-4a4c-a1a9-7bb2da157132" path="/var/lib/kubelet/pods/f967c09a-49f5-4a4c-a1a9-7bb2da157132/volumes" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.020824 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.139147 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51cbbbd2-2995-436e-bc62-da9a867a2db4-config-data\") pod \"51cbbbd2-2995-436e-bc62-da9a867a2db4\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.139197 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51cbbbd2-2995-436e-bc62-da9a867a2db4-scripts\") pod \"51cbbbd2-2995-436e-bc62-da9a867a2db4\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.139220 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51cbbbd2-2995-436e-bc62-da9a867a2db4-sg-core-conf-yaml\") pod \"51cbbbd2-2995-436e-bc62-da9a867a2db4\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.139281 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51cbbbd2-2995-436e-bc62-da9a867a2db4-run-httpd\") pod \"51cbbbd2-2995-436e-bc62-da9a867a2db4\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.139310 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51cbbbd2-2995-436e-bc62-da9a867a2db4-log-httpd\") pod \"51cbbbd2-2995-436e-bc62-da9a867a2db4\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.139473 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mcmr\" (UniqueName: 
\"kubernetes.io/projected/51cbbbd2-2995-436e-bc62-da9a867a2db4-kube-api-access-4mcmr\") pod \"51cbbbd2-2995-436e-bc62-da9a867a2db4\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.139524 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51cbbbd2-2995-436e-bc62-da9a867a2db4-combined-ca-bundle\") pod \"51cbbbd2-2995-436e-bc62-da9a867a2db4\" (UID: \"51cbbbd2-2995-436e-bc62-da9a867a2db4\") " Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.139740 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51cbbbd2-2995-436e-bc62-da9a867a2db4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "51cbbbd2-2995-436e-bc62-da9a867a2db4" (UID: "51cbbbd2-2995-436e-bc62-da9a867a2db4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.140665 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51cbbbd2-2995-436e-bc62-da9a867a2db4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "51cbbbd2-2995-436e-bc62-da9a867a2db4" (UID: "51cbbbd2-2995-436e-bc62-da9a867a2db4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.147673 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51cbbbd2-2995-436e-bc62-da9a867a2db4-scripts" (OuterVolumeSpecName: "scripts") pod "51cbbbd2-2995-436e-bc62-da9a867a2db4" (UID: "51cbbbd2-2995-436e-bc62-da9a867a2db4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.154250 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51cbbbd2-2995-436e-bc62-da9a867a2db4-kube-api-access-4mcmr" (OuterVolumeSpecName: "kube-api-access-4mcmr") pod "51cbbbd2-2995-436e-bc62-da9a867a2db4" (UID: "51cbbbd2-2995-436e-bc62-da9a867a2db4"). InnerVolumeSpecName "kube-api-access-4mcmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.175912 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51cbbbd2-2995-436e-bc62-da9a867a2db4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "51cbbbd2-2995-436e-bc62-da9a867a2db4" (UID: "51cbbbd2-2995-436e-bc62-da9a867a2db4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.220051 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51cbbbd2-2995-436e-bc62-da9a867a2db4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51cbbbd2-2995-436e-bc62-da9a867a2db4" (UID: "51cbbbd2-2995-436e-bc62-da9a867a2db4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.241519 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51cbbbd2-2995-436e-bc62-da9a867a2db4-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.241563 4975 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51cbbbd2-2995-436e-bc62-da9a867a2db4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.241577 4975 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51cbbbd2-2995-436e-bc62-da9a867a2db4-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.241588 4975 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51cbbbd2-2995-436e-bc62-da9a867a2db4-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.241600 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mcmr\" (UniqueName: \"kubernetes.io/projected/51cbbbd2-2995-436e-bc62-da9a867a2db4-kube-api-access-4mcmr\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.241609 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51cbbbd2-2995-436e-bc62-da9a867a2db4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.244460 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51cbbbd2-2995-436e-bc62-da9a867a2db4-config-data" (OuterVolumeSpecName: "config-data") pod "51cbbbd2-2995-436e-bc62-da9a867a2db4" (UID: "51cbbbd2-2995-436e-bc62-da9a867a2db4"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.343473 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51cbbbd2-2995-436e-bc62-da9a867a2db4-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.490162 4975 generic.go:334] "Generic (PLEG): container finished" podID="51cbbbd2-2995-436e-bc62-da9a867a2db4" containerID="d05da0446a9ba507831161d869ce384dbcbd26e02423ca54916694bf90eca97b" exitCode=0 Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.490217 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51cbbbd2-2995-436e-bc62-da9a867a2db4","Type":"ContainerDied","Data":"d05da0446a9ba507831161d869ce384dbcbd26e02423ca54916694bf90eca97b"} Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.490252 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51cbbbd2-2995-436e-bc62-da9a867a2db4","Type":"ContainerDied","Data":"3ba05def4f11d46a0bcd40dccd7d60ecff260d1234ab9b338a3e8ca74c12a5c5"} Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.490273 4975 scope.go:117] "RemoveContainer" containerID="b2a95c4619c0a6f8d66ab0a94214574b9ba89530c8e4a569a274fbe5be88e1eb" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.490221 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.521697 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.526658 4975 scope.go:117] "RemoveContainer" containerID="d1f030d91c85444f92a522a15d28c5a82eb0dcac918ab12f7da501804d6e1e18" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.551013 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.581805 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:32 crc kubenswrapper[4975]: E0318 12:35:32.582315 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7" containerName="horizon-log" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.582342 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7" containerName="horizon-log" Mar 18 12:35:32 crc kubenswrapper[4975]: E0318 12:35:32.582354 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7" containerName="horizon" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.582374 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7" containerName="horizon" Mar 18 12:35:32 crc kubenswrapper[4975]: E0318 12:35:32.582396 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51cbbbd2-2995-436e-bc62-da9a867a2db4" containerName="ceilometer-central-agent" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.582406 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="51cbbbd2-2995-436e-bc62-da9a867a2db4" containerName="ceilometer-central-agent" Mar 18 12:35:32 crc kubenswrapper[4975]: E0318 12:35:32.582425 4975 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f967c09a-49f5-4a4c-a1a9-7bb2da157132" containerName="placement-api" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.582433 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f967c09a-49f5-4a4c-a1a9-7bb2da157132" containerName="placement-api" Mar 18 12:35:32 crc kubenswrapper[4975]: E0318 12:35:32.582450 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51cbbbd2-2995-436e-bc62-da9a867a2db4" containerName="ceilometer-notification-agent" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.582460 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="51cbbbd2-2995-436e-bc62-da9a867a2db4" containerName="ceilometer-notification-agent" Mar 18 12:35:32 crc kubenswrapper[4975]: E0318 12:35:32.582472 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51cbbbd2-2995-436e-bc62-da9a867a2db4" containerName="proxy-httpd" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.582480 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="51cbbbd2-2995-436e-bc62-da9a867a2db4" containerName="proxy-httpd" Mar 18 12:35:32 crc kubenswrapper[4975]: E0318 12:35:32.582500 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51cbbbd2-2995-436e-bc62-da9a867a2db4" containerName="sg-core" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.582508 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="51cbbbd2-2995-436e-bc62-da9a867a2db4" containerName="sg-core" Mar 18 12:35:32 crc kubenswrapper[4975]: E0318 12:35:32.582525 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f967c09a-49f5-4a4c-a1a9-7bb2da157132" containerName="placement-log" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.582533 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f967c09a-49f5-4a4c-a1a9-7bb2da157132" containerName="placement-log" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.582755 4975 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7" containerName="horizon-log" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.582773 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="f967c09a-49f5-4a4c-a1a9-7bb2da157132" containerName="placement-log" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.582787 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="51cbbbd2-2995-436e-bc62-da9a867a2db4" containerName="ceilometer-notification-agent" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.582794 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff4782d5-d7a3-47ee-af16-ccef8ed4cdd7" containerName="horizon" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.582805 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="51cbbbd2-2995-436e-bc62-da9a867a2db4" containerName="sg-core" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.582814 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="51cbbbd2-2995-436e-bc62-da9a867a2db4" containerName="ceilometer-central-agent" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.582826 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="f967c09a-49f5-4a4c-a1a9-7bb2da157132" containerName="placement-api" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.582835 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="51cbbbd2-2995-436e-bc62-da9a867a2db4" containerName="proxy-httpd" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.584403 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.586520 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.587213 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.596331 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.598976 4975 scope.go:117] "RemoveContainer" containerID="9e42adc1d7526260d3e1f5be111a614b71c9b47d9bbf36e86d0cdb8283513e0f" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.627465 4975 scope.go:117] "RemoveContainer" containerID="d05da0446a9ba507831161d869ce384dbcbd26e02423ca54916694bf90eca97b" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.644707 4975 scope.go:117] "RemoveContainer" containerID="b2a95c4619c0a6f8d66ab0a94214574b9ba89530c8e4a569a274fbe5be88e1eb" Mar 18 12:35:32 crc kubenswrapper[4975]: E0318 12:35:32.645124 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2a95c4619c0a6f8d66ab0a94214574b9ba89530c8e4a569a274fbe5be88e1eb\": container with ID starting with b2a95c4619c0a6f8d66ab0a94214574b9ba89530c8e4a569a274fbe5be88e1eb not found: ID does not exist" containerID="b2a95c4619c0a6f8d66ab0a94214574b9ba89530c8e4a569a274fbe5be88e1eb" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.645153 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2a95c4619c0a6f8d66ab0a94214574b9ba89530c8e4a569a274fbe5be88e1eb"} err="failed to get container status \"b2a95c4619c0a6f8d66ab0a94214574b9ba89530c8e4a569a274fbe5be88e1eb\": rpc error: code = NotFound desc = could not find container \"b2a95c4619c0a6f8d66ab0a94214574b9ba89530c8e4a569a274fbe5be88e1eb\": 
container with ID starting with b2a95c4619c0a6f8d66ab0a94214574b9ba89530c8e4a569a274fbe5be88e1eb not found: ID does not exist" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.645174 4975 scope.go:117] "RemoveContainer" containerID="d1f030d91c85444f92a522a15d28c5a82eb0dcac918ab12f7da501804d6e1e18" Mar 18 12:35:32 crc kubenswrapper[4975]: E0318 12:35:32.645485 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1f030d91c85444f92a522a15d28c5a82eb0dcac918ab12f7da501804d6e1e18\": container with ID starting with d1f030d91c85444f92a522a15d28c5a82eb0dcac918ab12f7da501804d6e1e18 not found: ID does not exist" containerID="d1f030d91c85444f92a522a15d28c5a82eb0dcac918ab12f7da501804d6e1e18" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.645504 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1f030d91c85444f92a522a15d28c5a82eb0dcac918ab12f7da501804d6e1e18"} err="failed to get container status \"d1f030d91c85444f92a522a15d28c5a82eb0dcac918ab12f7da501804d6e1e18\": rpc error: code = NotFound desc = could not find container \"d1f030d91c85444f92a522a15d28c5a82eb0dcac918ab12f7da501804d6e1e18\": container with ID starting with d1f030d91c85444f92a522a15d28c5a82eb0dcac918ab12f7da501804d6e1e18 not found: ID does not exist" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.645515 4975 scope.go:117] "RemoveContainer" containerID="9e42adc1d7526260d3e1f5be111a614b71c9b47d9bbf36e86d0cdb8283513e0f" Mar 18 12:35:32 crc kubenswrapper[4975]: E0318 12:35:32.645685 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e42adc1d7526260d3e1f5be111a614b71c9b47d9bbf36e86d0cdb8283513e0f\": container with ID starting with 9e42adc1d7526260d3e1f5be111a614b71c9b47d9bbf36e86d0cdb8283513e0f not found: ID does not exist" 
containerID="9e42adc1d7526260d3e1f5be111a614b71c9b47d9bbf36e86d0cdb8283513e0f" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.645701 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e42adc1d7526260d3e1f5be111a614b71c9b47d9bbf36e86d0cdb8283513e0f"} err="failed to get container status \"9e42adc1d7526260d3e1f5be111a614b71c9b47d9bbf36e86d0cdb8283513e0f\": rpc error: code = NotFound desc = could not find container \"9e42adc1d7526260d3e1f5be111a614b71c9b47d9bbf36e86d0cdb8283513e0f\": container with ID starting with 9e42adc1d7526260d3e1f5be111a614b71c9b47d9bbf36e86d0cdb8283513e0f not found: ID does not exist" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.645714 4975 scope.go:117] "RemoveContainer" containerID="d05da0446a9ba507831161d869ce384dbcbd26e02423ca54916694bf90eca97b" Mar 18 12:35:32 crc kubenswrapper[4975]: E0318 12:35:32.645956 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d05da0446a9ba507831161d869ce384dbcbd26e02423ca54916694bf90eca97b\": container with ID starting with d05da0446a9ba507831161d869ce384dbcbd26e02423ca54916694bf90eca97b not found: ID does not exist" containerID="d05da0446a9ba507831161d869ce384dbcbd26e02423ca54916694bf90eca97b" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.645975 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d05da0446a9ba507831161d869ce384dbcbd26e02423ca54916694bf90eca97b"} err="failed to get container status \"d05da0446a9ba507831161d869ce384dbcbd26e02423ca54916694bf90eca97b\": rpc error: code = NotFound desc = could not find container \"d05da0446a9ba507831161d869ce384dbcbd26e02423ca54916694bf90eca97b\": container with ID starting with d05da0446a9ba507831161d869ce384dbcbd26e02423ca54916694bf90eca97b not found: ID does not exist" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.648262 4975 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-config-data\") pod \"ceilometer-0\" (UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " pod="openstack/ceilometer-0" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.648363 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-scripts\") pod \"ceilometer-0\" (UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " pod="openstack/ceilometer-0" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.648397 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " pod="openstack/ceilometer-0" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.648439 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-run-httpd\") pod \"ceilometer-0\" (UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " pod="openstack/ceilometer-0" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.648575 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-log-httpd\") pod \"ceilometer-0\" (UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " pod="openstack/ceilometer-0" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.648767 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvxmc\" (UniqueName: 
\"kubernetes.io/projected/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-kube-api-access-vvxmc\") pod \"ceilometer-0\" (UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " pod="openstack/ceilometer-0" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.648821 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " pod="openstack/ceilometer-0" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.750121 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-log-httpd\") pod \"ceilometer-0\" (UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " pod="openstack/ceilometer-0" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.750220 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvxmc\" (UniqueName: \"kubernetes.io/projected/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-kube-api-access-vvxmc\") pod \"ceilometer-0\" (UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " pod="openstack/ceilometer-0" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.750252 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " pod="openstack/ceilometer-0" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.750284 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-config-data\") pod \"ceilometer-0\" (UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " pod="openstack/ceilometer-0" Mar 18 
12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.750333 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-scripts\") pod \"ceilometer-0\" (UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " pod="openstack/ceilometer-0" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.750356 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " pod="openstack/ceilometer-0" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.750380 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-run-httpd\") pod \"ceilometer-0\" (UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " pod="openstack/ceilometer-0" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.750712 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-log-httpd\") pod \"ceilometer-0\" (UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " pod="openstack/ceilometer-0" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.750904 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-run-httpd\") pod \"ceilometer-0\" (UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " pod="openstack/ceilometer-0" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.757196 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-combined-ca-bundle\") pod \"ceilometer-0\" 
(UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " pod="openstack/ceilometer-0" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.757199 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-scripts\") pod \"ceilometer-0\" (UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " pod="openstack/ceilometer-0" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.757504 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " pod="openstack/ceilometer-0" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.757625 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-config-data\") pod \"ceilometer-0\" (UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " pod="openstack/ceilometer-0" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.772004 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvxmc\" (UniqueName: \"kubernetes.io/projected/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-kube-api-access-vvxmc\") pod \"ceilometer-0\" (UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " pod="openstack/ceilometer-0" Mar 18 12:35:32 crc kubenswrapper[4975]: I0318 12:35:32.909409 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:35:33 crc kubenswrapper[4975]: I0318 12:35:33.049327 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51cbbbd2-2995-436e-bc62-da9a867a2db4" path="/var/lib/kubelet/pods/51cbbbd2-2995-436e-bc62-da9a867a2db4/volumes" Mar 18 12:35:33 crc kubenswrapper[4975]: I0318 12:35:33.316240 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:33 crc kubenswrapper[4975]: I0318 12:35:33.392089 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:33 crc kubenswrapper[4975]: W0318 12:35:33.396732 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcb42c5b_4674_43ba_9eb8_7654af2ea64c.slice/crio-a8d0fdee4646e761b91276b0adde5b087b1e818d31c6965a9652d41bf715c597 WatchSource:0}: Error finding container a8d0fdee4646e761b91276b0adde5b087b1e818d31c6965a9652d41bf715c597: Status 404 returned error can't find the container with id a8d0fdee4646e761b91276b0adde5b087b1e818d31c6965a9652d41bf715c597 Mar 18 12:35:33 crc kubenswrapper[4975]: I0318 12:35:33.507110 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcb42c5b-4674-43ba-9eb8-7654af2ea64c","Type":"ContainerStarted","Data":"a8d0fdee4646e761b91276b0adde5b087b1e818d31c6965a9652d41bf715c597"} Mar 18 12:35:34 crc kubenswrapper[4975]: I0318 12:35:34.520675 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcb42c5b-4674-43ba-9eb8-7654af2ea64c","Type":"ContainerStarted","Data":"de1fed08432296d3a3d5ae1524101f787c61c99553bf09a033fffd17ecab314a"} Mar 18 12:35:35 crc kubenswrapper[4975]: I0318 12:35:35.532252 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"bcb42c5b-4674-43ba-9eb8-7654af2ea64c","Type":"ContainerStarted","Data":"24401754f27fca3d796510891c2117556fbf66d2d506fc8753666be826bafefe"} Mar 18 12:35:35 crc kubenswrapper[4975]: I0318 12:35:35.652618 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:35:35 crc kubenswrapper[4975]: I0318 12:35:35.653493 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="71ec6dfe-9c62-4027-a59f-fc13c24dd809" containerName="glance-httpd" containerID="cri-o://1ff4cbf09d61c6c3d56720ba50a74563fa720913cbaf662951ca4c4549617c84" gracePeriod=30 Mar 18 12:35:35 crc kubenswrapper[4975]: I0318 12:35:35.653497 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="71ec6dfe-9c62-4027-a59f-fc13c24dd809" containerName="glance-log" containerID="cri-o://547ebb2f0b54f4570c9dd75ee0eb6d6bbd3546bb0b427501074ea908ea63763f" gracePeriod=30 Mar 18 12:35:36 crc kubenswrapper[4975]: I0318 12:35:36.557510 4975 generic.go:334] "Generic (PLEG): container finished" podID="71ec6dfe-9c62-4027-a59f-fc13c24dd809" containerID="547ebb2f0b54f4570c9dd75ee0eb6d6bbd3546bb0b427501074ea908ea63763f" exitCode=143 Mar 18 12:35:36 crc kubenswrapper[4975]: I0318 12:35:36.557646 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"71ec6dfe-9c62-4027-a59f-fc13c24dd809","Type":"ContainerDied","Data":"547ebb2f0b54f4570c9dd75ee0eb6d6bbd3546bb0b427501074ea908ea63763f"} Mar 18 12:35:36 crc kubenswrapper[4975]: I0318 12:35:36.562261 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcb42c5b-4674-43ba-9eb8-7654af2ea64c","Type":"ContainerStarted","Data":"6929d2abe0cf2fc1b67efe16d4d513ea7774bb800d5fd06c7c71ce00c66e8d6c"} Mar 18 12:35:36 crc kubenswrapper[4975]: I0318 12:35:36.853908 4975 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:35:36 crc kubenswrapper[4975]: I0318 12:35:36.854500 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cab6fd53-f170-4c86-b5eb-3590e593077e" containerName="glance-log" containerID="cri-o://80b75050c0f0e18d1719cd234447641097a01bc47df078532bbb63139fc9fa89" gracePeriod=30 Mar 18 12:35:36 crc kubenswrapper[4975]: I0318 12:35:36.854569 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cab6fd53-f170-4c86-b5eb-3590e593077e" containerName="glance-httpd" containerID="cri-o://89355303cae12366ce92616e4a15c650c55c9e68f847c6ad92d3f445b2cb1d23" gracePeriod=30 Mar 18 12:35:37 crc kubenswrapper[4975]: I0318 12:35:37.572586 4975 generic.go:334] "Generic (PLEG): container finished" podID="cab6fd53-f170-4c86-b5eb-3590e593077e" containerID="80b75050c0f0e18d1719cd234447641097a01bc47df078532bbb63139fc9fa89" exitCode=143 Mar 18 12:35:37 crc kubenswrapper[4975]: I0318 12:35:37.572638 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cab6fd53-f170-4c86-b5eb-3590e593077e","Type":"ContainerDied","Data":"80b75050c0f0e18d1719cd234447641097a01bc47df078532bbb63139fc9fa89"} Mar 18 12:35:38 crc kubenswrapper[4975]: I0318 12:35:38.588248 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcb42c5b-4674-43ba-9eb8-7654af2ea64c","Type":"ContainerStarted","Data":"8a5765a43db10d16f434d26357c17b55b296c0d200bf79d49c47e4f3d036a30d"} Mar 18 12:35:38 crc kubenswrapper[4975]: I0318 12:35:38.588644 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bcb42c5b-4674-43ba-9eb8-7654af2ea64c" containerName="ceilometer-central-agent" 
containerID="cri-o://de1fed08432296d3a3d5ae1524101f787c61c99553bf09a033fffd17ecab314a" gracePeriod=30 Mar 18 12:35:38 crc kubenswrapper[4975]: I0318 12:35:38.588996 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 12:35:38 crc kubenswrapper[4975]: I0318 12:35:38.589029 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bcb42c5b-4674-43ba-9eb8-7654af2ea64c" containerName="proxy-httpd" containerID="cri-o://8a5765a43db10d16f434d26357c17b55b296c0d200bf79d49c47e4f3d036a30d" gracePeriod=30 Mar 18 12:35:38 crc kubenswrapper[4975]: I0318 12:35:38.589082 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bcb42c5b-4674-43ba-9eb8-7654af2ea64c" containerName="sg-core" containerID="cri-o://6929d2abe0cf2fc1b67efe16d4d513ea7774bb800d5fd06c7c71ce00c66e8d6c" gracePeriod=30 Mar 18 12:35:38 crc kubenswrapper[4975]: I0318 12:35:38.589132 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bcb42c5b-4674-43ba-9eb8-7654af2ea64c" containerName="ceilometer-notification-agent" containerID="cri-o://24401754f27fca3d796510891c2117556fbf66d2d506fc8753666be826bafefe" gracePeriod=30 Mar 18 12:35:38 crc kubenswrapper[4975]: I0318 12:35:38.614056 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.250848103 podStartE2EDuration="6.614030414s" podCreationTimestamp="2026-03-18 12:35:32 +0000 UTC" firstStartedPulling="2026-03-18 12:35:33.400616247 +0000 UTC m=+1519.115016826" lastFinishedPulling="2026-03-18 12:35:37.763798558 +0000 UTC m=+1523.478199137" observedRunningTime="2026-03-18 12:35:38.606933429 +0000 UTC m=+1524.321334018" watchObservedRunningTime="2026-03-18 12:35:38.614030414 +0000 UTC m=+1524.328430993" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.252243 4975 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.393775 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.393842 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71ec6dfe-9c62-4027-a59f-fc13c24dd809-logs\") pod \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.393876 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/71ec6dfe-9c62-4027-a59f-fc13c24dd809-httpd-run\") pod \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.393926 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71ec6dfe-9c62-4027-a59f-fc13c24dd809-combined-ca-bundle\") pod \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.393948 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn9rm\" (UniqueName: \"kubernetes.io/projected/71ec6dfe-9c62-4027-a59f-fc13c24dd809-kube-api-access-pn9rm\") pod \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.393971 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/71ec6dfe-9c62-4027-a59f-fc13c24dd809-public-tls-certs\") pod \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.394218 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71ec6dfe-9c62-4027-a59f-fc13c24dd809-scripts\") pod \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.394267 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71ec6dfe-9c62-4027-a59f-fc13c24dd809-config-data\") pod \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\" (UID: \"71ec6dfe-9c62-4027-a59f-fc13c24dd809\") " Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.395798 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71ec6dfe-9c62-4027-a59f-fc13c24dd809-logs" (OuterVolumeSpecName: "logs") pod "71ec6dfe-9c62-4027-a59f-fc13c24dd809" (UID: "71ec6dfe-9c62-4027-a59f-fc13c24dd809"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.396332 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71ec6dfe-9c62-4027-a59f-fc13c24dd809-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "71ec6dfe-9c62-4027-a59f-fc13c24dd809" (UID: "71ec6dfe-9c62-4027-a59f-fc13c24dd809"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.403132 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "71ec6dfe-9c62-4027-a59f-fc13c24dd809" (UID: "71ec6dfe-9c62-4027-a59f-fc13c24dd809"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.403193 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71ec6dfe-9c62-4027-a59f-fc13c24dd809-scripts" (OuterVolumeSpecName: "scripts") pod "71ec6dfe-9c62-4027-a59f-fc13c24dd809" (UID: "71ec6dfe-9c62-4027-a59f-fc13c24dd809"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.403381 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71ec6dfe-9c62-4027-a59f-fc13c24dd809-kube-api-access-pn9rm" (OuterVolumeSpecName: "kube-api-access-pn9rm") pod "71ec6dfe-9c62-4027-a59f-fc13c24dd809" (UID: "71ec6dfe-9c62-4027-a59f-fc13c24dd809"). InnerVolumeSpecName "kube-api-access-pn9rm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.422271 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71ec6dfe-9c62-4027-a59f-fc13c24dd809-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71ec6dfe-9c62-4027-a59f-fc13c24dd809" (UID: "71ec6dfe-9c62-4027-a59f-fc13c24dd809"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.440614 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71ec6dfe-9c62-4027-a59f-fc13c24dd809-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "71ec6dfe-9c62-4027-a59f-fc13c24dd809" (UID: "71ec6dfe-9c62-4027-a59f-fc13c24dd809"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.472959 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71ec6dfe-9c62-4027-a59f-fc13c24dd809-config-data" (OuterVolumeSpecName: "config-data") pod "71ec6dfe-9c62-4027-a59f-fc13c24dd809" (UID: "71ec6dfe-9c62-4027-a59f-fc13c24dd809"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.496172 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71ec6dfe-9c62-4027-a59f-fc13c24dd809-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.496242 4975 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.496258 4975 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71ec6dfe-9c62-4027-a59f-fc13c24dd809-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.496271 4975 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/71ec6dfe-9c62-4027-a59f-fc13c24dd809-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 
12:35:39.496286 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71ec6dfe-9c62-4027-a59f-fc13c24dd809-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.496297 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn9rm\" (UniqueName: \"kubernetes.io/projected/71ec6dfe-9c62-4027-a59f-fc13c24dd809-kube-api-access-pn9rm\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.496308 4975 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/71ec6dfe-9c62-4027-a59f-fc13c24dd809-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.496318 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71ec6dfe-9c62-4027-a59f-fc13c24dd809-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.519680 4975 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.597386 4975 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.599772 4975 generic.go:334] "Generic (PLEG): container finished" podID="71ec6dfe-9c62-4027-a59f-fc13c24dd809" containerID="1ff4cbf09d61c6c3d56720ba50a74563fa720913cbaf662951ca4c4549617c84" exitCode=0 Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.599841 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"71ec6dfe-9c62-4027-a59f-fc13c24dd809","Type":"ContainerDied","Data":"1ff4cbf09d61c6c3d56720ba50a74563fa720913cbaf662951ca4c4549617c84"} Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.599893 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"71ec6dfe-9c62-4027-a59f-fc13c24dd809","Type":"ContainerDied","Data":"99c3e047426f11ac9e988dcaa9e990f1cfea4b9eeccbe65ffe5de614d6a5a671"} Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.599915 4975 scope.go:117] "RemoveContainer" containerID="1ff4cbf09d61c6c3d56720ba50a74563fa720913cbaf662951ca4c4549617c84" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.600032 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.616416 4975 generic.go:334] "Generic (PLEG): container finished" podID="bcb42c5b-4674-43ba-9eb8-7654af2ea64c" containerID="8a5765a43db10d16f434d26357c17b55b296c0d200bf79d49c47e4f3d036a30d" exitCode=0 Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.616475 4975 generic.go:334] "Generic (PLEG): container finished" podID="bcb42c5b-4674-43ba-9eb8-7654af2ea64c" containerID="6929d2abe0cf2fc1b67efe16d4d513ea7774bb800d5fd06c7c71ce00c66e8d6c" exitCode=2 Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.616487 4975 generic.go:334] "Generic (PLEG): container finished" podID="bcb42c5b-4674-43ba-9eb8-7654af2ea64c" containerID="24401754f27fca3d796510891c2117556fbf66d2d506fc8753666be826bafefe" exitCode=0 Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.616532 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcb42c5b-4674-43ba-9eb8-7654af2ea64c","Type":"ContainerDied","Data":"8a5765a43db10d16f434d26357c17b55b296c0d200bf79d49c47e4f3d036a30d"} Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.616567 4975 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"bcb42c5b-4674-43ba-9eb8-7654af2ea64c","Type":"ContainerDied","Data":"6929d2abe0cf2fc1b67efe16d4d513ea7774bb800d5fd06c7c71ce00c66e8d6c"} Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.616583 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcb42c5b-4674-43ba-9eb8-7654af2ea64c","Type":"ContainerDied","Data":"24401754f27fca3d796510891c2117556fbf66d2d506fc8753666be826bafefe"} Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.638581 4975 scope.go:117] "RemoveContainer" containerID="547ebb2f0b54f4570c9dd75ee0eb6d6bbd3546bb0b427501074ea908ea63763f" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.657016 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.668344 4975 scope.go:117] "RemoveContainer" containerID="1ff4cbf09d61c6c3d56720ba50a74563fa720913cbaf662951ca4c4549617c84" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.668933 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:35:39 crc kubenswrapper[4975]: E0318 12:35:39.669398 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ff4cbf09d61c6c3d56720ba50a74563fa720913cbaf662951ca4c4549617c84\": container with ID starting with 1ff4cbf09d61c6c3d56720ba50a74563fa720913cbaf662951ca4c4549617c84 not found: ID does not exist" containerID="1ff4cbf09d61c6c3d56720ba50a74563fa720913cbaf662951ca4c4549617c84" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.669436 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ff4cbf09d61c6c3d56720ba50a74563fa720913cbaf662951ca4c4549617c84"} err="failed to get container status \"1ff4cbf09d61c6c3d56720ba50a74563fa720913cbaf662951ca4c4549617c84\": rpc error: code = 
NotFound desc = could not find container \"1ff4cbf09d61c6c3d56720ba50a74563fa720913cbaf662951ca4c4549617c84\": container with ID starting with 1ff4cbf09d61c6c3d56720ba50a74563fa720913cbaf662951ca4c4549617c84 not found: ID does not exist" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.669456 4975 scope.go:117] "RemoveContainer" containerID="547ebb2f0b54f4570c9dd75ee0eb6d6bbd3546bb0b427501074ea908ea63763f" Mar 18 12:35:39 crc kubenswrapper[4975]: E0318 12:35:39.669666 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"547ebb2f0b54f4570c9dd75ee0eb6d6bbd3546bb0b427501074ea908ea63763f\": container with ID starting with 547ebb2f0b54f4570c9dd75ee0eb6d6bbd3546bb0b427501074ea908ea63763f not found: ID does not exist" containerID="547ebb2f0b54f4570c9dd75ee0eb6d6bbd3546bb0b427501074ea908ea63763f" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.669708 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"547ebb2f0b54f4570c9dd75ee0eb6d6bbd3546bb0b427501074ea908ea63763f"} err="failed to get container status \"547ebb2f0b54f4570c9dd75ee0eb6d6bbd3546bb0b427501074ea908ea63763f\": rpc error: code = NotFound desc = could not find container \"547ebb2f0b54f4570c9dd75ee0eb6d6bbd3546bb0b427501074ea908ea63763f\": container with ID starting with 547ebb2f0b54f4570c9dd75ee0eb6d6bbd3546bb0b427501074ea908ea63763f not found: ID does not exist" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.685479 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:35:39 crc kubenswrapper[4975]: E0318 12:35:39.685851 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ec6dfe-9c62-4027-a59f-fc13c24dd809" containerName="glance-httpd" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.685943 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ec6dfe-9c62-4027-a59f-fc13c24dd809" 
containerName="glance-httpd" Mar 18 12:35:39 crc kubenswrapper[4975]: E0318 12:35:39.685999 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ec6dfe-9c62-4027-a59f-fc13c24dd809" containerName="glance-log" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.686007 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ec6dfe-9c62-4027-a59f-fc13c24dd809" containerName="glance-log" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.686222 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ec6dfe-9c62-4027-a59f-fc13c24dd809" containerName="glance-httpd" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.686245 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ec6dfe-9c62-4027-a59f-fc13c24dd809" containerName="glance-log" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.689957 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.693911 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.703696 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.706209 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.805131 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"87bb4a85-e699-429d-b354-f9e75d5eb9de\") " pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.805180 4975 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/87bb4a85-e699-429d-b354-f9e75d5eb9de-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"87bb4a85-e699-429d-b354-f9e75d5eb9de\") " pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.805201 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87bb4a85-e699-429d-b354-f9e75d5eb9de-config-data\") pod \"glance-default-external-api-0\" (UID: \"87bb4a85-e699-429d-b354-f9e75d5eb9de\") " pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.805217 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87bb4a85-e699-429d-b354-f9e75d5eb9de-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"87bb4a85-e699-429d-b354-f9e75d5eb9de\") " pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.805233 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf9fv\" (UniqueName: \"kubernetes.io/projected/87bb4a85-e699-429d-b354-f9e75d5eb9de-kube-api-access-pf9fv\") pod \"glance-default-external-api-0\" (UID: \"87bb4a85-e699-429d-b354-f9e75d5eb9de\") " pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.805324 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/87bb4a85-e699-429d-b354-f9e75d5eb9de-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"87bb4a85-e699-429d-b354-f9e75d5eb9de\") " pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: 
I0318 12:35:39.805364 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87bb4a85-e699-429d-b354-f9e75d5eb9de-logs\") pod \"glance-default-external-api-0\" (UID: \"87bb4a85-e699-429d-b354-f9e75d5eb9de\") " pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.805387 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87bb4a85-e699-429d-b354-f9e75d5eb9de-scripts\") pod \"glance-default-external-api-0\" (UID: \"87bb4a85-e699-429d-b354-f9e75d5eb9de\") " pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.907351 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/87bb4a85-e699-429d-b354-f9e75d5eb9de-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"87bb4a85-e699-429d-b354-f9e75d5eb9de\") " pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.907415 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87bb4a85-e699-429d-b354-f9e75d5eb9de-logs\") pod \"glance-default-external-api-0\" (UID: \"87bb4a85-e699-429d-b354-f9e75d5eb9de\") " pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.907442 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87bb4a85-e699-429d-b354-f9e75d5eb9de-scripts\") pod \"glance-default-external-api-0\" (UID: \"87bb4a85-e699-429d-b354-f9e75d5eb9de\") " pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.907525 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"87bb4a85-e699-429d-b354-f9e75d5eb9de\") " pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.907554 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/87bb4a85-e699-429d-b354-f9e75d5eb9de-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"87bb4a85-e699-429d-b354-f9e75d5eb9de\") " pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.907573 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87bb4a85-e699-429d-b354-f9e75d5eb9de-config-data\") pod \"glance-default-external-api-0\" (UID: \"87bb4a85-e699-429d-b354-f9e75d5eb9de\") " pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.907591 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87bb4a85-e699-429d-b354-f9e75d5eb9de-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"87bb4a85-e699-429d-b354-f9e75d5eb9de\") " pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.907607 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf9fv\" (UniqueName: \"kubernetes.io/projected/87bb4a85-e699-429d-b354-f9e75d5eb9de-kube-api-access-pf9fv\") pod \"glance-default-external-api-0\" (UID: \"87bb4a85-e699-429d-b354-f9e75d5eb9de\") " pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.908173 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/87bb4a85-e699-429d-b354-f9e75d5eb9de-logs\") pod \"glance-default-external-api-0\" (UID: \"87bb4a85-e699-429d-b354-f9e75d5eb9de\") " pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.908348 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/87bb4a85-e699-429d-b354-f9e75d5eb9de-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"87bb4a85-e699-429d-b354-f9e75d5eb9de\") " pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.908490 4975 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"87bb4a85-e699-429d-b354-f9e75d5eb9de\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.911754 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87bb4a85-e699-429d-b354-f9e75d5eb9de-scripts\") pod \"glance-default-external-api-0\" (UID: \"87bb4a85-e699-429d-b354-f9e75d5eb9de\") " pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.912041 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87bb4a85-e699-429d-b354-f9e75d5eb9de-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"87bb4a85-e699-429d-b354-f9e75d5eb9de\") " pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.913825 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/87bb4a85-e699-429d-b354-f9e75d5eb9de-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"87bb4a85-e699-429d-b354-f9e75d5eb9de\") " pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.916996 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87bb4a85-e699-429d-b354-f9e75d5eb9de-config-data\") pod \"glance-default-external-api-0\" (UID: \"87bb4a85-e699-429d-b354-f9e75d5eb9de\") " pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.933806 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf9fv\" (UniqueName: \"kubernetes.io/projected/87bb4a85-e699-429d-b354-f9e75d5eb9de-kube-api-access-pf9fv\") pod \"glance-default-external-api-0\" (UID: \"87bb4a85-e699-429d-b354-f9e75d5eb9de\") " pod="openstack/glance-default-external-api-0" Mar 18 12:35:39 crc kubenswrapper[4975]: I0318 12:35:39.935196 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"87bb4a85-e699-429d-b354-f9e75d5eb9de\") " pod="openstack/glance-default-external-api-0" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.098000 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.542763 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.621761 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cab6fd53-f170-4c86-b5eb-3590e593077e-scripts\") pod \"cab6fd53-f170-4c86-b5eb-3590e593077e\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.621821 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cab6fd53-f170-4c86-b5eb-3590e593077e-internal-tls-certs\") pod \"cab6fd53-f170-4c86-b5eb-3590e593077e\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.621924 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"cab6fd53-f170-4c86-b5eb-3590e593077e\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.621983 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cab6fd53-f170-4c86-b5eb-3590e593077e-config-data\") pod \"cab6fd53-f170-4c86-b5eb-3590e593077e\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.622038 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cab6fd53-f170-4c86-b5eb-3590e593077e-httpd-run\") pod \"cab6fd53-f170-4c86-b5eb-3590e593077e\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.622062 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cab6fd53-f170-4c86-b5eb-3590e593077e-combined-ca-bundle\") pod \"cab6fd53-f170-4c86-b5eb-3590e593077e\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.622091 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dw7t\" (UniqueName: \"kubernetes.io/projected/cab6fd53-f170-4c86-b5eb-3590e593077e-kube-api-access-7dw7t\") pod \"cab6fd53-f170-4c86-b5eb-3590e593077e\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.622110 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cab6fd53-f170-4c86-b5eb-3590e593077e-logs\") pod \"cab6fd53-f170-4c86-b5eb-3590e593077e\" (UID: \"cab6fd53-f170-4c86-b5eb-3590e593077e\") " Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.622493 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cab6fd53-f170-4c86-b5eb-3590e593077e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cab6fd53-f170-4c86-b5eb-3590e593077e" (UID: "cab6fd53-f170-4c86-b5eb-3590e593077e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.622564 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cab6fd53-f170-4c86-b5eb-3590e593077e-logs" (OuterVolumeSpecName: "logs") pod "cab6fd53-f170-4c86-b5eb-3590e593077e" (UID: "cab6fd53-f170-4c86-b5eb-3590e593077e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.632170 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "cab6fd53-f170-4c86-b5eb-3590e593077e" (UID: "cab6fd53-f170-4c86-b5eb-3590e593077e"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.632231 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cab6fd53-f170-4c86-b5eb-3590e593077e-kube-api-access-7dw7t" (OuterVolumeSpecName: "kube-api-access-7dw7t") pod "cab6fd53-f170-4c86-b5eb-3590e593077e" (UID: "cab6fd53-f170-4c86-b5eb-3590e593077e"). InnerVolumeSpecName "kube-api-access-7dw7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.633942 4975 generic.go:334] "Generic (PLEG): container finished" podID="cab6fd53-f170-4c86-b5eb-3590e593077e" containerID="89355303cae12366ce92616e4a15c650c55c9e68f847c6ad92d3f445b2cb1d23" exitCode=0 Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.634014 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.634036 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cab6fd53-f170-4c86-b5eb-3590e593077e","Type":"ContainerDied","Data":"89355303cae12366ce92616e4a15c650c55c9e68f847c6ad92d3f445b2cb1d23"} Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.634305 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cab6fd53-f170-4c86-b5eb-3590e593077e","Type":"ContainerDied","Data":"cbba75c95ef2c8b19267cb12a50564144e1b272035837c9a7bfefb493b93977d"} Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.634327 4975 scope.go:117] "RemoveContainer" containerID="89355303cae12366ce92616e4a15c650c55c9e68f847c6ad92d3f445b2cb1d23" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.634857 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cab6fd53-f170-4c86-b5eb-3590e593077e-scripts" (OuterVolumeSpecName: "scripts") pod "cab6fd53-f170-4c86-b5eb-3590e593077e" (UID: "cab6fd53-f170-4c86-b5eb-3590e593077e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.685039 4975 scope.go:117] "RemoveContainer" containerID="80b75050c0f0e18d1719cd234447641097a01bc47df078532bbb63139fc9fa89" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.706217 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cab6fd53-f170-4c86-b5eb-3590e593077e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cab6fd53-f170-4c86-b5eb-3590e593077e" (UID: "cab6fd53-f170-4c86-b5eb-3590e593077e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.710342 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cab6fd53-f170-4c86-b5eb-3590e593077e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cab6fd53-f170-4c86-b5eb-3590e593077e" (UID: "cab6fd53-f170-4c86-b5eb-3590e593077e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.712028 4975 scope.go:117] "RemoveContainer" containerID="89355303cae12366ce92616e4a15c650c55c9e68f847c6ad92d3f445b2cb1d23" Mar 18 12:35:40 crc kubenswrapper[4975]: E0318 12:35:40.712598 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89355303cae12366ce92616e4a15c650c55c9e68f847c6ad92d3f445b2cb1d23\": container with ID starting with 89355303cae12366ce92616e4a15c650c55c9e68f847c6ad92d3f445b2cb1d23 not found: ID does not exist" containerID="89355303cae12366ce92616e4a15c650c55c9e68f847c6ad92d3f445b2cb1d23" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.712647 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89355303cae12366ce92616e4a15c650c55c9e68f847c6ad92d3f445b2cb1d23"} err="failed to get container status \"89355303cae12366ce92616e4a15c650c55c9e68f847c6ad92d3f445b2cb1d23\": rpc error: code = NotFound desc = could not find container \"89355303cae12366ce92616e4a15c650c55c9e68f847c6ad92d3f445b2cb1d23\": container with ID starting with 89355303cae12366ce92616e4a15c650c55c9e68f847c6ad92d3f445b2cb1d23 not found: ID does not exist" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.712687 4975 scope.go:117] "RemoveContainer" containerID="80b75050c0f0e18d1719cd234447641097a01bc47df078532bbb63139fc9fa89" Mar 18 12:35:40 crc kubenswrapper[4975]: E0318 12:35:40.715329 4975 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80b75050c0f0e18d1719cd234447641097a01bc47df078532bbb63139fc9fa89\": container with ID starting with 80b75050c0f0e18d1719cd234447641097a01bc47df078532bbb63139fc9fa89 not found: ID does not exist" containerID="80b75050c0f0e18d1719cd234447641097a01bc47df078532bbb63139fc9fa89" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.715371 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80b75050c0f0e18d1719cd234447641097a01bc47df078532bbb63139fc9fa89"} err="failed to get container status \"80b75050c0f0e18d1719cd234447641097a01bc47df078532bbb63139fc9fa89\": rpc error: code = NotFound desc = could not find container \"80b75050c0f0e18d1719cd234447641097a01bc47df078532bbb63139fc9fa89\": container with ID starting with 80b75050c0f0e18d1719cd234447641097a01bc47df078532bbb63139fc9fa89 not found: ID does not exist" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.724517 4975 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cab6fd53-f170-4c86-b5eb-3590e593077e-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.724548 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cab6fd53-f170-4c86-b5eb-3590e593077e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.724559 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dw7t\" (UniqueName: \"kubernetes.io/projected/cab6fd53-f170-4c86-b5eb-3590e593077e-kube-api-access-7dw7t\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.724567 4975 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/cab6fd53-f170-4c86-b5eb-3590e593077e-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.724575 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cab6fd53-f170-4c86-b5eb-3590e593077e-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.724582 4975 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cab6fd53-f170-4c86-b5eb-3590e593077e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.724610 4975 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.741779 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cab6fd53-f170-4c86-b5eb-3590e593077e-config-data" (OuterVolumeSpecName: "config-data") pod "cab6fd53-f170-4c86-b5eb-3590e593077e" (UID: "cab6fd53-f170-4c86-b5eb-3590e593077e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.758369 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.764021 4975 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.826700 4975 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.826742 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cab6fd53-f170-4c86-b5eb-3590e593077e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.968036 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.976475 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.986783 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:35:40 crc kubenswrapper[4975]: E0318 12:35:40.987342 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cab6fd53-f170-4c86-b5eb-3590e593077e" containerName="glance-log" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.987359 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="cab6fd53-f170-4c86-b5eb-3590e593077e" containerName="glance-log" Mar 18 12:35:40 crc kubenswrapper[4975]: E0318 12:35:40.987388 4975 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cab6fd53-f170-4c86-b5eb-3590e593077e" containerName="glance-httpd" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.987395 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="cab6fd53-f170-4c86-b5eb-3590e593077e" containerName="glance-httpd" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.987556 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="cab6fd53-f170-4c86-b5eb-3590e593077e" containerName="glance-httpd" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.987572 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="cab6fd53-f170-4c86-b5eb-3590e593077e" containerName="glance-log" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.988464 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.990917 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 18 12:35:40 crc kubenswrapper[4975]: I0318 12:35:40.991219 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.039357 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71ec6dfe-9c62-4027-a59f-fc13c24dd809" path="/var/lib/kubelet/pods/71ec6dfe-9c62-4027-a59f-fc13c24dd809/volumes" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.040328 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cab6fd53-f170-4c86-b5eb-3590e593077e" path="/var/lib/kubelet/pods/cab6fd53-f170-4c86-b5eb-3590e593077e/volumes" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.041040 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.132850 4975 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a3dc59e-97e5-435c-b3c2-286d75774bbc-logs\") pod \"glance-default-internal-api-0\" (UID: \"9a3dc59e-97e5-435c-b3c2-286d75774bbc\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.133027 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a3dc59e-97e5-435c-b3c2-286d75774bbc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9a3dc59e-97e5-435c-b3c2-286d75774bbc\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.133060 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a3dc59e-97e5-435c-b3c2-286d75774bbc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9a3dc59e-97e5-435c-b3c2-286d75774bbc\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.133145 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"9a3dc59e-97e5-435c-b3c2-286d75774bbc\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.133164 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a3dc59e-97e5-435c-b3c2-286d75774bbc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9a3dc59e-97e5-435c-b3c2-286d75774bbc\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.133202 4975 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjd4h\" (UniqueName: \"kubernetes.io/projected/9a3dc59e-97e5-435c-b3c2-286d75774bbc-kube-api-access-cjd4h\") pod \"glance-default-internal-api-0\" (UID: \"9a3dc59e-97e5-435c-b3c2-286d75774bbc\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.133254 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a3dc59e-97e5-435c-b3c2-286d75774bbc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9a3dc59e-97e5-435c-b3c2-286d75774bbc\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.133304 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a3dc59e-97e5-435c-b3c2-286d75774bbc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9a3dc59e-97e5-435c-b3c2-286d75774bbc\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.234918 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a3dc59e-97e5-435c-b3c2-286d75774bbc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9a3dc59e-97e5-435c-b3c2-286d75774bbc\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.234972 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a3dc59e-97e5-435c-b3c2-286d75774bbc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9a3dc59e-97e5-435c-b3c2-286d75774bbc\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.235048 4975 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"9a3dc59e-97e5-435c-b3c2-286d75774bbc\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.235072 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a3dc59e-97e5-435c-b3c2-286d75774bbc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9a3dc59e-97e5-435c-b3c2-286d75774bbc\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.235113 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjd4h\" (UniqueName: \"kubernetes.io/projected/9a3dc59e-97e5-435c-b3c2-286d75774bbc-kube-api-access-cjd4h\") pod \"glance-default-internal-api-0\" (UID: \"9a3dc59e-97e5-435c-b3c2-286d75774bbc\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.235162 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a3dc59e-97e5-435c-b3c2-286d75774bbc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9a3dc59e-97e5-435c-b3c2-286d75774bbc\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.235192 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a3dc59e-97e5-435c-b3c2-286d75774bbc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9a3dc59e-97e5-435c-b3c2-286d75774bbc\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.235256 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a3dc59e-97e5-435c-b3c2-286d75774bbc-logs\") pod \"glance-default-internal-api-0\" (UID: \"9a3dc59e-97e5-435c-b3c2-286d75774bbc\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.235320 4975 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"9a3dc59e-97e5-435c-b3c2-286d75774bbc\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.235716 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a3dc59e-97e5-435c-b3c2-286d75774bbc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9a3dc59e-97e5-435c-b3c2-286d75774bbc\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.236745 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a3dc59e-97e5-435c-b3c2-286d75774bbc-logs\") pod \"glance-default-internal-api-0\" (UID: \"9a3dc59e-97e5-435c-b3c2-286d75774bbc\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.242111 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a3dc59e-97e5-435c-b3c2-286d75774bbc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9a3dc59e-97e5-435c-b3c2-286d75774bbc\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.242332 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9a3dc59e-97e5-435c-b3c2-286d75774bbc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9a3dc59e-97e5-435c-b3c2-286d75774bbc\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.242762 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a3dc59e-97e5-435c-b3c2-286d75774bbc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9a3dc59e-97e5-435c-b3c2-286d75774bbc\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.243258 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a3dc59e-97e5-435c-b3c2-286d75774bbc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9a3dc59e-97e5-435c-b3c2-286d75774bbc\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.260754 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjd4h\" (UniqueName: \"kubernetes.io/projected/9a3dc59e-97e5-435c-b3c2-286d75774bbc-kube-api-access-cjd4h\") pod \"glance-default-internal-api-0\" (UID: \"9a3dc59e-97e5-435c-b3c2-286d75774bbc\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.263988 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"9a3dc59e-97e5-435c-b3c2-286d75774bbc\") " pod="openstack/glance-default-internal-api-0" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.305803 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.677200 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"87bb4a85-e699-429d-b354-f9e75d5eb9de","Type":"ContainerStarted","Data":"2d197f1d8f7f25787f7ac397940f9a4abdaa7689322d77601d585b65e3bf5b84"} Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.677573 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"87bb4a85-e699-429d-b354-f9e75d5eb9de","Type":"ContainerStarted","Data":"818ece0630c1b1f27e9782dd7f95361f623008ab41725ca1477563a37d772382"} Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.679241 4975 generic.go:334] "Generic (PLEG): container finished" podID="67cbe265-b297-4ad8-af53-703b8549d8e6" containerID="2027ac0f879d6bc658160a364d83c9822493eb86828b9ea3300924d8c2837379" exitCode=0 Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.679296 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x74s8" event={"ID":"67cbe265-b297-4ad8-af53-703b8549d8e6","Type":"ContainerDied","Data":"2027ac0f879d6bc658160a364d83c9822493eb86828b9ea3300924d8c2837379"} Mar 18 12:35:41 crc kubenswrapper[4975]: I0318 12:35:41.856550 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 12:35:41 crc kubenswrapper[4975]: W0318 12:35:41.868822 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a3dc59e_97e5_435c_b3c2_286d75774bbc.slice/crio-7bd49444926ddaec6346e8277c8be509d1c31abbc29460e96fe865c87f7f6f0e WatchSource:0}: Error finding container 7bd49444926ddaec6346e8277c8be509d1c31abbc29460e96fe865c87f7f6f0e: Status 404 returned error can't find the container with id 7bd49444926ddaec6346e8277c8be509d1c31abbc29460e96fe865c87f7f6f0e Mar 18 12:35:42 
crc kubenswrapper[4975]: I0318 12:35:42.718625 4975 generic.go:334] "Generic (PLEG): container finished" podID="bcb42c5b-4674-43ba-9eb8-7654af2ea64c" containerID="de1fed08432296d3a3d5ae1524101f787c61c99553bf09a033fffd17ecab314a" exitCode=0 Mar 18 12:35:42 crc kubenswrapper[4975]: I0318 12:35:42.718660 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcb42c5b-4674-43ba-9eb8-7654af2ea64c","Type":"ContainerDied","Data":"de1fed08432296d3a3d5ae1524101f787c61c99553bf09a033fffd17ecab314a"} Mar 18 12:35:42 crc kubenswrapper[4975]: I0318 12:35:42.721778 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9a3dc59e-97e5-435c-b3c2-286d75774bbc","Type":"ContainerStarted","Data":"a724238132f5dc302cd7dd83e3a6f7371d7d3f8036a064e26cf2c39d0eb95505"} Mar 18 12:35:42 crc kubenswrapper[4975]: I0318 12:35:42.721840 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9a3dc59e-97e5-435c-b3c2-286d75774bbc","Type":"ContainerStarted","Data":"7bd49444926ddaec6346e8277c8be509d1c31abbc29460e96fe865c87f7f6f0e"} Mar 18 12:35:42 crc kubenswrapper[4975]: I0318 12:35:42.725008 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"87bb4a85-e699-429d-b354-f9e75d5eb9de","Type":"ContainerStarted","Data":"6d39506facc18c0e8a282cc37071bbd4e13230566e4d88c8e2b3d865386ee331"} Mar 18 12:35:42 crc kubenswrapper[4975]: I0318 12:35:42.748976 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.7489549589999998 podStartE2EDuration="3.748954959s" podCreationTimestamp="2026-03-18 12:35:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:35:42.744306411 +0000 UTC m=+1528.458707000" 
watchObservedRunningTime="2026-03-18 12:35:42.748954959 +0000 UTC m=+1528.463355558" Mar 18 12:35:42 crc kubenswrapper[4975]: I0318 12:35:42.966701 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.076195 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-sg-core-conf-yaml\") pod \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\" (UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.076322 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-run-httpd\") pod \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\" (UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.076385 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-config-data\") pod \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\" (UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.076421 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-combined-ca-bundle\") pod \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\" (UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.076518 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvxmc\" (UniqueName: \"kubernetes.io/projected/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-kube-api-access-vvxmc\") pod \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\" (UID: 
\"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.076560 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-log-httpd\") pod \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\" (UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.076587 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-scripts\") pod \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\" (UID: \"bcb42c5b-4674-43ba-9eb8-7654af2ea64c\") " Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.076785 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bcb42c5b-4674-43ba-9eb8-7654af2ea64c" (UID: "bcb42c5b-4674-43ba-9eb8-7654af2ea64c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.079324 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bcb42c5b-4674-43ba-9eb8-7654af2ea64c" (UID: "bcb42c5b-4674-43ba-9eb8-7654af2ea64c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.082573 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-scripts" (OuterVolumeSpecName: "scripts") pod "bcb42c5b-4674-43ba-9eb8-7654af2ea64c" (UID: "bcb42c5b-4674-43ba-9eb8-7654af2ea64c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.084046 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-kube-api-access-vvxmc" (OuterVolumeSpecName: "kube-api-access-vvxmc") pod "bcb42c5b-4674-43ba-9eb8-7654af2ea64c" (UID: "bcb42c5b-4674-43ba-9eb8-7654af2ea64c"). InnerVolumeSpecName "kube-api-access-vvxmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.119924 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bcb42c5b-4674-43ba-9eb8-7654af2ea64c" (UID: "bcb42c5b-4674-43ba-9eb8-7654af2ea64c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.173077 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bcb42c5b-4674-43ba-9eb8-7654af2ea64c" (UID: "bcb42c5b-4674-43ba-9eb8-7654af2ea64c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.179612 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvxmc\" (UniqueName: \"kubernetes.io/projected/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-kube-api-access-vvxmc\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.179651 4975 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.179666 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.179679 4975 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.179690 4975 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.179701 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.195472 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x74s8" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.207380 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-config-data" (OuterVolumeSpecName: "config-data") pod "bcb42c5b-4674-43ba-9eb8-7654af2ea64c" (UID: "bcb42c5b-4674-43ba-9eb8-7654af2ea64c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.280587 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67cbe265-b297-4ad8-af53-703b8549d8e6-combined-ca-bundle\") pod \"67cbe265-b297-4ad8-af53-703b8549d8e6\" (UID: \"67cbe265-b297-4ad8-af53-703b8549d8e6\") " Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.280677 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67cbe265-b297-4ad8-af53-703b8549d8e6-scripts\") pod \"67cbe265-b297-4ad8-af53-703b8549d8e6\" (UID: \"67cbe265-b297-4ad8-af53-703b8549d8e6\") " Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.280722 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcbdq\" (UniqueName: \"kubernetes.io/projected/67cbe265-b297-4ad8-af53-703b8549d8e6-kube-api-access-zcbdq\") pod \"67cbe265-b297-4ad8-af53-703b8549d8e6\" (UID: \"67cbe265-b297-4ad8-af53-703b8549d8e6\") " Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.280836 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67cbe265-b297-4ad8-af53-703b8549d8e6-config-data\") pod \"67cbe265-b297-4ad8-af53-703b8549d8e6\" (UID: \"67cbe265-b297-4ad8-af53-703b8549d8e6\") " Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.281478 4975 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcb42c5b-4674-43ba-9eb8-7654af2ea64c-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.284696 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67cbe265-b297-4ad8-af53-703b8549d8e6-scripts" (OuterVolumeSpecName: "scripts") pod "67cbe265-b297-4ad8-af53-703b8549d8e6" (UID: "67cbe265-b297-4ad8-af53-703b8549d8e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.285717 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67cbe265-b297-4ad8-af53-703b8549d8e6-kube-api-access-zcbdq" (OuterVolumeSpecName: "kube-api-access-zcbdq") pod "67cbe265-b297-4ad8-af53-703b8549d8e6" (UID: "67cbe265-b297-4ad8-af53-703b8549d8e6"). InnerVolumeSpecName "kube-api-access-zcbdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.304434 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67cbe265-b297-4ad8-af53-703b8549d8e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67cbe265-b297-4ad8-af53-703b8549d8e6" (UID: "67cbe265-b297-4ad8-af53-703b8549d8e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.306470 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67cbe265-b297-4ad8-af53-703b8549d8e6-config-data" (OuterVolumeSpecName: "config-data") pod "67cbe265-b297-4ad8-af53-703b8549d8e6" (UID: "67cbe265-b297-4ad8-af53-703b8549d8e6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.383104 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67cbe265-b297-4ad8-af53-703b8549d8e6-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.383154 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67cbe265-b297-4ad8-af53-703b8549d8e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.383167 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67cbe265-b297-4ad8-af53-703b8549d8e6-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.383177 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcbdq\" (UniqueName: \"kubernetes.io/projected/67cbe265-b297-4ad8-af53-703b8549d8e6-kube-api-access-zcbdq\") on node \"crc\" DevicePath \"\"" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.736417 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9a3dc59e-97e5-435c-b3c2-286d75774bbc","Type":"ContainerStarted","Data":"ca80fd8e70a438efe947f2dbe808d772fc1520a133a811ed1415b63dd5d20965"} Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.737828 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x74s8" event={"ID":"67cbe265-b297-4ad8-af53-703b8549d8e6","Type":"ContainerDied","Data":"ab3f4063895d5968d765012b823cd6187efffb41ed9054da02757c8d68321910"} Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.737850 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x74s8" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.737850 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab3f4063895d5968d765012b823cd6187efffb41ed9054da02757c8d68321910" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.741593 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.748030 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcb42c5b-4674-43ba-9eb8-7654af2ea64c","Type":"ContainerDied","Data":"a8d0fdee4646e761b91276b0adde5b087b1e818d31c6965a9652d41bf715c597"} Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.748133 4975 scope.go:117] "RemoveContainer" containerID="8a5765a43db10d16f434d26357c17b55b296c0d200bf79d49c47e4f3d036a30d" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.774517 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.774499535 podStartE2EDuration="3.774499535s" podCreationTimestamp="2026-03-18 12:35:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:35:43.765485937 +0000 UTC m=+1529.479886516" watchObservedRunningTime="2026-03-18 12:35:43.774499535 +0000 UTC m=+1529.488900114" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.781435 4975 scope.go:117] "RemoveContainer" containerID="6929d2abe0cf2fc1b67efe16d4d513ea7774bb800d5fd06c7c71ce00c66e8d6c" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.800677 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.816593 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 
12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.837469 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 12:35:43 crc kubenswrapper[4975]: E0318 12:35:43.837907 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb42c5b-4674-43ba-9eb8-7654af2ea64c" containerName="sg-core" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.837920 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb42c5b-4674-43ba-9eb8-7654af2ea64c" containerName="sg-core" Mar 18 12:35:43 crc kubenswrapper[4975]: E0318 12:35:43.837955 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb42c5b-4674-43ba-9eb8-7654af2ea64c" containerName="ceilometer-notification-agent" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.837961 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb42c5b-4674-43ba-9eb8-7654af2ea64c" containerName="ceilometer-notification-agent" Mar 18 12:35:43 crc kubenswrapper[4975]: E0318 12:35:43.837970 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb42c5b-4674-43ba-9eb8-7654af2ea64c" containerName="proxy-httpd" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.837978 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb42c5b-4674-43ba-9eb8-7654af2ea64c" containerName="proxy-httpd" Mar 18 12:35:43 crc kubenswrapper[4975]: E0318 12:35:43.837987 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67cbe265-b297-4ad8-af53-703b8549d8e6" containerName="nova-cell0-conductor-db-sync" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.837993 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="67cbe265-b297-4ad8-af53-703b8549d8e6" containerName="nova-cell0-conductor-db-sync" Mar 18 12:35:43 crc kubenswrapper[4975]: E0318 12:35:43.838024 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb42c5b-4674-43ba-9eb8-7654af2ea64c" containerName="ceilometer-central-agent" Mar 18 12:35:43 crc 
kubenswrapper[4975]: I0318 12:35:43.838029 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb42c5b-4674-43ba-9eb8-7654af2ea64c" containerName="ceilometer-central-agent" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.838190 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcb42c5b-4674-43ba-9eb8-7654af2ea64c" containerName="sg-core" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.838203 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcb42c5b-4674-43ba-9eb8-7654af2ea64c" containerName="proxy-httpd" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.838213 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcb42c5b-4674-43ba-9eb8-7654af2ea64c" containerName="ceilometer-notification-agent" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.838227 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcb42c5b-4674-43ba-9eb8-7654af2ea64c" containerName="ceilometer-central-agent" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.838243 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="67cbe265-b297-4ad8-af53-703b8549d8e6" containerName="nova-cell0-conductor-db-sync" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.838836 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.842913 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-w7sgq" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.843125 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.865195 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.880880 4975 scope.go:117] "RemoveContainer" containerID="24401754f27fca3d796510891c2117556fbf66d2d506fc8753666be826bafefe" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.890217 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.893109 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bcb7ff1-b245-4d6a-ae0b-9acc133d2b65-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1bcb7ff1-b245-4d6a-ae0b-9acc133d2b65\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.893196 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bcb7ff1-b245-4d6a-ae0b-9acc133d2b65-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1bcb7ff1-b245-4d6a-ae0b-9acc133d2b65\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.893319 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwj6g\" (UniqueName: \"kubernetes.io/projected/1bcb7ff1-b245-4d6a-ae0b-9acc133d2b65-kube-api-access-rwj6g\") pod 
\"nova-cell0-conductor-0\" (UID: \"1bcb7ff1-b245-4d6a-ae0b-9acc133d2b65\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.894704 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.898442 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.901477 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.911291 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.922202 4975 scope.go:117] "RemoveContainer" containerID="de1fed08432296d3a3d5ae1524101f787c61c99553bf09a033fffd17ecab314a" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.995431 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " pod="openstack/ceilometer-0" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.995494 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bcb7ff1-b245-4d6a-ae0b-9acc133d2b65-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1bcb7ff1-b245-4d6a-ae0b-9acc133d2b65\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.995523 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-config-data\") pod \"ceilometer-0\" (UID: 
\"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " pod="openstack/ceilometer-0" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.995552 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-scripts\") pod \"ceilometer-0\" (UID: \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " pod="openstack/ceilometer-0" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.995620 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-log-httpd\") pod \"ceilometer-0\" (UID: \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " pod="openstack/ceilometer-0" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.995637 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-run-httpd\") pod \"ceilometer-0\" (UID: \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " pod="openstack/ceilometer-0" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.995662 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwj6g\" (UniqueName: \"kubernetes.io/projected/1bcb7ff1-b245-4d6a-ae0b-9acc133d2b65-kube-api-access-rwj6g\") pod \"nova-cell0-conductor-0\" (UID: \"1bcb7ff1-b245-4d6a-ae0b-9acc133d2b65\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.995678 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnfnz\" (UniqueName: \"kubernetes.io/projected/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-kube-api-access-nnfnz\") pod \"ceilometer-0\" (UID: \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " pod="openstack/ceilometer-0" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 
12:35:43.995698 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " pod="openstack/ceilometer-0" Mar 18 12:35:43 crc kubenswrapper[4975]: I0318 12:35:43.995752 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bcb7ff1-b245-4d6a-ae0b-9acc133d2b65-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1bcb7ff1-b245-4d6a-ae0b-9acc133d2b65\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:35:44 crc kubenswrapper[4975]: I0318 12:35:44.001286 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bcb7ff1-b245-4d6a-ae0b-9acc133d2b65-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1bcb7ff1-b245-4d6a-ae0b-9acc133d2b65\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:35:44 crc kubenswrapper[4975]: I0318 12:35:44.001498 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bcb7ff1-b245-4d6a-ae0b-9acc133d2b65-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1bcb7ff1-b245-4d6a-ae0b-9acc133d2b65\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:35:44 crc kubenswrapper[4975]: I0318 12:35:44.016845 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwj6g\" (UniqueName: \"kubernetes.io/projected/1bcb7ff1-b245-4d6a-ae0b-9acc133d2b65-kube-api-access-rwj6g\") pod \"nova-cell0-conductor-0\" (UID: \"1bcb7ff1-b245-4d6a-ae0b-9acc133d2b65\") " pod="openstack/nova-cell0-conductor-0" Mar 18 12:35:44 crc kubenswrapper[4975]: I0318 12:35:44.110837 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-scripts\") pod \"ceilometer-0\" (UID: \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " pod="openstack/ceilometer-0" Mar 18 12:35:44 crc kubenswrapper[4975]: I0318 12:35:44.111272 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-log-httpd\") pod \"ceilometer-0\" (UID: \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " pod="openstack/ceilometer-0" Mar 18 12:35:44 crc kubenswrapper[4975]: I0318 12:35:44.111378 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-run-httpd\") pod \"ceilometer-0\" (UID: \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " pod="openstack/ceilometer-0" Mar 18 12:35:44 crc kubenswrapper[4975]: I0318 12:35:44.111519 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnfnz\" (UniqueName: \"kubernetes.io/projected/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-kube-api-access-nnfnz\") pod \"ceilometer-0\" (UID: \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " pod="openstack/ceilometer-0" Mar 18 12:35:44 crc kubenswrapper[4975]: I0318 12:35:44.111590 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " pod="openstack/ceilometer-0" Mar 18 12:35:44 crc kubenswrapper[4975]: I0318 12:35:44.111782 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " pod="openstack/ceilometer-0" Mar 18 12:35:44 crc 
kubenswrapper[4975]: I0318 12:35:44.111851 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-config-data\") pod \"ceilometer-0\" (UID: \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " pod="openstack/ceilometer-0" Mar 18 12:35:44 crc kubenswrapper[4975]: I0318 12:35:44.113249 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-log-httpd\") pod \"ceilometer-0\" (UID: \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " pod="openstack/ceilometer-0" Mar 18 12:35:44 crc kubenswrapper[4975]: I0318 12:35:44.113314 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-run-httpd\") pod \"ceilometer-0\" (UID: \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " pod="openstack/ceilometer-0" Mar 18 12:35:44 crc kubenswrapper[4975]: I0318 12:35:44.119085 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-config-data\") pod \"ceilometer-0\" (UID: \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " pod="openstack/ceilometer-0" Mar 18 12:35:44 crc kubenswrapper[4975]: I0318 12:35:44.121172 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " pod="openstack/ceilometer-0" Mar 18 12:35:44 crc kubenswrapper[4975]: I0318 12:35:44.121431 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " pod="openstack/ceilometer-0" Mar 18 12:35:44 crc kubenswrapper[4975]: I0318 12:35:44.130588 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-scripts\") pod \"ceilometer-0\" (UID: \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " pod="openstack/ceilometer-0" Mar 18 12:35:44 crc kubenswrapper[4975]: I0318 12:35:44.132231 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnfnz\" (UniqueName: \"kubernetes.io/projected/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-kube-api-access-nnfnz\") pod \"ceilometer-0\" (UID: \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " pod="openstack/ceilometer-0" Mar 18 12:35:44 crc kubenswrapper[4975]: I0318 12:35:44.182142 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 12:35:44 crc kubenswrapper[4975]: I0318 12:35:44.219672 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:35:44 crc kubenswrapper[4975]: I0318 12:35:44.646665 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 12:35:44 crc kubenswrapper[4975]: I0318 12:35:44.724519 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:35:44 crc kubenswrapper[4975]: W0318 12:35:44.750406 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb5fde15_fc46_4eb2_a28a_e0c5ebae192e.slice/crio-d4c2904fce68647bd81972977093587aa63fbd9c56e66a37c6156da842fc3295 WatchSource:0}: Error finding container d4c2904fce68647bd81972977093587aa63fbd9c56e66a37c6156da842fc3295: Status 404 returned error can't find the container with id d4c2904fce68647bd81972977093587aa63fbd9c56e66a37c6156da842fc3295 Mar 18 12:35:44 crc kubenswrapper[4975]: I0318 12:35:44.751510 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1bcb7ff1-b245-4d6a-ae0b-9acc133d2b65","Type":"ContainerStarted","Data":"fc52408f78aab97bb759c202610d50757ac0749b78e61d54a7120a93285d491c"} Mar 18 12:35:45 crc kubenswrapper[4975]: I0318 12:35:45.032031 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcb42c5b-4674-43ba-9eb8-7654af2ea64c" path="/var/lib/kubelet/pods/bcb42c5b-4674-43ba-9eb8-7654af2ea64c/volumes" Mar 18 12:35:45 crc kubenswrapper[4975]: I0318 12:35:45.762761 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1bcb7ff1-b245-4d6a-ae0b-9acc133d2b65","Type":"ContainerStarted","Data":"28b7eaa65564800bbf67c2d959c62a944416cc80d906c86cbe35558e14840994"} Mar 18 12:35:45 crc kubenswrapper[4975]: I0318 12:35:45.763387 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 18 12:35:45 crc kubenswrapper[4975]: I0318 
12:35:45.765793 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e","Type":"ContainerStarted","Data":"1d32409bb252e66135d85d9d3d736cbbd7365f4130d69183d7980941d8f1a808"} Mar 18 12:35:45 crc kubenswrapper[4975]: I0318 12:35:45.765828 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e","Type":"ContainerStarted","Data":"d4c2904fce68647bd81972977093587aa63fbd9c56e66a37c6156da842fc3295"} Mar 18 12:35:45 crc kubenswrapper[4975]: I0318 12:35:45.787684 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.787660621 podStartE2EDuration="2.787660621s" podCreationTimestamp="2026-03-18 12:35:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:35:45.777694627 +0000 UTC m=+1531.492095226" watchObservedRunningTime="2026-03-18 12:35:45.787660621 +0000 UTC m=+1531.502061210" Mar 18 12:35:46 crc kubenswrapper[4975]: I0318 12:35:46.777652 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e","Type":"ContainerStarted","Data":"d554b1376eaa787c52f9cfd715e1e7d55db9cd1ef6d746bd598887ad9279dc9b"} Mar 18 12:35:47 crc kubenswrapper[4975]: I0318 12:35:47.789317 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e","Type":"ContainerStarted","Data":"4b4773bb8da1584587ad8d02c86557a00680cf7450556e60d81a2651d867f63c"} Mar 18 12:35:49 crc kubenswrapper[4975]: I0318 12:35:49.812660 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e","Type":"ContainerStarted","Data":"9c9c27358654152e920831ce36019dd4442ef7b156e35d1397945b014ca71ed2"} Mar 18 12:35:49 crc kubenswrapper[4975]: I0318 12:35:49.813179 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 12:35:49 crc kubenswrapper[4975]: I0318 12:35:49.835079 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.559135068 podStartE2EDuration="6.835054605s" podCreationTimestamp="2026-03-18 12:35:43 +0000 UTC" firstStartedPulling="2026-03-18 12:35:44.755387531 +0000 UTC m=+1530.469788110" lastFinishedPulling="2026-03-18 12:35:49.031307068 +0000 UTC m=+1534.745707647" observedRunningTime="2026-03-18 12:35:49.830418677 +0000 UTC m=+1535.544819266" watchObservedRunningTime="2026-03-18 12:35:49.835054605 +0000 UTC m=+1535.549455184" Mar 18 12:35:50 crc kubenswrapper[4975]: I0318 12:35:50.098974 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 12:35:50 crc kubenswrapper[4975]: I0318 12:35:50.099997 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 12:35:50 crc kubenswrapper[4975]: I0318 12:35:50.150901 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 12:35:50 crc kubenswrapper[4975]: I0318 12:35:50.151633 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 12:35:50 crc kubenswrapper[4975]: I0318 12:35:50.820631 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 12:35:50 crc kubenswrapper[4975]: I0318 12:35:50.820963 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-external-api-0" Mar 18 12:35:51 crc kubenswrapper[4975]: I0318 12:35:51.306945 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 12:35:51 crc kubenswrapper[4975]: I0318 12:35:51.307012 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 12:35:51 crc kubenswrapper[4975]: I0318 12:35:51.354280 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 12:35:51 crc kubenswrapper[4975]: I0318 12:35:51.354744 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 12:35:51 crc kubenswrapper[4975]: I0318 12:35:51.830543 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 12:35:51 crc kubenswrapper[4975]: I0318 12:35:51.832235 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 12:35:52 crc kubenswrapper[4975]: I0318 12:35:52.994614 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 12:35:52 crc kubenswrapper[4975]: I0318 12:35:52.995304 4975 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 12:35:53 crc kubenswrapper[4975]: I0318 12:35:53.035565 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 12:35:53 crc kubenswrapper[4975]: I0318 12:35:53.990327 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 12:35:53 crc kubenswrapper[4975]: I0318 12:35:53.990590 4975 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 12:35:54 crc 
kubenswrapper[4975]: I0318 12:35:54.117032 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 12:35:54 crc kubenswrapper[4975]: I0318 12:35:54.269774 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 18 12:35:54 crc kubenswrapper[4975]: I0318 12:35:54.896946 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-x6ztn"] Mar 18 12:35:54 crc kubenswrapper[4975]: I0318 12:35:54.898132 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x6ztn" Mar 18 12:35:54 crc kubenswrapper[4975]: I0318 12:35:54.900816 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 18 12:35:54 crc kubenswrapper[4975]: I0318 12:35:54.901093 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 18 12:35:54 crc kubenswrapper[4975]: I0318 12:35:54.914230 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-x6ztn"] Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.100704 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b1267e7-5d6a-43f3-aa6b-e864972b60f6-config-data\") pod \"nova-cell0-cell-mapping-x6ztn\" (UID: \"1b1267e7-5d6a-43f3-aa6b-e864972b60f6\") " pod="openstack/nova-cell0-cell-mapping-x6ztn" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.104570 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klmw8\" (UniqueName: \"kubernetes.io/projected/1b1267e7-5d6a-43f3-aa6b-e864972b60f6-kube-api-access-klmw8\") pod \"nova-cell0-cell-mapping-x6ztn\" (UID: \"1b1267e7-5d6a-43f3-aa6b-e864972b60f6\") " 
pod="openstack/nova-cell0-cell-mapping-x6ztn" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.104799 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b1267e7-5d6a-43f3-aa6b-e864972b60f6-scripts\") pod \"nova-cell0-cell-mapping-x6ztn\" (UID: \"1b1267e7-5d6a-43f3-aa6b-e864972b60f6\") " pod="openstack/nova-cell0-cell-mapping-x6ztn" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.105028 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b1267e7-5d6a-43f3-aa6b-e864972b60f6-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-x6ztn\" (UID: \"1b1267e7-5d6a-43f3-aa6b-e864972b60f6\") " pod="openstack/nova-cell0-cell-mapping-x6ztn" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.162403 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.165184 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.173788 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.195084 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.206652 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/191e985b-8564-4dc3-b05d-c6e4fef796a8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"191e985b-8564-4dc3-b05d-c6e4fef796a8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.206702 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttcs5\" (UniqueName: \"kubernetes.io/projected/191e985b-8564-4dc3-b05d-c6e4fef796a8-kube-api-access-ttcs5\") pod \"nova-cell1-novncproxy-0\" (UID: \"191e985b-8564-4dc3-b05d-c6e4fef796a8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.206762 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b1267e7-5d6a-43f3-aa6b-e864972b60f6-config-data\") pod \"nova-cell0-cell-mapping-x6ztn\" (UID: \"1b1267e7-5d6a-43f3-aa6b-e864972b60f6\") " pod="openstack/nova-cell0-cell-mapping-x6ztn" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.206815 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klmw8\" (UniqueName: \"kubernetes.io/projected/1b1267e7-5d6a-43f3-aa6b-e864972b60f6-kube-api-access-klmw8\") pod \"nova-cell0-cell-mapping-x6ztn\" (UID: \"1b1267e7-5d6a-43f3-aa6b-e864972b60f6\") " pod="openstack/nova-cell0-cell-mapping-x6ztn" Mar 18 
12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.206889 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b1267e7-5d6a-43f3-aa6b-e864972b60f6-scripts\") pod \"nova-cell0-cell-mapping-x6ztn\" (UID: \"1b1267e7-5d6a-43f3-aa6b-e864972b60f6\") " pod="openstack/nova-cell0-cell-mapping-x6ztn" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.206960 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b1267e7-5d6a-43f3-aa6b-e864972b60f6-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-x6ztn\" (UID: \"1b1267e7-5d6a-43f3-aa6b-e864972b60f6\") " pod="openstack/nova-cell0-cell-mapping-x6ztn" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.206986 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/191e985b-8564-4dc3-b05d-c6e4fef796a8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"191e985b-8564-4dc3-b05d-c6e4fef796a8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.226920 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.228643 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.236511 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.248910 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b1267e7-5d6a-43f3-aa6b-e864972b60f6-config-data\") pod \"nova-cell0-cell-mapping-x6ztn\" (UID: \"1b1267e7-5d6a-43f3-aa6b-e864972b60f6\") " pod="openstack/nova-cell0-cell-mapping-x6ztn" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.250710 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b1267e7-5d6a-43f3-aa6b-e864972b60f6-scripts\") pod \"nova-cell0-cell-mapping-x6ztn\" (UID: \"1b1267e7-5d6a-43f3-aa6b-e864972b60f6\") " pod="openstack/nova-cell0-cell-mapping-x6ztn" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.250753 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b1267e7-5d6a-43f3-aa6b-e864972b60f6-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-x6ztn\" (UID: \"1b1267e7-5d6a-43f3-aa6b-e864972b60f6\") " pod="openstack/nova-cell0-cell-mapping-x6ztn" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.251402 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klmw8\" (UniqueName: \"kubernetes.io/projected/1b1267e7-5d6a-43f3-aa6b-e864972b60f6-kube-api-access-klmw8\") pod \"nova-cell0-cell-mapping-x6ztn\" (UID: \"1b1267e7-5d6a-43f3-aa6b-e864972b60f6\") " pod="openstack/nova-cell0-cell-mapping-x6ztn" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.311745 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/191e985b-8564-4dc3-b05d-c6e4fef796a8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"191e985b-8564-4dc3-b05d-c6e4fef796a8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.311795 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/191e985b-8564-4dc3-b05d-c6e4fef796a8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"191e985b-8564-4dc3-b05d-c6e4fef796a8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.311816 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttcs5\" (UniqueName: \"kubernetes.io/projected/191e985b-8564-4dc3-b05d-c6e4fef796a8-kube-api-access-ttcs5\") pod \"nova-cell1-novncproxy-0\" (UID: \"191e985b-8564-4dc3-b05d-c6e4fef796a8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.318772 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.324800 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/191e985b-8564-4dc3-b05d-c6e4fef796a8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"191e985b-8564-4dc3-b05d-c6e4fef796a8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.343798 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/191e985b-8564-4dc3-b05d-c6e4fef796a8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"191e985b-8564-4dc3-b05d-c6e4fef796a8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.344492 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ttcs5\" (UniqueName: \"kubernetes.io/projected/191e985b-8564-4dc3-b05d-c6e4fef796a8-kube-api-access-ttcs5\") pod \"nova-cell1-novncproxy-0\" (UID: \"191e985b-8564-4dc3-b05d-c6e4fef796a8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.380690 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.382312 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.392436 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.424052 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91c2909b-ac35-4304-a0cb-571293d436e0-config-data\") pod \"nova-scheduler-0\" (UID: \"91c2909b-ac35-4304-a0cb-571293d436e0\") " pod="openstack/nova-scheduler-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.424189 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkm44\" (UniqueName: \"kubernetes.io/projected/0cfaf200-320b-4866-91d3-0a526ad54da3-kube-api-access-rkm44\") pod \"nova-api-0\" (UID: \"0cfaf200-320b-4866-91d3-0a526ad54da3\") " pod="openstack/nova-api-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.424255 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cfaf200-320b-4866-91d3-0a526ad54da3-config-data\") pod \"nova-api-0\" (UID: \"0cfaf200-320b-4866-91d3-0a526ad54da3\") " pod="openstack/nova-api-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.424286 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cfaf200-320b-4866-91d3-0a526ad54da3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0cfaf200-320b-4866-91d3-0a526ad54da3\") " pod="openstack/nova-api-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.424404 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cfaf200-320b-4866-91d3-0a526ad54da3-logs\") pod \"nova-api-0\" (UID: \"0cfaf200-320b-4866-91d3-0a526ad54da3\") " pod="openstack/nova-api-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.424475 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl92p\" (UniqueName: \"kubernetes.io/projected/91c2909b-ac35-4304-a0cb-571293d436e0-kube-api-access-bl92p\") pod \"nova-scheduler-0\" (UID: \"91c2909b-ac35-4304-a0cb-571293d436e0\") " pod="openstack/nova-scheduler-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.424556 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91c2909b-ac35-4304-a0cb-571293d436e0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"91c2909b-ac35-4304-a0cb-571293d436e0\") " pod="openstack/nova-scheduler-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.425079 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.444227 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.518024 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x6ztn" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.520389 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.522443 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.525269 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.527222 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91c2909b-ac35-4304-a0cb-571293d436e0-config-data\") pod \"nova-scheduler-0\" (UID: \"91c2909b-ac35-4304-a0cb-571293d436e0\") " pod="openstack/nova-scheduler-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.527287 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkm44\" (UniqueName: \"kubernetes.io/projected/0cfaf200-320b-4866-91d3-0a526ad54da3-kube-api-access-rkm44\") pod \"nova-api-0\" (UID: \"0cfaf200-320b-4866-91d3-0a526ad54da3\") " pod="openstack/nova-api-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.527698 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cfaf200-320b-4866-91d3-0a526ad54da3-config-data\") pod \"nova-api-0\" (UID: \"0cfaf200-320b-4866-91d3-0a526ad54da3\") " pod="openstack/nova-api-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.527724 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cfaf200-320b-4866-91d3-0a526ad54da3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0cfaf200-320b-4866-91d3-0a526ad54da3\") " pod="openstack/nova-api-0" Mar 18 
12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.527812 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cfaf200-320b-4866-91d3-0a526ad54da3-logs\") pod \"nova-api-0\" (UID: \"0cfaf200-320b-4866-91d3-0a526ad54da3\") " pod="openstack/nova-api-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.527859 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl92p\" (UniqueName: \"kubernetes.io/projected/91c2909b-ac35-4304-a0cb-571293d436e0-kube-api-access-bl92p\") pod \"nova-scheduler-0\" (UID: \"91c2909b-ac35-4304-a0cb-571293d436e0\") " pod="openstack/nova-scheduler-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.527938 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91c2909b-ac35-4304-a0cb-571293d436e0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"91c2909b-ac35-4304-a0cb-571293d436e0\") " pod="openstack/nova-scheduler-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.528645 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cfaf200-320b-4866-91d3-0a526ad54da3-logs\") pod \"nova-api-0\" (UID: \"0cfaf200-320b-4866-91d3-0a526ad54da3\") " pod="openstack/nova-api-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.538700 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.538762 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.539833 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cfaf200-320b-4866-91d3-0a526ad54da3-config-data\") pod \"nova-api-0\" (UID: \"0cfaf200-320b-4866-91d3-0a526ad54da3\") " pod="openstack/nova-api-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.555622 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl92p\" (UniqueName: \"kubernetes.io/projected/91c2909b-ac35-4304-a0cb-571293d436e0-kube-api-access-bl92p\") pod \"nova-scheduler-0\" (UID: \"91c2909b-ac35-4304-a0cb-571293d436e0\") " pod="openstack/nova-scheduler-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.561721 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkm44\" (UniqueName: \"kubernetes.io/projected/0cfaf200-320b-4866-91d3-0a526ad54da3-kube-api-access-rkm44\") pod \"nova-api-0\" (UID: \"0cfaf200-320b-4866-91d3-0a526ad54da3\") " pod="openstack/nova-api-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.564413 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cfaf200-320b-4866-91d3-0a526ad54da3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0cfaf200-320b-4866-91d3-0a526ad54da3\") " pod="openstack/nova-api-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.569080 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.570164 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91c2909b-ac35-4304-a0cb-571293d436e0-config-data\") pod \"nova-scheduler-0\" (UID: 
\"91c2909b-ac35-4304-a0cb-571293d436e0\") " pod="openstack/nova-scheduler-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.572986 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91c2909b-ac35-4304-a0cb-571293d436e0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"91c2909b-ac35-4304-a0cb-571293d436e0\") " pod="openstack/nova-scheduler-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.622328 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-2zjm2"] Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.624133 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.629360 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-config\") pod \"dnsmasq-dns-757b4f8459-2zjm2\" (UID: \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\") " pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.629406 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-2zjm2\" (UID: \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\") " pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.629450 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-2zjm2\" (UID: \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\") " pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" Mar 18 
12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.629476 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-2zjm2\" (UID: \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\") " pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.629500 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66c4786f-4507-4cfc-af84-ac3bea8e6a57-logs\") pod \"nova-metadata-0\" (UID: \"66c4786f-4507-4cfc-af84-ac3bea8e6a57\") " pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.629516 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpksr\" (UniqueName: \"kubernetes.io/projected/66c4786f-4507-4cfc-af84-ac3bea8e6a57-kube-api-access-jpksr\") pod \"nova-metadata-0\" (UID: \"66c4786f-4507-4cfc-af84-ac3bea8e6a57\") " pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.629556 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-dns-svc\") pod \"dnsmasq-dns-757b4f8459-2zjm2\" (UID: \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\") " pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.629576 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66c4786f-4507-4cfc-af84-ac3bea8e6a57-config-data\") pod \"nova-metadata-0\" (UID: \"66c4786f-4507-4cfc-af84-ac3bea8e6a57\") " pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.629595 
4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrn9j\" (UniqueName: \"kubernetes.io/projected/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-kube-api-access-qrn9j\") pod \"dnsmasq-dns-757b4f8459-2zjm2\" (UID: \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\") " pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.629620 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66c4786f-4507-4cfc-af84-ac3bea8e6a57-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"66c4786f-4507-4cfc-af84-ac3bea8e6a57\") " pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.657784 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-2zjm2"] Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.730933 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpksr\" (UniqueName: \"kubernetes.io/projected/66c4786f-4507-4cfc-af84-ac3bea8e6a57-kube-api-access-jpksr\") pod \"nova-metadata-0\" (UID: \"66c4786f-4507-4cfc-af84-ac3bea8e6a57\") " pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.731040 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-dns-svc\") pod \"dnsmasq-dns-757b4f8459-2zjm2\" (UID: \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\") " pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.731076 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66c4786f-4507-4cfc-af84-ac3bea8e6a57-config-data\") pod \"nova-metadata-0\" (UID: \"66c4786f-4507-4cfc-af84-ac3bea8e6a57\") " 
pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.731106 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrn9j\" (UniqueName: \"kubernetes.io/projected/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-kube-api-access-qrn9j\") pod \"dnsmasq-dns-757b4f8459-2zjm2\" (UID: \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\") " pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.731145 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66c4786f-4507-4cfc-af84-ac3bea8e6a57-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"66c4786f-4507-4cfc-af84-ac3bea8e6a57\") " pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.731191 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-config\") pod \"dnsmasq-dns-757b4f8459-2zjm2\" (UID: \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\") " pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.731252 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-2zjm2\" (UID: \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\") " pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.731319 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-2zjm2\" (UID: \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\") " pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" Mar 18 12:35:55 crc kubenswrapper[4975]: 
I0318 12:35:55.731360 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-2zjm2\" (UID: \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\") " pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.731394 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66c4786f-4507-4cfc-af84-ac3bea8e6a57-logs\") pod \"nova-metadata-0\" (UID: \"66c4786f-4507-4cfc-af84-ac3bea8e6a57\") " pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.732215 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66c4786f-4507-4cfc-af84-ac3bea8e6a57-logs\") pod \"nova-metadata-0\" (UID: \"66c4786f-4507-4cfc-af84-ac3bea8e6a57\") " pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.733357 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-2zjm2\" (UID: \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\") " pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.733984 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-config\") pod \"dnsmasq-dns-757b4f8459-2zjm2\" (UID: \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\") " pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.734167 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-dns-svc\") pod \"dnsmasq-dns-757b4f8459-2zjm2\" (UID: \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\") " pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.734250 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-2zjm2\" (UID: \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\") " pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.734808 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-2zjm2\" (UID: \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\") " pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.739551 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66c4786f-4507-4cfc-af84-ac3bea8e6a57-config-data\") pod \"nova-metadata-0\" (UID: \"66c4786f-4507-4cfc-af84-ac3bea8e6a57\") " pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.744118 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66c4786f-4507-4cfc-af84-ac3bea8e6a57-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"66c4786f-4507-4cfc-af84-ac3bea8e6a57\") " pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.758723 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrn9j\" (UniqueName: \"kubernetes.io/projected/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-kube-api-access-qrn9j\") pod \"dnsmasq-dns-757b4f8459-2zjm2\" (UID: 
\"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\") " pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.765488 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpksr\" (UniqueName: \"kubernetes.io/projected/66c4786f-4507-4cfc-af84-ac3bea8e6a57-kube-api-access-jpksr\") pod \"nova-metadata-0\" (UID: \"66c4786f-4507-4cfc-af84-ac3bea8e6a57\") " pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.771520 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.825518 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.872068 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:35:55 crc kubenswrapper[4975]: I0318 12:35:55.968519 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" Mar 18 12:35:56 crc kubenswrapper[4975]: I0318 12:35:56.082499 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:35:56 crc kubenswrapper[4975]: I0318 12:35:56.199092 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-x6ztn"] Mar 18 12:35:56 crc kubenswrapper[4975]: W0318 12:35:56.219894 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b1267e7_5d6a_43f3_aa6b_e864972b60f6.slice/crio-5afda2c5bded96e63bc3c022cdaba97c7ed8cee0cf588f25dd5ca02aaa5375a8 WatchSource:0}: Error finding container 5afda2c5bded96e63bc3c022cdaba97c7ed8cee0cf588f25dd5ca02aaa5375a8: Status 404 returned error can't find the container with id 5afda2c5bded96e63bc3c022cdaba97c7ed8cee0cf588f25dd5ca02aaa5375a8 Mar 18 12:35:56 crc kubenswrapper[4975]: I0318 12:35:56.414550 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c2rjz"] Mar 18 12:35:56 crc kubenswrapper[4975]: I0318 12:35:56.415987 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-c2rjz" Mar 18 12:35:56 crc kubenswrapper[4975]: I0318 12:35:56.421544 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 12:35:56 crc kubenswrapper[4975]: I0318 12:35:56.421791 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 18 12:35:56 crc kubenswrapper[4975]: I0318 12:35:56.438961 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:35:56 crc kubenswrapper[4975]: I0318 12:35:56.470107 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bba9a490-9803-4bc8-a775-1d8711fd3b41-config-data\") pod \"nova-cell1-conductor-db-sync-c2rjz\" (UID: \"bba9a490-9803-4bc8-a775-1d8711fd3b41\") " pod="openstack/nova-cell1-conductor-db-sync-c2rjz" Mar 18 12:35:56 crc kubenswrapper[4975]: I0318 12:35:56.470175 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bba9a490-9803-4bc8-a775-1d8711fd3b41-scripts\") pod \"nova-cell1-conductor-db-sync-c2rjz\" (UID: \"bba9a490-9803-4bc8-a775-1d8711fd3b41\") " pod="openstack/nova-cell1-conductor-db-sync-c2rjz" Mar 18 12:35:56 crc kubenswrapper[4975]: I0318 12:35:56.470241 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bba9a490-9803-4bc8-a775-1d8711fd3b41-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-c2rjz\" (UID: \"bba9a490-9803-4bc8-a775-1d8711fd3b41\") " pod="openstack/nova-cell1-conductor-db-sync-c2rjz" Mar 18 12:35:56 crc kubenswrapper[4975]: I0318 12:35:56.470283 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-sbkk2\" (UniqueName: \"kubernetes.io/projected/bba9a490-9803-4bc8-a775-1d8711fd3b41-kube-api-access-sbkk2\") pod \"nova-cell1-conductor-db-sync-c2rjz\" (UID: \"bba9a490-9803-4bc8-a775-1d8711fd3b41\") " pod="openstack/nova-cell1-conductor-db-sync-c2rjz" Mar 18 12:35:56 crc kubenswrapper[4975]: I0318 12:35:56.480530 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c2rjz"] Mar 18 12:35:56 crc kubenswrapper[4975]: I0318 12:35:56.571616 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:35:56 crc kubenswrapper[4975]: I0318 12:35:56.571981 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bba9a490-9803-4bc8-a775-1d8711fd3b41-config-data\") pod \"nova-cell1-conductor-db-sync-c2rjz\" (UID: \"bba9a490-9803-4bc8-a775-1d8711fd3b41\") " pod="openstack/nova-cell1-conductor-db-sync-c2rjz" Mar 18 12:35:56 crc kubenswrapper[4975]: I0318 12:35:56.572052 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bba9a490-9803-4bc8-a775-1d8711fd3b41-scripts\") pod \"nova-cell1-conductor-db-sync-c2rjz\" (UID: \"bba9a490-9803-4bc8-a775-1d8711fd3b41\") " pod="openstack/nova-cell1-conductor-db-sync-c2rjz" Mar 18 12:35:56 crc kubenswrapper[4975]: I0318 12:35:56.572124 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bba9a490-9803-4bc8-a775-1d8711fd3b41-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-c2rjz\" (UID: \"bba9a490-9803-4bc8-a775-1d8711fd3b41\") " pod="openstack/nova-cell1-conductor-db-sync-c2rjz" Mar 18 12:35:56 crc kubenswrapper[4975]: I0318 12:35:56.572162 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbkk2\" (UniqueName: 
\"kubernetes.io/projected/bba9a490-9803-4bc8-a775-1d8711fd3b41-kube-api-access-sbkk2\") pod \"nova-cell1-conductor-db-sync-c2rjz\" (UID: \"bba9a490-9803-4bc8-a775-1d8711fd3b41\") " pod="openstack/nova-cell1-conductor-db-sync-c2rjz" Mar 18 12:35:56 crc kubenswrapper[4975]: I0318 12:35:56.577446 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bba9a490-9803-4bc8-a775-1d8711fd3b41-scripts\") pod \"nova-cell1-conductor-db-sync-c2rjz\" (UID: \"bba9a490-9803-4bc8-a775-1d8711fd3b41\") " pod="openstack/nova-cell1-conductor-db-sync-c2rjz" Mar 18 12:35:56 crc kubenswrapper[4975]: I0318 12:35:56.577652 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bba9a490-9803-4bc8-a775-1d8711fd3b41-config-data\") pod \"nova-cell1-conductor-db-sync-c2rjz\" (UID: \"bba9a490-9803-4bc8-a775-1d8711fd3b41\") " pod="openstack/nova-cell1-conductor-db-sync-c2rjz" Mar 18 12:35:56 crc kubenswrapper[4975]: I0318 12:35:56.578543 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bba9a490-9803-4bc8-a775-1d8711fd3b41-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-c2rjz\" (UID: \"bba9a490-9803-4bc8-a775-1d8711fd3b41\") " pod="openstack/nova-cell1-conductor-db-sync-c2rjz" Mar 18 12:35:56 crc kubenswrapper[4975]: I0318 12:35:56.601176 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbkk2\" (UniqueName: \"kubernetes.io/projected/bba9a490-9803-4bc8-a775-1d8711fd3b41-kube-api-access-sbkk2\") pod \"nova-cell1-conductor-db-sync-c2rjz\" (UID: \"bba9a490-9803-4bc8-a775-1d8711fd3b41\") " pod="openstack/nova-cell1-conductor-db-sync-c2rjz" Mar 18 12:35:56 crc kubenswrapper[4975]: I0318 12:35:56.775956 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-2zjm2"] Mar 18 12:35:56 crc 
kubenswrapper[4975]: I0318 12:35:56.796827 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-c2rjz" Mar 18 12:35:56 crc kubenswrapper[4975]: I0318 12:35:56.806635 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:35:57 crc kubenswrapper[4975]: I0318 12:35:57.007151 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0cfaf200-320b-4866-91d3-0a526ad54da3","Type":"ContainerStarted","Data":"d99c392a0f5988c3a75860833b014dc15837eb9ce0bdcc621d0d940cdcf085e3"} Mar 18 12:35:57 crc kubenswrapper[4975]: I0318 12:35:57.008728 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"191e985b-8564-4dc3-b05d-c6e4fef796a8","Type":"ContainerStarted","Data":"54f401288b23a7c761a65a6d72f89dc093dd41d565aaa0e58c50933397792d23"} Mar 18 12:35:57 crc kubenswrapper[4975]: I0318 12:35:57.011851 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-x6ztn" event={"ID":"1b1267e7-5d6a-43f3-aa6b-e864972b60f6","Type":"ContainerStarted","Data":"17303a59b3af9e610f1de5900d220c97ff01518d52ba803677b9a012c93e84fb"} Mar 18 12:35:57 crc kubenswrapper[4975]: I0318 12:35:57.011959 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-x6ztn" event={"ID":"1b1267e7-5d6a-43f3-aa6b-e864972b60f6","Type":"ContainerStarted","Data":"5afda2c5bded96e63bc3c022cdaba97c7ed8cee0cf588f25dd5ca02aaa5375a8"} Mar 18 12:35:57 crc kubenswrapper[4975]: I0318 12:35:57.014653 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66c4786f-4507-4cfc-af84-ac3bea8e6a57","Type":"ContainerStarted","Data":"3ef10066fcd75fe01b77e3d4283c0b78fd988b158ca549879f9f41a0bc286413"} Mar 18 12:35:57 crc kubenswrapper[4975]: I0318 12:35:57.043811 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" event={"ID":"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd","Type":"ContainerStarted","Data":"64ea038b778e65d67ec0d490c320144965a0549968e02253bf9679160a5c75ae"} Mar 18 12:35:57 crc kubenswrapper[4975]: I0318 12:35:57.043879 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"91c2909b-ac35-4304-a0cb-571293d436e0","Type":"ContainerStarted","Data":"e8c9715915be5c9e20dd575ab7568ee1f4e5e14ab7d2970bb7783b01878011a8"} Mar 18 12:35:57 crc kubenswrapper[4975]: I0318 12:35:57.050385 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-x6ztn" podStartSLOduration=3.050274407 podStartE2EDuration="3.050274407s" podCreationTimestamp="2026-03-18 12:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:35:57.037120825 +0000 UTC m=+1542.751521434" watchObservedRunningTime="2026-03-18 12:35:57.050274407 +0000 UTC m=+1542.764674986" Mar 18 12:35:57 crc kubenswrapper[4975]: I0318 12:35:57.380944 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c2rjz"] Mar 18 12:35:57 crc kubenswrapper[4975]: W0318 12:35:57.405800 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbba9a490_9803_4bc8_a775_1d8711fd3b41.slice/crio-aa2fd702fe8c5670c6548dbeb426b1f3aca35a6166abecde2bc52165fd09d2b7 WatchSource:0}: Error finding container aa2fd702fe8c5670c6548dbeb426b1f3aca35a6166abecde2bc52165fd09d2b7: Status 404 returned error can't find the container with id aa2fd702fe8c5670c6548dbeb426b1f3aca35a6166abecde2bc52165fd09d2b7 Mar 18 12:35:58 crc kubenswrapper[4975]: I0318 12:35:58.058389 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-c2rjz" 
event={"ID":"bba9a490-9803-4bc8-a775-1d8711fd3b41","Type":"ContainerStarted","Data":"a142026bbb15be57597ac55d56a54c57db2e165d7de332634d1a83cf4f8cfbc4"} Mar 18 12:35:58 crc kubenswrapper[4975]: I0318 12:35:58.058918 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-c2rjz" event={"ID":"bba9a490-9803-4bc8-a775-1d8711fd3b41","Type":"ContainerStarted","Data":"aa2fd702fe8c5670c6548dbeb426b1f3aca35a6166abecde2bc52165fd09d2b7"} Mar 18 12:35:58 crc kubenswrapper[4975]: I0318 12:35:58.063710 4975 generic.go:334] "Generic (PLEG): container finished" podID="e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd" containerID="2e135f359094f15c73b161e76526f3aa998bcdb6c6ce3b283bcbdd3a87ce0ca0" exitCode=0 Mar 18 12:35:58 crc kubenswrapper[4975]: I0318 12:35:58.063789 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" event={"ID":"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd","Type":"ContainerDied","Data":"2e135f359094f15c73b161e76526f3aa998bcdb6c6ce3b283bcbdd3a87ce0ca0"} Mar 18 12:35:58 crc kubenswrapper[4975]: I0318 12:35:58.088103 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-c2rjz" podStartSLOduration=2.088082471 podStartE2EDuration="2.088082471s" podCreationTimestamp="2026-03-18 12:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:35:58.07790254 +0000 UTC m=+1543.792303129" watchObservedRunningTime="2026-03-18 12:35:58.088082471 +0000 UTC m=+1543.802483050" Mar 18 12:35:59 crc kubenswrapper[4975]: I0318 12:35:59.410410 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:35:59 crc kubenswrapper[4975]: I0318 12:35:59.423678 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:36:00 crc kubenswrapper[4975]: I0318 12:36:00.139728 4975 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563956-lqdr4"] Mar 18 12:36:00 crc kubenswrapper[4975]: I0318 12:36:00.142761 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563956-lqdr4" Mar 18 12:36:00 crc kubenswrapper[4975]: I0318 12:36:00.147599 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 12:36:00 crc kubenswrapper[4975]: I0318 12:36:00.148008 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:36:00 crc kubenswrapper[4975]: I0318 12:36:00.148243 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:36:00 crc kubenswrapper[4975]: I0318 12:36:00.151040 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563956-lqdr4"] Mar 18 12:36:00 crc kubenswrapper[4975]: I0318 12:36:00.254235 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz665\" (UniqueName: \"kubernetes.io/projected/65f54532-e4a2-4f73-b254-aa7e0bd6a04e-kube-api-access-wz665\") pod \"auto-csr-approver-29563956-lqdr4\" (UID: \"65f54532-e4a2-4f73-b254-aa7e0bd6a04e\") " pod="openshift-infra/auto-csr-approver-29563956-lqdr4" Mar 18 12:36:00 crc kubenswrapper[4975]: I0318 12:36:00.368690 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz665\" (UniqueName: \"kubernetes.io/projected/65f54532-e4a2-4f73-b254-aa7e0bd6a04e-kube-api-access-wz665\") pod \"auto-csr-approver-29563956-lqdr4\" (UID: \"65f54532-e4a2-4f73-b254-aa7e0bd6a04e\") " pod="openshift-infra/auto-csr-approver-29563956-lqdr4" Mar 18 12:36:00 crc kubenswrapper[4975]: I0318 12:36:00.391659 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wz665\" (UniqueName: \"kubernetes.io/projected/65f54532-e4a2-4f73-b254-aa7e0bd6a04e-kube-api-access-wz665\") pod \"auto-csr-approver-29563956-lqdr4\" (UID: \"65f54532-e4a2-4f73-b254-aa7e0bd6a04e\") " pod="openshift-infra/auto-csr-approver-29563956-lqdr4" Mar 18 12:36:00 crc kubenswrapper[4975]: I0318 12:36:00.513284 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563956-lqdr4" Mar 18 12:36:01 crc kubenswrapper[4975]: I0318 12:36:01.182250 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66c4786f-4507-4cfc-af84-ac3bea8e6a57","Type":"ContainerStarted","Data":"20ec7d51cca3bac172ccd4d7cdd793cc0f8f245db030e259ccb06afad35292aa"} Mar 18 12:36:01 crc kubenswrapper[4975]: I0318 12:36:01.276986 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" event={"ID":"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd","Type":"ContainerStarted","Data":"98d1dee7a74766150d3a8bb150e66b3661de612b333c11a037de178495b42ace"} Mar 18 12:36:01 crc kubenswrapper[4975]: I0318 12:36:01.277491 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" Mar 18 12:36:01 crc kubenswrapper[4975]: I0318 12:36:01.281343 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"91c2909b-ac35-4304-a0cb-571293d436e0","Type":"ContainerStarted","Data":"c416020b5a7783b2dc257e98e8a8c7bba8638d8b01679d9b17a2013331ec7ad2"} Mar 18 12:36:01 crc kubenswrapper[4975]: I0318 12:36:01.289239 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0cfaf200-320b-4866-91d3-0a526ad54da3","Type":"ContainerStarted","Data":"b77e810d3b113694a9cd9bdd17555e4af68a6499707f894c9f4a3501b252d961"} Mar 18 12:36:01 crc kubenswrapper[4975]: I0318 12:36:01.291873 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"191e985b-8564-4dc3-b05d-c6e4fef796a8","Type":"ContainerStarted","Data":"1d1d6372172db9d73ca66a3f56330445f89ed334db52d142c62b65c78a7e7a11"} Mar 18 12:36:01 crc kubenswrapper[4975]: I0318 12:36:01.293259 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="191e985b-8564-4dc3-b05d-c6e4fef796a8" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://1d1d6372172db9d73ca66a3f56330445f89ed334db52d142c62b65c78a7e7a11" gracePeriod=30 Mar 18 12:36:01 crc kubenswrapper[4975]: I0318 12:36:01.308017 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" podStartSLOduration=6.307993064 podStartE2EDuration="6.307993064s" podCreationTimestamp="2026-03-18 12:35:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:36:01.3016844 +0000 UTC m=+1547.016084989" watchObservedRunningTime="2026-03-18 12:36:01.307993064 +0000 UTC m=+1547.022393643" Mar 18 12:36:01 crc kubenswrapper[4975]: I0318 12:36:01.334793 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563956-lqdr4"] Mar 18 12:36:01 crc kubenswrapper[4975]: I0318 12:36:01.342667 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.925499984 podStartE2EDuration="6.342629018s" podCreationTimestamp="2026-03-18 12:35:55 +0000 UTC" firstStartedPulling="2026-03-18 12:35:56.415592237 +0000 UTC m=+1542.129992816" lastFinishedPulling="2026-03-18 12:35:59.832721271 +0000 UTC m=+1545.547121850" observedRunningTime="2026-03-18 12:36:01.329094045 +0000 UTC m=+1547.043494624" watchObservedRunningTime="2026-03-18 12:36:01.342629018 +0000 UTC m=+1547.057029597" Mar 18 12:36:01 crc kubenswrapper[4975]: I0318 12:36:01.374386 4975 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.661493252 podStartE2EDuration="6.374362032s" podCreationTimestamp="2026-03-18 12:35:55 +0000 UTC" firstStartedPulling="2026-03-18 12:35:56.120064547 +0000 UTC m=+1541.834465126" lastFinishedPulling="2026-03-18 12:35:59.832933327 +0000 UTC m=+1545.547333906" observedRunningTime="2026-03-18 12:36:01.357451456 +0000 UTC m=+1547.071852045" watchObservedRunningTime="2026-03-18 12:36:01.374362032 +0000 UTC m=+1547.088762611" Mar 18 12:36:02 crc kubenswrapper[4975]: I0318 12:36:02.334089 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66c4786f-4507-4cfc-af84-ac3bea8e6a57","Type":"ContainerStarted","Data":"7d3951632b86bc19f6cda40fba466cfd510320f8eb5fd5b97d7018cb94f0e9d0"} Mar 18 12:36:02 crc kubenswrapper[4975]: I0318 12:36:02.334510 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="66c4786f-4507-4cfc-af84-ac3bea8e6a57" containerName="nova-metadata-log" containerID="cri-o://20ec7d51cca3bac172ccd4d7cdd793cc0f8f245db030e259ccb06afad35292aa" gracePeriod=30 Mar 18 12:36:02 crc kubenswrapper[4975]: I0318 12:36:02.335189 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="66c4786f-4507-4cfc-af84-ac3bea8e6a57" containerName="nova-metadata-metadata" containerID="cri-o://7d3951632b86bc19f6cda40fba466cfd510320f8eb5fd5b97d7018cb94f0e9d0" gracePeriod=30 Mar 18 12:36:02 crc kubenswrapper[4975]: I0318 12:36:02.340024 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0cfaf200-320b-4866-91d3-0a526ad54da3","Type":"ContainerStarted","Data":"7ddb073d3c900d530fce42926de174d9036ac55e433112b07e6ea8022c4f02e0"} Mar 18 12:36:02 crc kubenswrapper[4975]: I0318 12:36:02.367720 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29563956-lqdr4" event={"ID":"65f54532-e4a2-4f73-b254-aa7e0bd6a04e","Type":"ContainerStarted","Data":"ddcac0e0175101b0374e4663efe55ca5a429803035af5d0bfcaafbd8f6c60625"} Mar 18 12:36:02 crc kubenswrapper[4975]: I0318 12:36:02.399216 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.036656417 podStartE2EDuration="7.399184668s" podCreationTimestamp="2026-03-18 12:35:55 +0000 UTC" firstStartedPulling="2026-03-18 12:35:56.584944081 +0000 UTC m=+1542.299344660" lastFinishedPulling="2026-03-18 12:35:59.947472332 +0000 UTC m=+1545.661872911" observedRunningTime="2026-03-18 12:36:02.398093268 +0000 UTC m=+1548.112493847" watchObservedRunningTime="2026-03-18 12:36:02.399184668 +0000 UTC m=+1548.113585257" Mar 18 12:36:02 crc kubenswrapper[4975]: I0318 12:36:02.407399 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.385450994 podStartE2EDuration="7.407371104s" podCreationTimestamp="2026-03-18 12:35:55 +0000 UTC" firstStartedPulling="2026-03-18 12:35:56.810797781 +0000 UTC m=+1542.525198370" lastFinishedPulling="2026-03-18 12:35:59.832717901 +0000 UTC m=+1545.547118480" observedRunningTime="2026-03-18 12:36:02.368814852 +0000 UTC m=+1548.083215431" watchObservedRunningTime="2026-03-18 12:36:02.407371104 +0000 UTC m=+1548.121771683" Mar 18 12:36:02 crc kubenswrapper[4975]: I0318 12:36:02.966157 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.040623 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpksr\" (UniqueName: \"kubernetes.io/projected/66c4786f-4507-4cfc-af84-ac3bea8e6a57-kube-api-access-jpksr\") pod \"66c4786f-4507-4cfc-af84-ac3bea8e6a57\" (UID: \"66c4786f-4507-4cfc-af84-ac3bea8e6a57\") " Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.040821 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66c4786f-4507-4cfc-af84-ac3bea8e6a57-combined-ca-bundle\") pod \"66c4786f-4507-4cfc-af84-ac3bea8e6a57\" (UID: \"66c4786f-4507-4cfc-af84-ac3bea8e6a57\") " Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.040966 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66c4786f-4507-4cfc-af84-ac3bea8e6a57-config-data\") pod \"66c4786f-4507-4cfc-af84-ac3bea8e6a57\" (UID: \"66c4786f-4507-4cfc-af84-ac3bea8e6a57\") " Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.041039 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66c4786f-4507-4cfc-af84-ac3bea8e6a57-logs\") pod \"66c4786f-4507-4cfc-af84-ac3bea8e6a57\" (UID: \"66c4786f-4507-4cfc-af84-ac3bea8e6a57\") " Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.042475 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66c4786f-4507-4cfc-af84-ac3bea8e6a57-logs" (OuterVolumeSpecName: "logs") pod "66c4786f-4507-4cfc-af84-ac3bea8e6a57" (UID: "66c4786f-4507-4cfc-af84-ac3bea8e6a57"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.047902 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66c4786f-4507-4cfc-af84-ac3bea8e6a57-kube-api-access-jpksr" (OuterVolumeSpecName: "kube-api-access-jpksr") pod "66c4786f-4507-4cfc-af84-ac3bea8e6a57" (UID: "66c4786f-4507-4cfc-af84-ac3bea8e6a57"). InnerVolumeSpecName "kube-api-access-jpksr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.077590 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66c4786f-4507-4cfc-af84-ac3bea8e6a57-config-data" (OuterVolumeSpecName: "config-data") pod "66c4786f-4507-4cfc-af84-ac3bea8e6a57" (UID: "66c4786f-4507-4cfc-af84-ac3bea8e6a57"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.088423 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66c4786f-4507-4cfc-af84-ac3bea8e6a57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66c4786f-4507-4cfc-af84-ac3bea8e6a57" (UID: "66c4786f-4507-4cfc-af84-ac3bea8e6a57"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.205391 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66c4786f-4507-4cfc-af84-ac3bea8e6a57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.205428 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66c4786f-4507-4cfc-af84-ac3bea8e6a57-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.205439 4975 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66c4786f-4507-4cfc-af84-ac3bea8e6a57-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.205452 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpksr\" (UniqueName: \"kubernetes.io/projected/66c4786f-4507-4cfc-af84-ac3bea8e6a57-kube-api-access-jpksr\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.378250 4975 generic.go:334] "Generic (PLEG): container finished" podID="65f54532-e4a2-4f73-b254-aa7e0bd6a04e" containerID="9f4abdac8a295aa8a7aafa09479fde44299b6b5d719a5e7717f3048eee20c81c" exitCode=0 Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.378329 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563956-lqdr4" event={"ID":"65f54532-e4a2-4f73-b254-aa7e0bd6a04e","Type":"ContainerDied","Data":"9f4abdac8a295aa8a7aafa09479fde44299b6b5d719a5e7717f3048eee20c81c"} Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.380821 4975 generic.go:334] "Generic (PLEG): container finished" podID="66c4786f-4507-4cfc-af84-ac3bea8e6a57" containerID="7d3951632b86bc19f6cda40fba466cfd510320f8eb5fd5b97d7018cb94f0e9d0" exitCode=0 Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 
12:36:03.380904 4975 generic.go:334] "Generic (PLEG): container finished" podID="66c4786f-4507-4cfc-af84-ac3bea8e6a57" containerID="20ec7d51cca3bac172ccd4d7cdd793cc0f8f245db030e259ccb06afad35292aa" exitCode=143 Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.381942 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.386980 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66c4786f-4507-4cfc-af84-ac3bea8e6a57","Type":"ContainerDied","Data":"7d3951632b86bc19f6cda40fba466cfd510320f8eb5fd5b97d7018cb94f0e9d0"} Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.387043 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66c4786f-4507-4cfc-af84-ac3bea8e6a57","Type":"ContainerDied","Data":"20ec7d51cca3bac172ccd4d7cdd793cc0f8f245db030e259ccb06afad35292aa"} Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.387056 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66c4786f-4507-4cfc-af84-ac3bea8e6a57","Type":"ContainerDied","Data":"3ef10066fcd75fe01b77e3d4283c0b78fd988b158ca549879f9f41a0bc286413"} Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.387090 4975 scope.go:117] "RemoveContainer" containerID="7d3951632b86bc19f6cda40fba466cfd510320f8eb5fd5b97d7018cb94f0e9d0" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.432081 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.440361 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.448842 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:36:03 crc kubenswrapper[4975]: E0318 12:36:03.449275 4975 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c4786f-4507-4cfc-af84-ac3bea8e6a57" containerName="nova-metadata-metadata" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.449290 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c4786f-4507-4cfc-af84-ac3bea8e6a57" containerName="nova-metadata-metadata" Mar 18 12:36:03 crc kubenswrapper[4975]: E0318 12:36:03.449301 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c4786f-4507-4cfc-af84-ac3bea8e6a57" containerName="nova-metadata-log" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.449306 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c4786f-4507-4cfc-af84-ac3bea8e6a57" containerName="nova-metadata-log" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.449493 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="66c4786f-4507-4cfc-af84-ac3bea8e6a57" containerName="nova-metadata-metadata" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.449505 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="66c4786f-4507-4cfc-af84-ac3bea8e6a57" containerName="nova-metadata-log" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.450359 4975 scope.go:117] "RemoveContainer" containerID="20ec7d51cca3bac172ccd4d7cdd793cc0f8f245db030e259ccb06afad35292aa" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.450535 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.453064 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.453067 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.494757 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.532522 4975 scope.go:117] "RemoveContainer" containerID="7d3951632b86bc19f6cda40fba466cfd510320f8eb5fd5b97d7018cb94f0e9d0" Mar 18 12:36:03 crc kubenswrapper[4975]: E0318 12:36:03.533087 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d3951632b86bc19f6cda40fba466cfd510320f8eb5fd5b97d7018cb94f0e9d0\": container with ID starting with 7d3951632b86bc19f6cda40fba466cfd510320f8eb5fd5b97d7018cb94f0e9d0 not found: ID does not exist" containerID="7d3951632b86bc19f6cda40fba466cfd510320f8eb5fd5b97d7018cb94f0e9d0" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.533133 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d3951632b86bc19f6cda40fba466cfd510320f8eb5fd5b97d7018cb94f0e9d0"} err="failed to get container status \"7d3951632b86bc19f6cda40fba466cfd510320f8eb5fd5b97d7018cb94f0e9d0\": rpc error: code = NotFound desc = could not find container \"7d3951632b86bc19f6cda40fba466cfd510320f8eb5fd5b97d7018cb94f0e9d0\": container with ID starting with 7d3951632b86bc19f6cda40fba466cfd510320f8eb5fd5b97d7018cb94f0e9d0 not found: ID does not exist" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.533157 4975 scope.go:117] "RemoveContainer" containerID="20ec7d51cca3bac172ccd4d7cdd793cc0f8f245db030e259ccb06afad35292aa" Mar 18 12:36:03 crc 
kubenswrapper[4975]: E0318 12:36:03.533459 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20ec7d51cca3bac172ccd4d7cdd793cc0f8f245db030e259ccb06afad35292aa\": container with ID starting with 20ec7d51cca3bac172ccd4d7cdd793cc0f8f245db030e259ccb06afad35292aa not found: ID does not exist" containerID="20ec7d51cca3bac172ccd4d7cdd793cc0f8f245db030e259ccb06afad35292aa" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.533480 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20ec7d51cca3bac172ccd4d7cdd793cc0f8f245db030e259ccb06afad35292aa"} err="failed to get container status \"20ec7d51cca3bac172ccd4d7cdd793cc0f8f245db030e259ccb06afad35292aa\": rpc error: code = NotFound desc = could not find container \"20ec7d51cca3bac172ccd4d7cdd793cc0f8f245db030e259ccb06afad35292aa\": container with ID starting with 20ec7d51cca3bac172ccd4d7cdd793cc0f8f245db030e259ccb06afad35292aa not found: ID does not exist" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.533493 4975 scope.go:117] "RemoveContainer" containerID="7d3951632b86bc19f6cda40fba466cfd510320f8eb5fd5b97d7018cb94f0e9d0" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.533740 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d3951632b86bc19f6cda40fba466cfd510320f8eb5fd5b97d7018cb94f0e9d0"} err="failed to get container status \"7d3951632b86bc19f6cda40fba466cfd510320f8eb5fd5b97d7018cb94f0e9d0\": rpc error: code = NotFound desc = could not find container \"7d3951632b86bc19f6cda40fba466cfd510320f8eb5fd5b97d7018cb94f0e9d0\": container with ID starting with 7d3951632b86bc19f6cda40fba466cfd510320f8eb5fd5b97d7018cb94f0e9d0 not found: ID does not exist" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.533758 4975 scope.go:117] "RemoveContainer" containerID="20ec7d51cca3bac172ccd4d7cdd793cc0f8f245db030e259ccb06afad35292aa" Mar 18 
12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.534037 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20ec7d51cca3bac172ccd4d7cdd793cc0f8f245db030e259ccb06afad35292aa"} err="failed to get container status \"20ec7d51cca3bac172ccd4d7cdd793cc0f8f245db030e259ccb06afad35292aa\": rpc error: code = NotFound desc = could not find container \"20ec7d51cca3bac172ccd4d7cdd793cc0f8f245db030e259ccb06afad35292aa\": container with ID starting with 20ec7d51cca3bac172ccd4d7cdd793cc0f8f245db030e259ccb06afad35292aa not found: ID does not exist" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.612404 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d190901-e195-45a6-8c3b-7542587656aa-logs\") pod \"nova-metadata-0\" (UID: \"8d190901-e195-45a6-8c3b-7542587656aa\") " pod="openstack/nova-metadata-0" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.612484 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d190901-e195-45a6-8c3b-7542587656aa-config-data\") pod \"nova-metadata-0\" (UID: \"8d190901-e195-45a6-8c3b-7542587656aa\") " pod="openstack/nova-metadata-0" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.612529 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d190901-e195-45a6-8c3b-7542587656aa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8d190901-e195-45a6-8c3b-7542587656aa\") " pod="openstack/nova-metadata-0" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.612576 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d190901-e195-45a6-8c3b-7542587656aa-nova-metadata-tls-certs\") 
pod \"nova-metadata-0\" (UID: \"8d190901-e195-45a6-8c3b-7542587656aa\") " pod="openstack/nova-metadata-0" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.612616 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftv4b\" (UniqueName: \"kubernetes.io/projected/8d190901-e195-45a6-8c3b-7542587656aa-kube-api-access-ftv4b\") pod \"nova-metadata-0\" (UID: \"8d190901-e195-45a6-8c3b-7542587656aa\") " pod="openstack/nova-metadata-0" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.715544 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d190901-e195-45a6-8c3b-7542587656aa-config-data\") pod \"nova-metadata-0\" (UID: \"8d190901-e195-45a6-8c3b-7542587656aa\") " pod="openstack/nova-metadata-0" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.715640 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d190901-e195-45a6-8c3b-7542587656aa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8d190901-e195-45a6-8c3b-7542587656aa\") " pod="openstack/nova-metadata-0" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.715720 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d190901-e195-45a6-8c3b-7542587656aa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8d190901-e195-45a6-8c3b-7542587656aa\") " pod="openstack/nova-metadata-0" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.715747 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftv4b\" (UniqueName: \"kubernetes.io/projected/8d190901-e195-45a6-8c3b-7542587656aa-kube-api-access-ftv4b\") pod \"nova-metadata-0\" (UID: \"8d190901-e195-45a6-8c3b-7542587656aa\") " pod="openstack/nova-metadata-0" Mar 18 12:36:03 crc 
kubenswrapper[4975]: I0318 12:36:03.715974 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d190901-e195-45a6-8c3b-7542587656aa-logs\") pod \"nova-metadata-0\" (UID: \"8d190901-e195-45a6-8c3b-7542587656aa\") " pod="openstack/nova-metadata-0" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.716640 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d190901-e195-45a6-8c3b-7542587656aa-logs\") pod \"nova-metadata-0\" (UID: \"8d190901-e195-45a6-8c3b-7542587656aa\") " pod="openstack/nova-metadata-0" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.721192 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d190901-e195-45a6-8c3b-7542587656aa-config-data\") pod \"nova-metadata-0\" (UID: \"8d190901-e195-45a6-8c3b-7542587656aa\") " pod="openstack/nova-metadata-0" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.723008 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d190901-e195-45a6-8c3b-7542587656aa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8d190901-e195-45a6-8c3b-7542587656aa\") " pod="openstack/nova-metadata-0" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.723268 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d190901-e195-45a6-8c3b-7542587656aa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8d190901-e195-45a6-8c3b-7542587656aa\") " pod="openstack/nova-metadata-0" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.740131 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftv4b\" (UniqueName: \"kubernetes.io/projected/8d190901-e195-45a6-8c3b-7542587656aa-kube-api-access-ftv4b\") 
pod \"nova-metadata-0\" (UID: \"8d190901-e195-45a6-8c3b-7542587656aa\") " pod="openstack/nova-metadata-0" Mar 18 12:36:03 crc kubenswrapper[4975]: I0318 12:36:03.834576 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:36:04 crc kubenswrapper[4975]: I0318 12:36:04.284180 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:36:04 crc kubenswrapper[4975]: W0318 12:36:04.286530 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d190901_e195_45a6_8c3b_7542587656aa.slice/crio-cbac04ce68069018de19e3445c7b2c1b468b2edf0ee427b76aa1d077cea8d6b6 WatchSource:0}: Error finding container cbac04ce68069018de19e3445c7b2c1b468b2edf0ee427b76aa1d077cea8d6b6: Status 404 returned error can't find the container with id cbac04ce68069018de19e3445c7b2c1b468b2edf0ee427b76aa1d077cea8d6b6 Mar 18 12:36:04 crc kubenswrapper[4975]: I0318 12:36:04.397681 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d190901-e195-45a6-8c3b-7542587656aa","Type":"ContainerStarted","Data":"cbac04ce68069018de19e3445c7b2c1b468b2edf0ee427b76aa1d077cea8d6b6"} Mar 18 12:36:04 crc kubenswrapper[4975]: I0318 12:36:04.727616 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563956-lqdr4" Mar 18 12:36:04 crc kubenswrapper[4975]: I0318 12:36:04.865635 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz665\" (UniqueName: \"kubernetes.io/projected/65f54532-e4a2-4f73-b254-aa7e0bd6a04e-kube-api-access-wz665\") pod \"65f54532-e4a2-4f73-b254-aa7e0bd6a04e\" (UID: \"65f54532-e4a2-4f73-b254-aa7e0bd6a04e\") " Mar 18 12:36:04 crc kubenswrapper[4975]: I0318 12:36:04.872306 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65f54532-e4a2-4f73-b254-aa7e0bd6a04e-kube-api-access-wz665" (OuterVolumeSpecName: "kube-api-access-wz665") pod "65f54532-e4a2-4f73-b254-aa7e0bd6a04e" (UID: "65f54532-e4a2-4f73-b254-aa7e0bd6a04e"). InnerVolumeSpecName "kube-api-access-wz665". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:36:04 crc kubenswrapper[4975]: I0318 12:36:04.968013 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz665\" (UniqueName: \"kubernetes.io/projected/65f54532-e4a2-4f73-b254-aa7e0bd6a04e-kube-api-access-wz665\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:05 crc kubenswrapper[4975]: I0318 12:36:05.030664 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66c4786f-4507-4cfc-af84-ac3bea8e6a57" path="/var/lib/kubelet/pods/66c4786f-4507-4cfc-af84-ac3bea8e6a57/volumes" Mar 18 12:36:05 crc kubenswrapper[4975]: I0318 12:36:05.409322 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563956-lqdr4" event={"ID":"65f54532-e4a2-4f73-b254-aa7e0bd6a04e","Type":"ContainerDied","Data":"ddcac0e0175101b0374e4663efe55ca5a429803035af5d0bfcaafbd8f6c60625"} Mar 18 12:36:05 crc kubenswrapper[4975]: I0318 12:36:05.409374 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddcac0e0175101b0374e4663efe55ca5a429803035af5d0bfcaafbd8f6c60625" Mar 18 12:36:05 
crc kubenswrapper[4975]: I0318 12:36:05.409446 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563956-lqdr4" Mar 18 12:36:05 crc kubenswrapper[4975]: I0318 12:36:05.412468 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d190901-e195-45a6-8c3b-7542587656aa","Type":"ContainerStarted","Data":"353f8c333ee29368bff77e13914765290f76dbed49661ea9b0e7d3125c98182e"} Mar 18 12:36:05 crc kubenswrapper[4975]: I0318 12:36:05.412537 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d190901-e195-45a6-8c3b-7542587656aa","Type":"ContainerStarted","Data":"ee4854d43fe726503e83753624561106a5422ef10509bf96eb98c37526fc5a29"} Mar 18 12:36:05 crc kubenswrapper[4975]: I0318 12:36:05.433000 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.432976505 podStartE2EDuration="2.432976505s" podCreationTimestamp="2026-03-18 12:36:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:36:05.427697419 +0000 UTC m=+1551.142097998" watchObservedRunningTime="2026-03-18 12:36:05.432976505 +0000 UTC m=+1551.147377084" Mar 18 12:36:05 crc kubenswrapper[4975]: I0318 12:36:05.447673 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:36:05 crc kubenswrapper[4975]: I0318 12:36:05.773629 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 12:36:05 crc kubenswrapper[4975]: I0318 12:36:05.773784 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 12:36:05 crc kubenswrapper[4975]: I0318 12:36:05.814312 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-infra/auto-csr-approver-29563950-4n242"] Mar 18 12:36:05 crc kubenswrapper[4975]: I0318 12:36:05.822256 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563950-4n242"] Mar 18 12:36:05 crc kubenswrapper[4975]: I0318 12:36:05.826537 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 12:36:05 crc kubenswrapper[4975]: I0318 12:36:05.826785 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 12:36:05 crc kubenswrapper[4975]: I0318 12:36:05.826805 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 12:36:05 crc kubenswrapper[4975]: I0318 12:36:05.971078 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.062243 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-6427m"] Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.063552 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" podUID="1d5e1038-c27a-4d75-bce4-997028e690eb" containerName="dnsmasq-dns" containerID="cri-o://009864e170d37ee943f4531cd6bc15601a316a6769327113cb2eaced6d4816b6" gracePeriod=10 Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.427683 4975 generic.go:334] "Generic (PLEG): container finished" podID="1d5e1038-c27a-4d75-bce4-997028e690eb" containerID="009864e170d37ee943f4531cd6bc15601a316a6769327113cb2eaced6d4816b6" exitCode=0 Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.427763 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" 
event={"ID":"1d5e1038-c27a-4d75-bce4-997028e690eb","Type":"ContainerDied","Data":"009864e170d37ee943f4531cd6bc15601a316a6769327113cb2eaced6d4816b6"} Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.429789 4975 generic.go:334] "Generic (PLEG): container finished" podID="1b1267e7-5d6a-43f3-aa6b-e864972b60f6" containerID="17303a59b3af9e610f1de5900d220c97ff01518d52ba803677b9a012c93e84fb" exitCode=0 Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.430010 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-x6ztn" event={"ID":"1b1267e7-5d6a-43f3-aa6b-e864972b60f6","Type":"ContainerDied","Data":"17303a59b3af9e610f1de5900d220c97ff01518d52ba803677b9a012c93e84fb"} Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.471337 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.578387 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.632899 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k566f\" (UniqueName: \"kubernetes.io/projected/1d5e1038-c27a-4d75-bce4-997028e690eb-kube-api-access-k566f\") pod \"1d5e1038-c27a-4d75-bce4-997028e690eb\" (UID: \"1d5e1038-c27a-4d75-bce4-997028e690eb\") " Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.632940 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-ovsdbserver-sb\") pod \"1d5e1038-c27a-4d75-bce4-997028e690eb\" (UID: \"1d5e1038-c27a-4d75-bce4-997028e690eb\") " Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.633089 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-dns-swift-storage-0\") pod \"1d5e1038-c27a-4d75-bce4-997028e690eb\" (UID: \"1d5e1038-c27a-4d75-bce4-997028e690eb\") " Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.633109 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-dns-svc\") pod \"1d5e1038-c27a-4d75-bce4-997028e690eb\" (UID: \"1d5e1038-c27a-4d75-bce4-997028e690eb\") " Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.633159 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-ovsdbserver-nb\") pod \"1d5e1038-c27a-4d75-bce4-997028e690eb\" (UID: \"1d5e1038-c27a-4d75-bce4-997028e690eb\") " Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.633187 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-config\") pod \"1d5e1038-c27a-4d75-bce4-997028e690eb\" (UID: \"1d5e1038-c27a-4d75-bce4-997028e690eb\") " Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.667485 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d5e1038-c27a-4d75-bce4-997028e690eb-kube-api-access-k566f" (OuterVolumeSpecName: "kube-api-access-k566f") pod "1d5e1038-c27a-4d75-bce4-997028e690eb" (UID: "1d5e1038-c27a-4d75-bce4-997028e690eb"). InnerVolumeSpecName "kube-api-access-k566f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.708520 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1d5e1038-c27a-4d75-bce4-997028e690eb" (UID: "1d5e1038-c27a-4d75-bce4-997028e690eb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.709986 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1d5e1038-c27a-4d75-bce4-997028e690eb" (UID: "1d5e1038-c27a-4d75-bce4-997028e690eb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.716348 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-config" (OuterVolumeSpecName: "config") pod "1d5e1038-c27a-4d75-bce4-997028e690eb" (UID: "1d5e1038-c27a-4d75-bce4-997028e690eb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.727081 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1d5e1038-c27a-4d75-bce4-997028e690eb" (UID: "1d5e1038-c27a-4d75-bce4-997028e690eb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.734632 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1d5e1038-c27a-4d75-bce4-997028e690eb" (UID: "1d5e1038-c27a-4d75-bce4-997028e690eb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.734909 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-dns-swift-storage-0\") pod \"1d5e1038-c27a-4d75-bce4-997028e690eb\" (UID: \"1d5e1038-c27a-4d75-bce4-997028e690eb\") " Mar 18 12:36:06 crc kubenswrapper[4975]: W0318 12:36:06.735077 4975 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/1d5e1038-c27a-4d75-bce4-997028e690eb/volumes/kubernetes.io~configmap/dns-swift-storage-0 Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.735095 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1d5e1038-c27a-4d75-bce4-997028e690eb" (UID: "1d5e1038-c27a-4d75-bce4-997028e690eb"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.736002 4975 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.736025 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.736038 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k566f\" (UniqueName: \"kubernetes.io/projected/1d5e1038-c27a-4d75-bce4-997028e690eb-kube-api-access-k566f\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.736057 4975 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.736068 4975 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.736079 4975 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d5e1038-c27a-4d75-bce4-997028e690eb-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.868211 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0cfaf200-320b-4866-91d3-0a526ad54da3" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": context deadline exceeded (Client.Timeout exceeded while 
awaiting headers)" Mar 18 12:36:06 crc kubenswrapper[4975]: I0318 12:36:06.868212 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0cfaf200-320b-4866-91d3-0a526ad54da3" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 12:36:07 crc kubenswrapper[4975]: I0318 12:36:07.055935 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28462947-5e88-43df-a70b-1c4e5b99215c" path="/var/lib/kubelet/pods/28462947-5e88-43df-a70b-1c4e5b99215c/volumes" Mar 18 12:36:07 crc kubenswrapper[4975]: I0318 12:36:07.442522 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" event={"ID":"1d5e1038-c27a-4d75-bce4-997028e690eb","Type":"ContainerDied","Data":"2958a36efedc27043120bb88e4de2b97fbf01809ec971065fd5b0f97b7e2cbd7"} Mar 18 12:36:07 crc kubenswrapper[4975]: I0318 12:36:07.442596 4975 scope.go:117] "RemoveContainer" containerID="009864e170d37ee943f4531cd6bc15601a316a6769327113cb2eaced6d4816b6" Mar 18 12:36:07 crc kubenswrapper[4975]: I0318 12:36:07.442750 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-6427m" Mar 18 12:36:07 crc kubenswrapper[4975]: I0318 12:36:07.447722 4975 generic.go:334] "Generic (PLEG): container finished" podID="bba9a490-9803-4bc8-a775-1d8711fd3b41" containerID="a142026bbb15be57597ac55d56a54c57db2e165d7de332634d1a83cf4f8cfbc4" exitCode=0 Mar 18 12:36:07 crc kubenswrapper[4975]: I0318 12:36:07.447801 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-c2rjz" event={"ID":"bba9a490-9803-4bc8-a775-1d8711fd3b41","Type":"ContainerDied","Data":"a142026bbb15be57597ac55d56a54c57db2e165d7de332634d1a83cf4f8cfbc4"} Mar 18 12:36:07 crc kubenswrapper[4975]: I0318 12:36:07.490145 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-6427m"] Mar 18 12:36:07 crc kubenswrapper[4975]: I0318 12:36:07.505739 4975 scope.go:117] "RemoveContainer" containerID="69684312662f8c0fcd5275e7a89baece158ab301af1491394963ecf7107944d1" Mar 18 12:36:07 crc kubenswrapper[4975]: I0318 12:36:07.512607 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-6427m"] Mar 18 12:36:07 crc kubenswrapper[4975]: I0318 12:36:07.842411 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x6ztn" Mar 18 12:36:07 crc kubenswrapper[4975]: I0318 12:36:07.958947 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b1267e7-5d6a-43f3-aa6b-e864972b60f6-combined-ca-bundle\") pod \"1b1267e7-5d6a-43f3-aa6b-e864972b60f6\" (UID: \"1b1267e7-5d6a-43f3-aa6b-e864972b60f6\") " Mar 18 12:36:07 crc kubenswrapper[4975]: I0318 12:36:07.959020 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b1267e7-5d6a-43f3-aa6b-e864972b60f6-scripts\") pod \"1b1267e7-5d6a-43f3-aa6b-e864972b60f6\" (UID: \"1b1267e7-5d6a-43f3-aa6b-e864972b60f6\") " Mar 18 12:36:07 crc kubenswrapper[4975]: I0318 12:36:07.959094 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b1267e7-5d6a-43f3-aa6b-e864972b60f6-config-data\") pod \"1b1267e7-5d6a-43f3-aa6b-e864972b60f6\" (UID: \"1b1267e7-5d6a-43f3-aa6b-e864972b60f6\") " Mar 18 12:36:07 crc kubenswrapper[4975]: I0318 12:36:07.959206 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klmw8\" (UniqueName: \"kubernetes.io/projected/1b1267e7-5d6a-43f3-aa6b-e864972b60f6-kube-api-access-klmw8\") pod \"1b1267e7-5d6a-43f3-aa6b-e864972b60f6\" (UID: \"1b1267e7-5d6a-43f3-aa6b-e864972b60f6\") " Mar 18 12:36:07 crc kubenswrapper[4975]: I0318 12:36:07.963712 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b1267e7-5d6a-43f3-aa6b-e864972b60f6-kube-api-access-klmw8" (OuterVolumeSpecName: "kube-api-access-klmw8") pod "1b1267e7-5d6a-43f3-aa6b-e864972b60f6" (UID: "1b1267e7-5d6a-43f3-aa6b-e864972b60f6"). InnerVolumeSpecName "kube-api-access-klmw8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:36:07 crc kubenswrapper[4975]: I0318 12:36:07.979876 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b1267e7-5d6a-43f3-aa6b-e864972b60f6-scripts" (OuterVolumeSpecName: "scripts") pod "1b1267e7-5d6a-43f3-aa6b-e864972b60f6" (UID: "1b1267e7-5d6a-43f3-aa6b-e864972b60f6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:07 crc kubenswrapper[4975]: I0318 12:36:07.992059 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b1267e7-5d6a-43f3-aa6b-e864972b60f6-config-data" (OuterVolumeSpecName: "config-data") pod "1b1267e7-5d6a-43f3-aa6b-e864972b60f6" (UID: "1b1267e7-5d6a-43f3-aa6b-e864972b60f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:07 crc kubenswrapper[4975]: I0318 12:36:07.992261 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b1267e7-5d6a-43f3-aa6b-e864972b60f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b1267e7-5d6a-43f3-aa6b-e864972b60f6" (UID: "1b1267e7-5d6a-43f3-aa6b-e864972b60f6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:08 crc kubenswrapper[4975]: I0318 12:36:08.061341 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klmw8\" (UniqueName: \"kubernetes.io/projected/1b1267e7-5d6a-43f3-aa6b-e864972b60f6-kube-api-access-klmw8\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:08 crc kubenswrapper[4975]: I0318 12:36:08.061389 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b1267e7-5d6a-43f3-aa6b-e864972b60f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:08 crc kubenswrapper[4975]: I0318 12:36:08.061405 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b1267e7-5d6a-43f3-aa6b-e864972b60f6-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:08 crc kubenswrapper[4975]: I0318 12:36:08.061418 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b1267e7-5d6a-43f3-aa6b-e864972b60f6-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:08 crc kubenswrapper[4975]: I0318 12:36:08.464576 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x6ztn" Mar 18 12:36:08 crc kubenswrapper[4975]: I0318 12:36:08.465962 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-x6ztn" event={"ID":"1b1267e7-5d6a-43f3-aa6b-e864972b60f6","Type":"ContainerDied","Data":"5afda2c5bded96e63bc3c022cdaba97c7ed8cee0cf588f25dd5ca02aaa5375a8"} Mar 18 12:36:08 crc kubenswrapper[4975]: I0318 12:36:08.466016 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5afda2c5bded96e63bc3c022cdaba97c7ed8cee0cf588f25dd5ca02aaa5375a8" Mar 18 12:36:08 crc kubenswrapper[4975]: I0318 12:36:08.636238 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:36:08 crc kubenswrapper[4975]: I0318 12:36:08.636494 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0cfaf200-320b-4866-91d3-0a526ad54da3" containerName="nova-api-log" containerID="cri-o://b77e810d3b113694a9cd9bdd17555e4af68a6499707f894c9f4a3501b252d961" gracePeriod=30 Mar 18 12:36:08 crc kubenswrapper[4975]: I0318 12:36:08.636571 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0cfaf200-320b-4866-91d3-0a526ad54da3" containerName="nova-api-api" containerID="cri-o://7ddb073d3c900d530fce42926de174d9036ac55e433112b07e6ea8022c4f02e0" gracePeriod=30 Mar 18 12:36:08 crc kubenswrapper[4975]: I0318 12:36:08.652981 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:36:08 crc kubenswrapper[4975]: I0318 12:36:08.713575 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:36:08 crc kubenswrapper[4975]: I0318 12:36:08.713899 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8d190901-e195-45a6-8c3b-7542587656aa" containerName="nova-metadata-log" 
containerID="cri-o://ee4854d43fe726503e83753624561106a5422ef10509bf96eb98c37526fc5a29" gracePeriod=30 Mar 18 12:36:08 crc kubenswrapper[4975]: I0318 12:36:08.714033 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8d190901-e195-45a6-8c3b-7542587656aa" containerName="nova-metadata-metadata" containerID="cri-o://353f8c333ee29368bff77e13914765290f76dbed49661ea9b0e7d3125c98182e" gracePeriod=30 Mar 18 12:36:08 crc kubenswrapper[4975]: E0318 12:36:08.778682 4975 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cfaf200_320b_4866_91d3_0a526ad54da3.slice/crio-b77e810d3b113694a9cd9bdd17555e4af68a6499707f894c9f4a3501b252d961.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d190901_e195_45a6_8c3b_7542587656aa.slice/crio-conmon-ee4854d43fe726503e83753624561106a5422ef10509bf96eb98c37526fc5a29.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cfaf200_320b_4866_91d3_0a526ad54da3.slice/crio-conmon-b77e810d3b113694a9cd9bdd17555e4af68a6499707f894c9f4a3501b252d961.scope\": RecentStats: unable to find data in memory cache]" Mar 18 12:36:08 crc kubenswrapper[4975]: I0318 12:36:08.913817 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-c2rjz" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.025978 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d5e1038-c27a-4d75-bce4-997028e690eb" path="/var/lib/kubelet/pods/1d5e1038-c27a-4d75-bce4-997028e690eb/volumes" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.085115 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bba9a490-9803-4bc8-a775-1d8711fd3b41-scripts\") pod \"bba9a490-9803-4bc8-a775-1d8711fd3b41\" (UID: \"bba9a490-9803-4bc8-a775-1d8711fd3b41\") " Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.085547 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbkk2\" (UniqueName: \"kubernetes.io/projected/bba9a490-9803-4bc8-a775-1d8711fd3b41-kube-api-access-sbkk2\") pod \"bba9a490-9803-4bc8-a775-1d8711fd3b41\" (UID: \"bba9a490-9803-4bc8-a775-1d8711fd3b41\") " Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.085640 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bba9a490-9803-4bc8-a775-1d8711fd3b41-combined-ca-bundle\") pod \"bba9a490-9803-4bc8-a775-1d8711fd3b41\" (UID: \"bba9a490-9803-4bc8-a775-1d8711fd3b41\") " Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.085783 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bba9a490-9803-4bc8-a775-1d8711fd3b41-config-data\") pod \"bba9a490-9803-4bc8-a775-1d8711fd3b41\" (UID: \"bba9a490-9803-4bc8-a775-1d8711fd3b41\") " Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.093022 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bba9a490-9803-4bc8-a775-1d8711fd3b41-kube-api-access-sbkk2" (OuterVolumeSpecName: 
"kube-api-access-sbkk2") pod "bba9a490-9803-4bc8-a775-1d8711fd3b41" (UID: "bba9a490-9803-4bc8-a775-1d8711fd3b41"). InnerVolumeSpecName "kube-api-access-sbkk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.093056 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bba9a490-9803-4bc8-a775-1d8711fd3b41-scripts" (OuterVolumeSpecName: "scripts") pod "bba9a490-9803-4bc8-a775-1d8711fd3b41" (UID: "bba9a490-9803-4bc8-a775-1d8711fd3b41"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.115436 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bba9a490-9803-4bc8-a775-1d8711fd3b41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bba9a490-9803-4bc8-a775-1d8711fd3b41" (UID: "bba9a490-9803-4bc8-a775-1d8711fd3b41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.117097 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bba9a490-9803-4bc8-a775-1d8711fd3b41-config-data" (OuterVolumeSpecName: "config-data") pod "bba9a490-9803-4bc8-a775-1d8711fd3b41" (UID: "bba9a490-9803-4bc8-a775-1d8711fd3b41"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.188345 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbkk2\" (UniqueName: \"kubernetes.io/projected/bba9a490-9803-4bc8-a775-1d8711fd3b41-kube-api-access-sbkk2\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.188389 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bba9a490-9803-4bc8-a775-1d8711fd3b41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.188404 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bba9a490-9803-4bc8-a775-1d8711fd3b41-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.188414 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bba9a490-9803-4bc8-a775-1d8711fd3b41-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.483739 4975 generic.go:334] "Generic (PLEG): container finished" podID="8d190901-e195-45a6-8c3b-7542587656aa" containerID="353f8c333ee29368bff77e13914765290f76dbed49661ea9b0e7d3125c98182e" exitCode=0 Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.483774 4975 generic.go:334] "Generic (PLEG): container finished" podID="8d190901-e195-45a6-8c3b-7542587656aa" containerID="ee4854d43fe726503e83753624561106a5422ef10509bf96eb98c37526fc5a29" exitCode=143 Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.483812 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d190901-e195-45a6-8c3b-7542587656aa","Type":"ContainerDied","Data":"353f8c333ee29368bff77e13914765290f76dbed49661ea9b0e7d3125c98182e"} Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.483844 4975 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d190901-e195-45a6-8c3b-7542587656aa","Type":"ContainerDied","Data":"ee4854d43fe726503e83753624561106a5422ef10509bf96eb98c37526fc5a29"} Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.485378 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-c2rjz" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.485380 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-c2rjz" event={"ID":"bba9a490-9803-4bc8-a775-1d8711fd3b41","Type":"ContainerDied","Data":"aa2fd702fe8c5670c6548dbeb426b1f3aca35a6166abecde2bc52165fd09d2b7"} Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.485436 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa2fd702fe8c5670c6548dbeb426b1f3aca35a6166abecde2bc52165fd09d2b7" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.486985 4975 generic.go:334] "Generic (PLEG): container finished" podID="0cfaf200-320b-4866-91d3-0a526ad54da3" containerID="b77e810d3b113694a9cd9bdd17555e4af68a6499707f894c9f4a3501b252d961" exitCode=143 Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.487055 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0cfaf200-320b-4866-91d3-0a526ad54da3","Type":"ContainerDied","Data":"b77e810d3b113694a9cd9bdd17555e4af68a6499707f894c9f4a3501b252d961"} Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.487184 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="91c2909b-ac35-4304-a0cb-571293d436e0" containerName="nova-scheduler-scheduler" containerID="cri-o://c416020b5a7783b2dc257e98e8a8c7bba8638d8b01679d9b17a2013331ec7ad2" gracePeriod=30 Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.537529 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.590401 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 12:36:09 crc kubenswrapper[4975]: E0318 12:36:09.590907 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d190901-e195-45a6-8c3b-7542587656aa" containerName="nova-metadata-log" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.590931 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d190901-e195-45a6-8c3b-7542587656aa" containerName="nova-metadata-log" Mar 18 12:36:09 crc kubenswrapper[4975]: E0318 12:36:09.590949 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d190901-e195-45a6-8c3b-7542587656aa" containerName="nova-metadata-metadata" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.590957 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d190901-e195-45a6-8c3b-7542587656aa" containerName="nova-metadata-metadata" Mar 18 12:36:09 crc kubenswrapper[4975]: E0318 12:36:09.590971 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65f54532-e4a2-4f73-b254-aa7e0bd6a04e" containerName="oc" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.590981 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="65f54532-e4a2-4f73-b254-aa7e0bd6a04e" containerName="oc" Mar 18 12:36:09 crc kubenswrapper[4975]: E0318 12:36:09.591004 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d5e1038-c27a-4d75-bce4-997028e690eb" containerName="dnsmasq-dns" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.591013 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d5e1038-c27a-4d75-bce4-997028e690eb" containerName="dnsmasq-dns" Mar 18 12:36:09 crc kubenswrapper[4975]: E0318 12:36:09.591029 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d5e1038-c27a-4d75-bce4-997028e690eb" containerName="init" Mar 18 12:36:09 crc 
kubenswrapper[4975]: I0318 12:36:09.591037 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d5e1038-c27a-4d75-bce4-997028e690eb" containerName="init" Mar 18 12:36:09 crc kubenswrapper[4975]: E0318 12:36:09.591054 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b1267e7-5d6a-43f3-aa6b-e864972b60f6" containerName="nova-manage" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.591063 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b1267e7-5d6a-43f3-aa6b-e864972b60f6" containerName="nova-manage" Mar 18 12:36:09 crc kubenswrapper[4975]: E0318 12:36:09.591095 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bba9a490-9803-4bc8-a775-1d8711fd3b41" containerName="nova-cell1-conductor-db-sync" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.591105 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="bba9a490-9803-4bc8-a775-1d8711fd3b41" containerName="nova-cell1-conductor-db-sync" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.591323 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d190901-e195-45a6-8c3b-7542587656aa" containerName="nova-metadata-log" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.591341 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d5e1038-c27a-4d75-bce4-997028e690eb" containerName="dnsmasq-dns" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.601973 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="65f54532-e4a2-4f73-b254-aa7e0bd6a04e" containerName="oc" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.602029 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b1267e7-5d6a-43f3-aa6b-e864972b60f6" containerName="nova-manage" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.602068 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d190901-e195-45a6-8c3b-7542587656aa" containerName="nova-metadata-metadata" Mar 18 12:36:09 crc 
kubenswrapper[4975]: I0318 12:36:09.602081 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="bba9a490-9803-4bc8-a775-1d8711fd3b41" containerName="nova-cell1-conductor-db-sync" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.600647 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d190901-e195-45a6-8c3b-7542587656aa-combined-ca-bundle\") pod \"8d190901-e195-45a6-8c3b-7542587656aa\" (UID: \"8d190901-e195-45a6-8c3b-7542587656aa\") " Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.602629 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d190901-e195-45a6-8c3b-7542587656aa-logs\") pod \"8d190901-e195-45a6-8c3b-7542587656aa\" (UID: \"8d190901-e195-45a6-8c3b-7542587656aa\") " Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.603143 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.603177 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftv4b\" (UniqueName: \"kubernetes.io/projected/8d190901-e195-45a6-8c3b-7542587656aa-kube-api-access-ftv4b\") pod \"8d190901-e195-45a6-8c3b-7542587656aa\" (UID: \"8d190901-e195-45a6-8c3b-7542587656aa\") " Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.603239 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d190901-e195-45a6-8c3b-7542587656aa-config-data\") pod \"8d190901-e195-45a6-8c3b-7542587656aa\" (UID: \"8d190901-e195-45a6-8c3b-7542587656aa\") " Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.603372 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d190901-e195-45a6-8c3b-7542587656aa-nova-metadata-tls-certs\") pod \"8d190901-e195-45a6-8c3b-7542587656aa\" (UID: \"8d190901-e195-45a6-8c3b-7542587656aa\") " Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.603170 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d190901-e195-45a6-8c3b-7542587656aa-logs" (OuterVolumeSpecName: "logs") pod "8d190901-e195-45a6-8c3b-7542587656aa" (UID: "8d190901-e195-45a6-8c3b-7542587656aa"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.604226 4975 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d190901-e195-45a6-8c3b-7542587656aa-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.606934 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.609122 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d190901-e195-45a6-8c3b-7542587656aa-kube-api-access-ftv4b" (OuterVolumeSpecName: "kube-api-access-ftv4b") pod "8d190901-e195-45a6-8c3b-7542587656aa" (UID: "8d190901-e195-45a6-8c3b-7542587656aa"). InnerVolumeSpecName "kube-api-access-ftv4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.609341 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.639498 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d190901-e195-45a6-8c3b-7542587656aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d190901-e195-45a6-8c3b-7542587656aa" (UID: "8d190901-e195-45a6-8c3b-7542587656aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.641227 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d190901-e195-45a6-8c3b-7542587656aa-config-data" (OuterVolumeSpecName: "config-data") pod "8d190901-e195-45a6-8c3b-7542587656aa" (UID: "8d190901-e195-45a6-8c3b-7542587656aa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.675140 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d190901-e195-45a6-8c3b-7542587656aa-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "8d190901-e195-45a6-8c3b-7542587656aa" (UID: "8d190901-e195-45a6-8c3b-7542587656aa"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.705455 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca44478-1a7f-4e96-9025-5cdac5e90dcc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8ca44478-1a7f-4e96-9025-5cdac5e90dcc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.705566 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psgdb\" (UniqueName: \"kubernetes.io/projected/8ca44478-1a7f-4e96-9025-5cdac5e90dcc-kube-api-access-psgdb\") pod \"nova-cell1-conductor-0\" (UID: \"8ca44478-1a7f-4e96-9025-5cdac5e90dcc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.705789 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca44478-1a7f-4e96-9025-5cdac5e90dcc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8ca44478-1a7f-4e96-9025-5cdac5e90dcc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.705915 4975 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d190901-e195-45a6-8c3b-7542587656aa-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" 
Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.705934 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d190901-e195-45a6-8c3b-7542587656aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.705945 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftv4b\" (UniqueName: \"kubernetes.io/projected/8d190901-e195-45a6-8c3b-7542587656aa-kube-api-access-ftv4b\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.705957 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d190901-e195-45a6-8c3b-7542587656aa-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.807085 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca44478-1a7f-4e96-9025-5cdac5e90dcc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8ca44478-1a7f-4e96-9025-5cdac5e90dcc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.807209 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca44478-1a7f-4e96-9025-5cdac5e90dcc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8ca44478-1a7f-4e96-9025-5cdac5e90dcc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.807276 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psgdb\" (UniqueName: \"kubernetes.io/projected/8ca44478-1a7f-4e96-9025-5cdac5e90dcc-kube-api-access-psgdb\") pod \"nova-cell1-conductor-0\" (UID: \"8ca44478-1a7f-4e96-9025-5cdac5e90dcc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 12:36:09 crc kubenswrapper[4975]: 
I0318 12:36:09.811029 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca44478-1a7f-4e96-9025-5cdac5e90dcc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8ca44478-1a7f-4e96-9025-5cdac5e90dcc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.813383 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca44478-1a7f-4e96-9025-5cdac5e90dcc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8ca44478-1a7f-4e96-9025-5cdac5e90dcc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.825653 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psgdb\" (UniqueName: \"kubernetes.io/projected/8ca44478-1a7f-4e96-9025-5cdac5e90dcc-kube-api-access-psgdb\") pod \"nova-cell1-conductor-0\" (UID: \"8ca44478-1a7f-4e96-9025-5cdac5e90dcc\") " pod="openstack/nova-cell1-conductor-0" Mar 18 12:36:09 crc kubenswrapper[4975]: I0318 12:36:09.927011 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.374125 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.495309 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8ca44478-1a7f-4e96-9025-5cdac5e90dcc","Type":"ContainerStarted","Data":"49daa2f2a350b6c5c5d21a11888215f5101456342e4ee37dd6162a1a65c88f5f"} Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.497062 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d190901-e195-45a6-8c3b-7542587656aa","Type":"ContainerDied","Data":"cbac04ce68069018de19e3445c7b2c1b468b2edf0ee427b76aa1d077cea8d6b6"} Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.497107 4975 scope.go:117] "RemoveContainer" containerID="353f8c333ee29368bff77e13914765290f76dbed49661ea9b0e7d3125c98182e" Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.497145 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.524339 4975 scope.go:117] "RemoveContainer" containerID="ee4854d43fe726503e83753624561106a5422ef10509bf96eb98c37526fc5a29" Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.536650 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.548887 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.562399 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.572346 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.575189 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.575273 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.586037 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.622079 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/790e3570-2044-4dea-b894-5222e6d3d2e9-logs\") pod \"nova-metadata-0\" (UID: \"790e3570-2044-4dea-b894-5222e6d3d2e9\") " pod="openstack/nova-metadata-0" Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.622454 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/790e3570-2044-4dea-b894-5222e6d3d2e9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"790e3570-2044-4dea-b894-5222e6d3d2e9\") " pod="openstack/nova-metadata-0" Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.622638 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/790e3570-2044-4dea-b894-5222e6d3d2e9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"790e3570-2044-4dea-b894-5222e6d3d2e9\") " pod="openstack/nova-metadata-0" Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.622880 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/790e3570-2044-4dea-b894-5222e6d3d2e9-config-data\") pod \"nova-metadata-0\" (UID: 
\"790e3570-2044-4dea-b894-5222e6d3d2e9\") " pod="openstack/nova-metadata-0" Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.623027 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmczl\" (UniqueName: \"kubernetes.io/projected/790e3570-2044-4dea-b894-5222e6d3d2e9-kube-api-access-rmczl\") pod \"nova-metadata-0\" (UID: \"790e3570-2044-4dea-b894-5222e6d3d2e9\") " pod="openstack/nova-metadata-0" Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.725763 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/790e3570-2044-4dea-b894-5222e6d3d2e9-logs\") pod \"nova-metadata-0\" (UID: \"790e3570-2044-4dea-b894-5222e6d3d2e9\") " pod="openstack/nova-metadata-0" Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.725844 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/790e3570-2044-4dea-b894-5222e6d3d2e9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"790e3570-2044-4dea-b894-5222e6d3d2e9\") " pod="openstack/nova-metadata-0" Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.725929 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/790e3570-2044-4dea-b894-5222e6d3d2e9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"790e3570-2044-4dea-b894-5222e6d3d2e9\") " pod="openstack/nova-metadata-0" Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.725993 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/790e3570-2044-4dea-b894-5222e6d3d2e9-config-data\") pod \"nova-metadata-0\" (UID: \"790e3570-2044-4dea-b894-5222e6d3d2e9\") " pod="openstack/nova-metadata-0" Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.726054 4975 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmczl\" (UniqueName: \"kubernetes.io/projected/790e3570-2044-4dea-b894-5222e6d3d2e9-kube-api-access-rmczl\") pod \"nova-metadata-0\" (UID: \"790e3570-2044-4dea-b894-5222e6d3d2e9\") " pod="openstack/nova-metadata-0" Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.726531 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/790e3570-2044-4dea-b894-5222e6d3d2e9-logs\") pod \"nova-metadata-0\" (UID: \"790e3570-2044-4dea-b894-5222e6d3d2e9\") " pod="openstack/nova-metadata-0" Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.732765 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/790e3570-2044-4dea-b894-5222e6d3d2e9-config-data\") pod \"nova-metadata-0\" (UID: \"790e3570-2044-4dea-b894-5222e6d3d2e9\") " pod="openstack/nova-metadata-0" Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.732836 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/790e3570-2044-4dea-b894-5222e6d3d2e9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"790e3570-2044-4dea-b894-5222e6d3d2e9\") " pod="openstack/nova-metadata-0" Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.734547 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/790e3570-2044-4dea-b894-5222e6d3d2e9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"790e3570-2044-4dea-b894-5222e6d3d2e9\") " pod="openstack/nova-metadata-0" Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.758244 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmczl\" (UniqueName: \"kubernetes.io/projected/790e3570-2044-4dea-b894-5222e6d3d2e9-kube-api-access-rmczl\") pod 
\"nova-metadata-0\" (UID: \"790e3570-2044-4dea-b894-5222e6d3d2e9\") " pod="openstack/nova-metadata-0" Mar 18 12:36:10 crc kubenswrapper[4975]: E0318 12:36:10.776630 4975 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c416020b5a7783b2dc257e98e8a8c7bba8638d8b01679d9b17a2013331ec7ad2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 12:36:10 crc kubenswrapper[4975]: E0318 12:36:10.778580 4975 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c416020b5a7783b2dc257e98e8a8c7bba8638d8b01679d9b17a2013331ec7ad2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 12:36:10 crc kubenswrapper[4975]: E0318 12:36:10.780325 4975 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c416020b5a7783b2dc257e98e8a8c7bba8638d8b01679d9b17a2013331ec7ad2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 12:36:10 crc kubenswrapper[4975]: E0318 12:36:10.780368 4975 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="91c2909b-ac35-4304-a0cb-571293d436e0" containerName="nova-scheduler-scheduler" Mar 18 12:36:10 crc kubenswrapper[4975]: I0318 12:36:10.898278 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:36:11 crc kubenswrapper[4975]: I0318 12:36:11.035668 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d190901-e195-45a6-8c3b-7542587656aa" path="/var/lib/kubelet/pods/8d190901-e195-45a6-8c3b-7542587656aa/volumes" Mar 18 12:36:11 crc kubenswrapper[4975]: I0318 12:36:11.362148 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:36:11 crc kubenswrapper[4975]: W0318 12:36:11.369750 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod790e3570_2044_4dea_b894_5222e6d3d2e9.slice/crio-cca40968b1a608371c8dc90139e095d952c4e0f8f2008aba370817e169bb1c99 WatchSource:0}: Error finding container cca40968b1a608371c8dc90139e095d952c4e0f8f2008aba370817e169bb1c99: Status 404 returned error can't find the container with id cca40968b1a608371c8dc90139e095d952c4e0f8f2008aba370817e169bb1c99 Mar 18 12:36:11 crc kubenswrapper[4975]: I0318 12:36:11.507364 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8ca44478-1a7f-4e96-9025-5cdac5e90dcc","Type":"ContainerStarted","Data":"d7c626152a20dfdd036a020e2c4f2c01ca2550f6b7371649abd7d7b1de4b37d4"} Mar 18 12:36:11 crc kubenswrapper[4975]: I0318 12:36:11.507523 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 18 12:36:11 crc kubenswrapper[4975]: I0318 12:36:11.510081 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"790e3570-2044-4dea-b894-5222e6d3d2e9","Type":"ContainerStarted","Data":"cca40968b1a608371c8dc90139e095d952c4e0f8f2008aba370817e169bb1c99"} Mar 18 12:36:11 crc kubenswrapper[4975]: I0318 12:36:11.526086 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.525111105 
podStartE2EDuration="2.525111105s" podCreationTimestamp="2026-03-18 12:36:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:36:11.523568002 +0000 UTC m=+1557.237968581" watchObservedRunningTime="2026-03-18 12:36:11.525111105 +0000 UTC m=+1557.239511684" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.455833 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.527215 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"790e3570-2044-4dea-b894-5222e6d3d2e9","Type":"ContainerStarted","Data":"06acc6bac297a959033832efd94492f6989840385401550230eda0a70307a721"} Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.527263 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"790e3570-2044-4dea-b894-5222e6d3d2e9","Type":"ContainerStarted","Data":"b3807fde9eac2ceff5338dd8ffff1452492c8ee2dff35186198ee6e05ee13616"} Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.530232 4975 generic.go:334] "Generic (PLEG): container finished" podID="0cfaf200-320b-4866-91d3-0a526ad54da3" containerID="7ddb073d3c900d530fce42926de174d9036ac55e433112b07e6ea8022c4f02e0" exitCode=0 Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.530303 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0cfaf200-320b-4866-91d3-0a526ad54da3","Type":"ContainerDied","Data":"7ddb073d3c900d530fce42926de174d9036ac55e433112b07e6ea8022c4f02e0"} Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.530346 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0cfaf200-320b-4866-91d3-0a526ad54da3","Type":"ContainerDied","Data":"d99c392a0f5988c3a75860833b014dc15837eb9ce0bdcc621d0d940cdcf085e3"} Mar 18 12:36:12 crc 
kubenswrapper[4975]: I0318 12:36:12.530311 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.530370 4975 scope.go:117] "RemoveContainer" containerID="7ddb073d3c900d530fce42926de174d9036ac55e433112b07e6ea8022c4f02e0" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.552326 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.5523045460000002 podStartE2EDuration="2.552304546s" podCreationTimestamp="2026-03-18 12:36:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:36:12.55026843 +0000 UTC m=+1558.264669019" watchObservedRunningTime="2026-03-18 12:36:12.552304546 +0000 UTC m=+1558.266705125" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.557022 4975 scope.go:117] "RemoveContainer" containerID="b77e810d3b113694a9cd9bdd17555e4af68a6499707f894c9f4a3501b252d961" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.562295 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cfaf200-320b-4866-91d3-0a526ad54da3-logs\") pod \"0cfaf200-320b-4866-91d3-0a526ad54da3\" (UID: \"0cfaf200-320b-4866-91d3-0a526ad54da3\") " Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.562345 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cfaf200-320b-4866-91d3-0a526ad54da3-config-data\") pod \"0cfaf200-320b-4866-91d3-0a526ad54da3\" (UID: \"0cfaf200-320b-4866-91d3-0a526ad54da3\") " Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.562442 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0cfaf200-320b-4866-91d3-0a526ad54da3-combined-ca-bundle\") pod \"0cfaf200-320b-4866-91d3-0a526ad54da3\" (UID: \"0cfaf200-320b-4866-91d3-0a526ad54da3\") " Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.562640 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkm44\" (UniqueName: \"kubernetes.io/projected/0cfaf200-320b-4866-91d3-0a526ad54da3-kube-api-access-rkm44\") pod \"0cfaf200-320b-4866-91d3-0a526ad54da3\" (UID: \"0cfaf200-320b-4866-91d3-0a526ad54da3\") " Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.562933 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cfaf200-320b-4866-91d3-0a526ad54da3-logs" (OuterVolumeSpecName: "logs") pod "0cfaf200-320b-4866-91d3-0a526ad54da3" (UID: "0cfaf200-320b-4866-91d3-0a526ad54da3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.563155 4975 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cfaf200-320b-4866-91d3-0a526ad54da3-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.569733 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cfaf200-320b-4866-91d3-0a526ad54da3-kube-api-access-rkm44" (OuterVolumeSpecName: "kube-api-access-rkm44") pod "0cfaf200-320b-4866-91d3-0a526ad54da3" (UID: "0cfaf200-320b-4866-91d3-0a526ad54da3"). InnerVolumeSpecName "kube-api-access-rkm44". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.585883 4975 scope.go:117] "RemoveContainer" containerID="7ddb073d3c900d530fce42926de174d9036ac55e433112b07e6ea8022c4f02e0" Mar 18 12:36:12 crc kubenswrapper[4975]: E0318 12:36:12.588664 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ddb073d3c900d530fce42926de174d9036ac55e433112b07e6ea8022c4f02e0\": container with ID starting with 7ddb073d3c900d530fce42926de174d9036ac55e433112b07e6ea8022c4f02e0 not found: ID does not exist" containerID="7ddb073d3c900d530fce42926de174d9036ac55e433112b07e6ea8022c4f02e0" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.588717 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ddb073d3c900d530fce42926de174d9036ac55e433112b07e6ea8022c4f02e0"} err="failed to get container status \"7ddb073d3c900d530fce42926de174d9036ac55e433112b07e6ea8022c4f02e0\": rpc error: code = NotFound desc = could not find container \"7ddb073d3c900d530fce42926de174d9036ac55e433112b07e6ea8022c4f02e0\": container with ID starting with 7ddb073d3c900d530fce42926de174d9036ac55e433112b07e6ea8022c4f02e0 not found: ID does not exist" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.588746 4975 scope.go:117] "RemoveContainer" containerID="b77e810d3b113694a9cd9bdd17555e4af68a6499707f894c9f4a3501b252d961" Mar 18 12:36:12 crc kubenswrapper[4975]: E0318 12:36:12.589218 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b77e810d3b113694a9cd9bdd17555e4af68a6499707f894c9f4a3501b252d961\": container with ID starting with b77e810d3b113694a9cd9bdd17555e4af68a6499707f894c9f4a3501b252d961 not found: ID does not exist" containerID="b77e810d3b113694a9cd9bdd17555e4af68a6499707f894c9f4a3501b252d961" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.589261 
4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b77e810d3b113694a9cd9bdd17555e4af68a6499707f894c9f4a3501b252d961"} err="failed to get container status \"b77e810d3b113694a9cd9bdd17555e4af68a6499707f894c9f4a3501b252d961\": rpc error: code = NotFound desc = could not find container \"b77e810d3b113694a9cd9bdd17555e4af68a6499707f894c9f4a3501b252d961\": container with ID starting with b77e810d3b113694a9cd9bdd17555e4af68a6499707f894c9f4a3501b252d961 not found: ID does not exist" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.603359 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cfaf200-320b-4866-91d3-0a526ad54da3-config-data" (OuterVolumeSpecName: "config-data") pod "0cfaf200-320b-4866-91d3-0a526ad54da3" (UID: "0cfaf200-320b-4866-91d3-0a526ad54da3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.610514 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cfaf200-320b-4866-91d3-0a526ad54da3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cfaf200-320b-4866-91d3-0a526ad54da3" (UID: "0cfaf200-320b-4866-91d3-0a526ad54da3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.665115 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkm44\" (UniqueName: \"kubernetes.io/projected/0cfaf200-320b-4866-91d3-0a526ad54da3-kube-api-access-rkm44\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.665157 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cfaf200-320b-4866-91d3-0a526ad54da3-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.665166 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cfaf200-320b-4866-91d3-0a526ad54da3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.869905 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.883736 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.894972 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 12:36:12 crc kubenswrapper[4975]: E0318 12:36:12.895453 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cfaf200-320b-4866-91d3-0a526ad54da3" containerName="nova-api-api" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.895476 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cfaf200-320b-4866-91d3-0a526ad54da3" containerName="nova-api-api" Mar 18 12:36:12 crc kubenswrapper[4975]: E0318 12:36:12.895493 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cfaf200-320b-4866-91d3-0a526ad54da3" containerName="nova-api-log" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.895502 4975 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="0cfaf200-320b-4866-91d3-0a526ad54da3" containerName="nova-api-log" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.895715 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cfaf200-320b-4866-91d3-0a526ad54da3" containerName="nova-api-api" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.895737 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cfaf200-320b-4866-91d3-0a526ad54da3" containerName="nova-api-log" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.896903 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.900131 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.909156 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.969854 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf0f90d-1780-4665-833b-020aaafbaacd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bbf0f90d-1780-4665-833b-020aaafbaacd\") " pod="openstack/nova-api-0" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.970152 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ql7v\" (UniqueName: \"kubernetes.io/projected/bbf0f90d-1780-4665-833b-020aaafbaacd-kube-api-access-6ql7v\") pod \"nova-api-0\" (UID: \"bbf0f90d-1780-4665-833b-020aaafbaacd\") " pod="openstack/nova-api-0" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.970335 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf0f90d-1780-4665-833b-020aaafbaacd-config-data\") pod 
\"nova-api-0\" (UID: \"bbf0f90d-1780-4665-833b-020aaafbaacd\") " pod="openstack/nova-api-0" Mar 18 12:36:12 crc kubenswrapper[4975]: I0318 12:36:12.970600 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbf0f90d-1780-4665-833b-020aaafbaacd-logs\") pod \"nova-api-0\" (UID: \"bbf0f90d-1780-4665-833b-020aaafbaacd\") " pod="openstack/nova-api-0" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.029995 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cfaf200-320b-4866-91d3-0a526ad54da3" path="/var/lib/kubelet/pods/0cfaf200-320b-4866-91d3-0a526ad54da3/volumes" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.074022 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbf0f90d-1780-4665-833b-020aaafbaacd-logs\") pod \"nova-api-0\" (UID: \"bbf0f90d-1780-4665-833b-020aaafbaacd\") " pod="openstack/nova-api-0" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.074100 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf0f90d-1780-4665-833b-020aaafbaacd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bbf0f90d-1780-4665-833b-020aaafbaacd\") " pod="openstack/nova-api-0" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.074139 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ql7v\" (UniqueName: \"kubernetes.io/projected/bbf0f90d-1780-4665-833b-020aaafbaacd-kube-api-access-6ql7v\") pod \"nova-api-0\" (UID: \"bbf0f90d-1780-4665-833b-020aaafbaacd\") " pod="openstack/nova-api-0" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.074243 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf0f90d-1780-4665-833b-020aaafbaacd-config-data\") 
pod \"nova-api-0\" (UID: \"bbf0f90d-1780-4665-833b-020aaafbaacd\") " pod="openstack/nova-api-0" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.076398 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbf0f90d-1780-4665-833b-020aaafbaacd-logs\") pod \"nova-api-0\" (UID: \"bbf0f90d-1780-4665-833b-020aaafbaacd\") " pod="openstack/nova-api-0" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.082243 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf0f90d-1780-4665-833b-020aaafbaacd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bbf0f90d-1780-4665-833b-020aaafbaacd\") " pod="openstack/nova-api-0" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.083558 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf0f90d-1780-4665-833b-020aaafbaacd-config-data\") pod \"nova-api-0\" (UID: \"bbf0f90d-1780-4665-833b-020aaafbaacd\") " pod="openstack/nova-api-0" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.093957 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ql7v\" (UniqueName: \"kubernetes.io/projected/bbf0f90d-1780-4665-833b-020aaafbaacd-kube-api-access-6ql7v\") pod \"nova-api-0\" (UID: \"bbf0f90d-1780-4665-833b-020aaafbaacd\") " pod="openstack/nova-api-0" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.250380 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.336703 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.380413 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl92p\" (UniqueName: \"kubernetes.io/projected/91c2909b-ac35-4304-a0cb-571293d436e0-kube-api-access-bl92p\") pod \"91c2909b-ac35-4304-a0cb-571293d436e0\" (UID: \"91c2909b-ac35-4304-a0cb-571293d436e0\") " Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.380515 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91c2909b-ac35-4304-a0cb-571293d436e0-config-data\") pod \"91c2909b-ac35-4304-a0cb-571293d436e0\" (UID: \"91c2909b-ac35-4304-a0cb-571293d436e0\") " Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.380732 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91c2909b-ac35-4304-a0cb-571293d436e0-combined-ca-bundle\") pod \"91c2909b-ac35-4304-a0cb-571293d436e0\" (UID: \"91c2909b-ac35-4304-a0cb-571293d436e0\") " Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.387085 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91c2909b-ac35-4304-a0cb-571293d436e0-kube-api-access-bl92p" (OuterVolumeSpecName: "kube-api-access-bl92p") pod "91c2909b-ac35-4304-a0cb-571293d436e0" (UID: "91c2909b-ac35-4304-a0cb-571293d436e0"). InnerVolumeSpecName "kube-api-access-bl92p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.414703 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91c2909b-ac35-4304-a0cb-571293d436e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91c2909b-ac35-4304-a0cb-571293d436e0" (UID: "91c2909b-ac35-4304-a0cb-571293d436e0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.414987 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91c2909b-ac35-4304-a0cb-571293d436e0-config-data" (OuterVolumeSpecName: "config-data") pod "91c2909b-ac35-4304-a0cb-571293d436e0" (UID: "91c2909b-ac35-4304-a0cb-571293d436e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.483028 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl92p\" (UniqueName: \"kubernetes.io/projected/91c2909b-ac35-4304-a0cb-571293d436e0-kube-api-access-bl92p\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.483065 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91c2909b-ac35-4304-a0cb-571293d436e0-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.483076 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91c2909b-ac35-4304-a0cb-571293d436e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.545714 4975 generic.go:334] "Generic (PLEG): container finished" podID="91c2909b-ac35-4304-a0cb-571293d436e0" containerID="c416020b5a7783b2dc257e98e8a8c7bba8638d8b01679d9b17a2013331ec7ad2" exitCode=0 Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.545912 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"91c2909b-ac35-4304-a0cb-571293d436e0","Type":"ContainerDied","Data":"c416020b5a7783b2dc257e98e8a8c7bba8638d8b01679d9b17a2013331ec7ad2"} Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.545958 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"91c2909b-ac35-4304-a0cb-571293d436e0","Type":"ContainerDied","Data":"e8c9715915be5c9e20dd575ab7568ee1f4e5e14ab7d2970bb7783b01878011a8"} Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.545965 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.545986 4975 scope.go:117] "RemoveContainer" containerID="c416020b5a7783b2dc257e98e8a8c7bba8638d8b01679d9b17a2013331ec7ad2" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.584112 4975 scope.go:117] "RemoveContainer" containerID="c416020b5a7783b2dc257e98e8a8c7bba8638d8b01679d9b17a2013331ec7ad2" Mar 18 12:36:13 crc kubenswrapper[4975]: E0318 12:36:13.585360 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c416020b5a7783b2dc257e98e8a8c7bba8638d8b01679d9b17a2013331ec7ad2\": container with ID starting with c416020b5a7783b2dc257e98e8a8c7bba8638d8b01679d9b17a2013331ec7ad2 not found: ID does not exist" containerID="c416020b5a7783b2dc257e98e8a8c7bba8638d8b01679d9b17a2013331ec7ad2" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.585410 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c416020b5a7783b2dc257e98e8a8c7bba8638d8b01679d9b17a2013331ec7ad2"} err="failed to get container status \"c416020b5a7783b2dc257e98e8a8c7bba8638d8b01679d9b17a2013331ec7ad2\": rpc error: code = NotFound desc = could not find container \"c416020b5a7783b2dc257e98e8a8c7bba8638d8b01679d9b17a2013331ec7ad2\": container with ID starting with c416020b5a7783b2dc257e98e8a8c7bba8638d8b01679d9b17a2013331ec7ad2 not found: ID does not exist" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.614176 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.626595 4975 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.639305 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:36:13 crc kubenswrapper[4975]: E0318 12:36:13.639724 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c2909b-ac35-4304-a0cb-571293d436e0" containerName="nova-scheduler-scheduler" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.639736 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c2909b-ac35-4304-a0cb-571293d436e0" containerName="nova-scheduler-scheduler" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.639932 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="91c2909b-ac35-4304-a0cb-571293d436e0" containerName="nova-scheduler-scheduler" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.640558 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.643250 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.650041 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.686086 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jr9d\" (UniqueName: \"kubernetes.io/projected/1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d-kube-api-access-5jr9d\") pod \"nova-scheduler-0\" (UID: \"1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d\") " pod="openstack/nova-scheduler-0" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.686443 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d-config-data\") pod \"nova-scheduler-0\" (UID: \"1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d\") " pod="openstack/nova-scheduler-0" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.686789 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d\") " pod="openstack/nova-scheduler-0" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.788207 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d-config-data\") pod \"nova-scheduler-0\" (UID: \"1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d\") " pod="openstack/nova-scheduler-0" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.788340 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d\") " pod="openstack/nova-scheduler-0" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.788435 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jr9d\" (UniqueName: \"kubernetes.io/projected/1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d-kube-api-access-5jr9d\") pod \"nova-scheduler-0\" (UID: \"1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d\") " pod="openstack/nova-scheduler-0" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.797889 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d-config-data\") pod \"nova-scheduler-0\" (UID: \"1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d\") " 
pod="openstack/nova-scheduler-0" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.798352 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d\") " pod="openstack/nova-scheduler-0" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.813432 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jr9d\" (UniqueName: \"kubernetes.io/projected/1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d-kube-api-access-5jr9d\") pod \"nova-scheduler-0\" (UID: \"1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d\") " pod="openstack/nova-scheduler-0" Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.819300 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:36:13 crc kubenswrapper[4975]: W0318 12:36:13.822442 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbf0f90d_1780_4665_833b_020aaafbaacd.slice/crio-bff209782cf53492c1e3c6bd2672d2a412c94698989f923af6f1ca63a2a0b0ca WatchSource:0}: Error finding container bff209782cf53492c1e3c6bd2672d2a412c94698989f923af6f1ca63a2a0b0ca: Status 404 returned error can't find the container with id bff209782cf53492c1e3c6bd2672d2a412c94698989f923af6f1ca63a2a0b0ca Mar 18 12:36:13 crc kubenswrapper[4975]: I0318 12:36:13.968562 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 12:36:14 crc kubenswrapper[4975]: I0318 12:36:14.238494 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 12:36:14 crc kubenswrapper[4975]: I0318 12:36:14.494726 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:36:14 crc kubenswrapper[4975]: W0318 12:36:14.498934 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c9cc2f1_6af5_46fa_8d2b_9a73dd90c46d.slice/crio-bc57be8c0e2afef01d315330ad15cf21b7d2ba9202238b117de4a08c9bfb23e7 WatchSource:0}: Error finding container bc57be8c0e2afef01d315330ad15cf21b7d2ba9202238b117de4a08c9bfb23e7: Status 404 returned error can't find the container with id bc57be8c0e2afef01d315330ad15cf21b7d2ba9202238b117de4a08c9bfb23e7 Mar 18 12:36:14 crc kubenswrapper[4975]: I0318 12:36:14.556748 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d","Type":"ContainerStarted","Data":"bc57be8c0e2afef01d315330ad15cf21b7d2ba9202238b117de4a08c9bfb23e7"} Mar 18 12:36:14 crc kubenswrapper[4975]: I0318 12:36:14.562980 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bbf0f90d-1780-4665-833b-020aaafbaacd","Type":"ContainerStarted","Data":"ef72e9d61169a2aaad66521c7e0d13fb51421cabda508c0013d968c503766392"} Mar 18 12:36:14 crc kubenswrapper[4975]: I0318 12:36:14.563133 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bbf0f90d-1780-4665-833b-020aaafbaacd","Type":"ContainerStarted","Data":"4843ecb1e52f3efe39497b57d5d6637ae88ffd3b5070ca737bc2ed066dedfc48"} Mar 18 12:36:14 crc kubenswrapper[4975]: I0318 12:36:14.563154 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"bbf0f90d-1780-4665-833b-020aaafbaacd","Type":"ContainerStarted","Data":"bff209782cf53492c1e3c6bd2672d2a412c94698989f923af6f1ca63a2a0b0ca"} Mar 18 12:36:14 crc kubenswrapper[4975]: I0318 12:36:14.587230 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.5872068710000002 podStartE2EDuration="2.587206871s" podCreationTimestamp="2026-03-18 12:36:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:36:14.579778336 +0000 UTC m=+1560.294178925" watchObservedRunningTime="2026-03-18 12:36:14.587206871 +0000 UTC m=+1560.301607450" Mar 18 12:36:15 crc kubenswrapper[4975]: I0318 12:36:15.027650 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91c2909b-ac35-4304-a0cb-571293d436e0" path="/var/lib/kubelet/pods/91c2909b-ac35-4304-a0cb-571293d436e0/volumes" Mar 18 12:36:15 crc kubenswrapper[4975]: I0318 12:36:15.573976 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d","Type":"ContainerStarted","Data":"f38549335eef235d6b275539bdbfb2e1791de8d5bb1c3039dd77d107588cbd6d"} Mar 18 12:36:15 crc kubenswrapper[4975]: I0318 12:36:15.594045 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.594026901 podStartE2EDuration="2.594026901s" podCreationTimestamp="2026-03-18 12:36:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:36:15.592970422 +0000 UTC m=+1561.307371001" watchObservedRunningTime="2026-03-18 12:36:15.594026901 +0000 UTC m=+1561.308427480" Mar 18 12:36:17 crc kubenswrapper[4975]: I0318 12:36:17.781257 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 12:36:17 crc 
kubenswrapper[4975]: I0318 12:36:17.781510 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="1e74832f-4c39-440a-b1b7-d87c9916fc94" containerName="kube-state-metrics" containerID="cri-o://1c1030fbb596f29bc3231abc7eb7114a80dd1ef168931bd4854f91a4c9cf5a6b" gracePeriod=30 Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.282370 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.469077 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlrkn\" (UniqueName: \"kubernetes.io/projected/1e74832f-4c39-440a-b1b7-d87c9916fc94-kube-api-access-hlrkn\") pod \"1e74832f-4c39-440a-b1b7-d87c9916fc94\" (UID: \"1e74832f-4c39-440a-b1b7-d87c9916fc94\") " Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.474912 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e74832f-4c39-440a-b1b7-d87c9916fc94-kube-api-access-hlrkn" (OuterVolumeSpecName: "kube-api-access-hlrkn") pod "1e74832f-4c39-440a-b1b7-d87c9916fc94" (UID: "1e74832f-4c39-440a-b1b7-d87c9916fc94"). InnerVolumeSpecName "kube-api-access-hlrkn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.571550 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlrkn\" (UniqueName: \"kubernetes.io/projected/1e74832f-4c39-440a-b1b7-d87c9916fc94-kube-api-access-hlrkn\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.601462 4975 generic.go:334] "Generic (PLEG): container finished" podID="1e74832f-4c39-440a-b1b7-d87c9916fc94" containerID="1c1030fbb596f29bc3231abc7eb7114a80dd1ef168931bd4854f91a4c9cf5a6b" exitCode=2 Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.601504 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.601513 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1e74832f-4c39-440a-b1b7-d87c9916fc94","Type":"ContainerDied","Data":"1c1030fbb596f29bc3231abc7eb7114a80dd1ef168931bd4854f91a4c9cf5a6b"} Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.601543 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1e74832f-4c39-440a-b1b7-d87c9916fc94","Type":"ContainerDied","Data":"3d7975d9fbcc401f22a0d0e8599d4c1696617cca354b768070ff49ec880caaf1"} Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.601563 4975 scope.go:117] "RemoveContainer" containerID="1c1030fbb596f29bc3231abc7eb7114a80dd1ef168931bd4854f91a4c9cf5a6b" Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.625360 4975 scope.go:117] "RemoveContainer" containerID="1c1030fbb596f29bc3231abc7eb7114a80dd1ef168931bd4854f91a4c9cf5a6b" Mar 18 12:36:18 crc kubenswrapper[4975]: E0318 12:36:18.625960 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1c1030fbb596f29bc3231abc7eb7114a80dd1ef168931bd4854f91a4c9cf5a6b\": container with ID starting with 1c1030fbb596f29bc3231abc7eb7114a80dd1ef168931bd4854f91a4c9cf5a6b not found: ID does not exist" containerID="1c1030fbb596f29bc3231abc7eb7114a80dd1ef168931bd4854f91a4c9cf5a6b" Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.626017 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c1030fbb596f29bc3231abc7eb7114a80dd1ef168931bd4854f91a4c9cf5a6b"} err="failed to get container status \"1c1030fbb596f29bc3231abc7eb7114a80dd1ef168931bd4854f91a4c9cf5a6b\": rpc error: code = NotFound desc = could not find container \"1c1030fbb596f29bc3231abc7eb7114a80dd1ef168931bd4854f91a4c9cf5a6b\": container with ID starting with 1c1030fbb596f29bc3231abc7eb7114a80dd1ef168931bd4854f91a4c9cf5a6b not found: ID does not exist" Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.640675 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.655789 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.669886 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 12:36:18 crc kubenswrapper[4975]: E0318 12:36:18.670440 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e74832f-4c39-440a-b1b7-d87c9916fc94" containerName="kube-state-metrics" Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.670464 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e74832f-4c39-440a-b1b7-d87c9916fc94" containerName="kube-state-metrics" Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.670667 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e74832f-4c39-440a-b1b7-d87c9916fc94" containerName="kube-state-metrics" Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 
12:36:18.671434 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.673156 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6eeb6017-b985-4f5c-ace2-b24c9ad25510-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6eeb6017-b985-4f5c-ace2-b24c9ad25510\") " pod="openstack/kube-state-metrics-0" Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.673219 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6eeb6017-b985-4f5c-ace2-b24c9ad25510-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6eeb6017-b985-4f5c-ace2-b24c9ad25510\") " pod="openstack/kube-state-metrics-0" Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.673313 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls75w\" (UniqueName: \"kubernetes.io/projected/6eeb6017-b985-4f5c-ace2-b24c9ad25510-kube-api-access-ls75w\") pod \"kube-state-metrics-0\" (UID: \"6eeb6017-b985-4f5c-ace2-b24c9ad25510\") " pod="openstack/kube-state-metrics-0" Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.673345 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eeb6017-b985-4f5c-ace2-b24c9ad25510-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6eeb6017-b985-4f5c-ace2-b24c9ad25510\") " pod="openstack/kube-state-metrics-0" Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.678641 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 
12:36:18.678891 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.698750 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.775300 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6eeb6017-b985-4f5c-ace2-b24c9ad25510-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6eeb6017-b985-4f5c-ace2-b24c9ad25510\") " pod="openstack/kube-state-metrics-0" Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.775399 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6eeb6017-b985-4f5c-ace2-b24c9ad25510-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6eeb6017-b985-4f5c-ace2-b24c9ad25510\") " pod="openstack/kube-state-metrics-0" Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.775514 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls75w\" (UniqueName: \"kubernetes.io/projected/6eeb6017-b985-4f5c-ace2-b24c9ad25510-kube-api-access-ls75w\") pod \"kube-state-metrics-0\" (UID: \"6eeb6017-b985-4f5c-ace2-b24c9ad25510\") " pod="openstack/kube-state-metrics-0" Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.775551 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eeb6017-b985-4f5c-ace2-b24c9ad25510-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6eeb6017-b985-4f5c-ace2-b24c9ad25510\") " pod="openstack/kube-state-metrics-0" Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.782430 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eeb6017-b985-4f5c-ace2-b24c9ad25510-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6eeb6017-b985-4f5c-ace2-b24c9ad25510\") " pod="openstack/kube-state-metrics-0" Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.783784 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6eeb6017-b985-4f5c-ace2-b24c9ad25510-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6eeb6017-b985-4f5c-ace2-b24c9ad25510\") " pod="openstack/kube-state-metrics-0" Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.784785 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6eeb6017-b985-4f5c-ace2-b24c9ad25510-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6eeb6017-b985-4f5c-ace2-b24c9ad25510\") " pod="openstack/kube-state-metrics-0" Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.810572 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls75w\" (UniqueName: \"kubernetes.io/projected/6eeb6017-b985-4f5c-ace2-b24c9ad25510-kube-api-access-ls75w\") pod \"kube-state-metrics-0\" (UID: \"6eeb6017-b985-4f5c-ace2-b24c9ad25510\") " pod="openstack/kube-state-metrics-0" Mar 18 12:36:18 crc kubenswrapper[4975]: I0318 12:36:18.968981 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 12:36:19 crc kubenswrapper[4975]: I0318 12:36:19.000663 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 12:36:19 crc kubenswrapper[4975]: I0318 12:36:19.027465 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e74832f-4c39-440a-b1b7-d87c9916fc94" path="/var/lib/kubelet/pods/1e74832f-4c39-440a-b1b7-d87c9916fc94/volumes" Mar 18 12:36:19 crc kubenswrapper[4975]: I0318 12:36:19.437038 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 12:36:19 crc kubenswrapper[4975]: W0318 12:36:19.442106 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6eeb6017_b985_4f5c_ace2_b24c9ad25510.slice/crio-e1972766c0a0090100ef62d62d63fed5f9025c30ef509c0dc8f4d1eda9ba676f WatchSource:0}: Error finding container e1972766c0a0090100ef62d62d63fed5f9025c30ef509c0dc8f4d1eda9ba676f: Status 404 returned error can't find the container with id e1972766c0a0090100ef62d62d63fed5f9025c30ef509c0dc8f4d1eda9ba676f Mar 18 12:36:19 crc kubenswrapper[4975]: I0318 12:36:19.444663 4975 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 12:36:19 crc kubenswrapper[4975]: I0318 12:36:19.612305 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6eeb6017-b985-4f5c-ace2-b24c9ad25510","Type":"ContainerStarted","Data":"e1972766c0a0090100ef62d62d63fed5f9025c30ef509c0dc8f4d1eda9ba676f"} Mar 18 12:36:19 crc kubenswrapper[4975]: I0318 12:36:19.654545 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:36:19 crc kubenswrapper[4975]: I0318 12:36:19.654825 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" containerName="ceilometer-central-agent" containerID="cri-o://1d32409bb252e66135d85d9d3d736cbbd7365f4130d69183d7980941d8f1a808" gracePeriod=30 Mar 18 12:36:19 
crc kubenswrapper[4975]: I0318 12:36:19.654947 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" containerName="sg-core" containerID="cri-o://4b4773bb8da1584587ad8d02c86557a00680cf7450556e60d81a2651d867f63c" gracePeriod=30 Mar 18 12:36:19 crc kubenswrapper[4975]: I0318 12:36:19.654990 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" containerName="proxy-httpd" containerID="cri-o://9c9c27358654152e920831ce36019dd4442ef7b156e35d1397945b014ca71ed2" gracePeriod=30 Mar 18 12:36:19 crc kubenswrapper[4975]: I0318 12:36:19.654952 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" containerName="ceilometer-notification-agent" containerID="cri-o://d554b1376eaa787c52f9cfd715e1e7d55db9cd1ef6d746bd598887ad9279dc9b" gracePeriod=30 Mar 18 12:36:19 crc kubenswrapper[4975]: I0318 12:36:19.971655 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 18 12:36:20 crc kubenswrapper[4975]: I0318 12:36:20.626970 4975 generic.go:334] "Generic (PLEG): container finished" podID="fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" containerID="9c9c27358654152e920831ce36019dd4442ef7b156e35d1397945b014ca71ed2" exitCode=0 Mar 18 12:36:20 crc kubenswrapper[4975]: I0318 12:36:20.627003 4975 generic.go:334] "Generic (PLEG): container finished" podID="fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" containerID="4b4773bb8da1584587ad8d02c86557a00680cf7450556e60d81a2651d867f63c" exitCode=2 Mar 18 12:36:20 crc kubenswrapper[4975]: I0318 12:36:20.627010 4975 generic.go:334] "Generic (PLEG): container finished" podID="fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" containerID="1d32409bb252e66135d85d9d3d736cbbd7365f4130d69183d7980941d8f1a808" exitCode=0 Mar 18 12:36:20 crc 
kubenswrapper[4975]: I0318 12:36:20.627039 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e","Type":"ContainerDied","Data":"9c9c27358654152e920831ce36019dd4442ef7b156e35d1397945b014ca71ed2"} Mar 18 12:36:20 crc kubenswrapper[4975]: I0318 12:36:20.627094 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e","Type":"ContainerDied","Data":"4b4773bb8da1584587ad8d02c86557a00680cf7450556e60d81a2651d867f63c"} Mar 18 12:36:20 crc kubenswrapper[4975]: I0318 12:36:20.627111 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e","Type":"ContainerDied","Data":"1d32409bb252e66135d85d9d3d736cbbd7365f4130d69183d7980941d8f1a808"} Mar 18 12:36:20 crc kubenswrapper[4975]: I0318 12:36:20.628730 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6eeb6017-b985-4f5c-ace2-b24c9ad25510","Type":"ContainerStarted","Data":"2251f2a25f88c12cdb7b08c8cb48c61a17e53fe3b1b1a68c6e02c662b5c1d1ed"} Mar 18 12:36:20 crc kubenswrapper[4975]: I0318 12:36:20.628922 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 18 12:36:20 crc kubenswrapper[4975]: I0318 12:36:20.652917 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.295621583 podStartE2EDuration="2.652896433s" podCreationTimestamp="2026-03-18 12:36:18 +0000 UTC" firstStartedPulling="2026-03-18 12:36:19.444374848 +0000 UTC m=+1565.158775427" lastFinishedPulling="2026-03-18 12:36:19.801649698 +0000 UTC m=+1565.516050277" observedRunningTime="2026-03-18 12:36:20.64658573 +0000 UTC m=+1566.360986349" watchObservedRunningTime="2026-03-18 12:36:20.652896433 +0000 UTC m=+1566.367297022" Mar 18 12:36:20 crc 
kubenswrapper[4975]: I0318 12:36:20.899339 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 12:36:20 crc kubenswrapper[4975]: I0318 12:36:20.899393 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 12:36:21 crc kubenswrapper[4975]: I0318 12:36:21.917158 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="790e3570-2044-4dea-b894-5222e6d3d2e9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 12:36:21 crc kubenswrapper[4975]: I0318 12:36:21.917169 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="790e3570-2044-4dea-b894-5222e6d3d2e9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 12:36:23 crc kubenswrapper[4975]: I0318 12:36:23.251182 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 12:36:23 crc kubenswrapper[4975]: I0318 12:36:23.251620 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 12:36:23 crc kubenswrapper[4975]: I0318 12:36:23.676976 4975 generic.go:334] "Generic (PLEG): container finished" podID="fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" containerID="d554b1376eaa787c52f9cfd715e1e7d55db9cd1ef6d746bd598887ad9279dc9b" exitCode=0 Mar 18 12:36:23 crc kubenswrapper[4975]: I0318 12:36:23.677027 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e","Type":"ContainerDied","Data":"d554b1376eaa787c52f9cfd715e1e7d55db9cd1ef6d746bd598887ad9279dc9b"} Mar 18 12:36:23 crc kubenswrapper[4975]: 
I0318 12:36:23.969305 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 12:36:23 crc kubenswrapper[4975]: I0318 12:36:23.972827 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.028919 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.111284 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-run-httpd\") pod \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\" (UID: \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.111752 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-config-data\") pod \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\" (UID: \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.111931 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-sg-core-conf-yaml\") pod \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\" (UID: \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.112073 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-log-httpd\") pod \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\" (UID: \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.111965 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" (UID: "fb5fde15-fc46-4eb2-a28a-e0c5ebae192e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.112397 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-combined-ca-bundle\") pod \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\" (UID: \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.112596 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-scripts\") pod \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\" (UID: \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.112798 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnfnz\" (UniqueName: \"kubernetes.io/projected/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-kube-api-access-nnfnz\") pod \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\" (UID: \"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e\") " Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.112437 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" (UID: "fb5fde15-fc46-4eb2-a28a-e0c5ebae192e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.113680 4975 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.113785 4975 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.125260 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-kube-api-access-nnfnz" (OuterVolumeSpecName: "kube-api-access-nnfnz") pod "fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" (UID: "fb5fde15-fc46-4eb2-a28a-e0c5ebae192e"). InnerVolumeSpecName "kube-api-access-nnfnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.125485 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-scripts" (OuterVolumeSpecName: "scripts") pod "fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" (UID: "fb5fde15-fc46-4eb2-a28a-e0c5ebae192e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.164178 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" (UID: "fb5fde15-fc46-4eb2-a28a-e0c5ebae192e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.199130 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" (UID: "fb5fde15-fc46-4eb2-a28a-e0c5ebae192e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.215961 4975 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.215998 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.216012 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.216023 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnfnz\" (UniqueName: \"kubernetes.io/projected/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-kube-api-access-nnfnz\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.230883 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-config-data" (OuterVolumeSpecName: "config-data") pod "fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" (UID: "fb5fde15-fc46-4eb2-a28a-e0c5ebae192e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.318473 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.335049 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bbf0f90d-1780-4665-833b-020aaafbaacd" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.205:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.335085 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bbf0f90d-1780-4665-833b-020aaafbaacd" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.205:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.692987 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb5fde15-fc46-4eb2-a28a-e0c5ebae192e","Type":"ContainerDied","Data":"d4c2904fce68647bd81972977093587aa63fbd9c56e66a37c6156da842fc3295"} Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.693356 4975 scope.go:117] "RemoveContainer" containerID="9c9c27358654152e920831ce36019dd4442ef7b156e35d1397945b014ca71ed2" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.693037 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.723264 4975 scope.go:117] "RemoveContainer" containerID="4b4773bb8da1584587ad8d02c86557a00680cf7450556e60d81a2651d867f63c" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.741021 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.749206 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.757500 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.757530 4975 scope.go:117] "RemoveContainer" containerID="d554b1376eaa787c52f9cfd715e1e7d55db9cd1ef6d746bd598887ad9279dc9b" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.787052 4975 scope.go:117] "RemoveContainer" containerID="1d32409bb252e66135d85d9d3d736cbbd7365f4130d69183d7980941d8f1a808" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.790811 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:36:24 crc kubenswrapper[4975]: E0318 12:36:24.793332 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" containerName="proxy-httpd" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.793361 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" containerName="proxy-httpd" Mar 18 12:36:24 crc kubenswrapper[4975]: E0318 12:36:24.793380 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" containerName="sg-core" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.793388 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" containerName="sg-core" Mar 18 12:36:24 crc 
kubenswrapper[4975]: E0318 12:36:24.793408 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" containerName="ceilometer-central-agent" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.793414 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" containerName="ceilometer-central-agent" Mar 18 12:36:24 crc kubenswrapper[4975]: E0318 12:36:24.793427 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" containerName="ceilometer-notification-agent" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.793432 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" containerName="ceilometer-notification-agent" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.793657 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" containerName="ceilometer-central-agent" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.793672 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" containerName="ceilometer-notification-agent" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.793684 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" containerName="sg-core" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.793695 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" containerName="proxy-httpd" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.796046 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.798587 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.799210 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.799471 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.804534 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.836967 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2tf7\" (UniqueName: \"kubernetes.io/projected/d8862e7b-f638-466a-a1b1-8fa24fef7309-kube-api-access-f2tf7\") pod \"ceilometer-0\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " pod="openstack/ceilometer-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.837292 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " pod="openstack/ceilometer-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.837394 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-scripts\") pod \"ceilometer-0\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " pod="openstack/ceilometer-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.837493 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8862e7b-f638-466a-a1b1-8fa24fef7309-log-httpd\") pod \"ceilometer-0\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " pod="openstack/ceilometer-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.837564 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-config-data\") pod \"ceilometer-0\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " pod="openstack/ceilometer-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.837621 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " pod="openstack/ceilometer-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.837697 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " pod="openstack/ceilometer-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.837769 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8862e7b-f638-466a-a1b1-8fa24fef7309-run-httpd\") pod \"ceilometer-0\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " pod="openstack/ceilometer-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.940041 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2tf7\" (UniqueName: \"kubernetes.io/projected/d8862e7b-f638-466a-a1b1-8fa24fef7309-kube-api-access-f2tf7\") pod \"ceilometer-0\" (UID: 
\"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " pod="openstack/ceilometer-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.940109 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " pod="openstack/ceilometer-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.940162 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-scripts\") pod \"ceilometer-0\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " pod="openstack/ceilometer-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.940219 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8862e7b-f638-466a-a1b1-8fa24fef7309-log-httpd\") pod \"ceilometer-0\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " pod="openstack/ceilometer-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.940294 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-config-data\") pod \"ceilometer-0\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " pod="openstack/ceilometer-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.940333 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " pod="openstack/ceilometer-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.940376 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " pod="openstack/ceilometer-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.940429 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8862e7b-f638-466a-a1b1-8fa24fef7309-run-httpd\") pod \"ceilometer-0\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " pod="openstack/ceilometer-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.940843 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8862e7b-f638-466a-a1b1-8fa24fef7309-log-httpd\") pod \"ceilometer-0\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " pod="openstack/ceilometer-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.941606 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8862e7b-f638-466a-a1b1-8fa24fef7309-run-httpd\") pod \"ceilometer-0\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " pod="openstack/ceilometer-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.946125 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " pod="openstack/ceilometer-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.946320 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-scripts\") pod \"ceilometer-0\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " pod="openstack/ceilometer-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.947487 4975 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-config-data\") pod \"ceilometer-0\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " pod="openstack/ceilometer-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.947501 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " pod="openstack/ceilometer-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.948473 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " pod="openstack/ceilometer-0" Mar 18 12:36:24 crc kubenswrapper[4975]: I0318 12:36:24.963553 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2tf7\" (UniqueName: \"kubernetes.io/projected/d8862e7b-f638-466a-a1b1-8fa24fef7309-kube-api-access-f2tf7\") pod \"ceilometer-0\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " pod="openstack/ceilometer-0" Mar 18 12:36:25 crc kubenswrapper[4975]: I0318 12:36:25.032333 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb5fde15-fc46-4eb2-a28a-e0c5ebae192e" path="/var/lib/kubelet/pods/fb5fde15-fc46-4eb2-a28a-e0c5ebae192e/volumes" Mar 18 12:36:25 crc kubenswrapper[4975]: I0318 12:36:25.137840 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:36:25 crc kubenswrapper[4975]: I0318 12:36:25.539030 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:36:25 crc kubenswrapper[4975]: I0318 12:36:25.539082 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:36:25 crc kubenswrapper[4975]: I0318 12:36:25.539122 4975 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 12:36:25 crc kubenswrapper[4975]: I0318 12:36:25.539689 4975 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff"} pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:36:25 crc kubenswrapper[4975]: I0318 12:36:25.539739 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" containerID="cri-o://27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" gracePeriod=600 Mar 18 12:36:25 crc kubenswrapper[4975]: I0318 12:36:25.640184 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:36:25 
crc kubenswrapper[4975]: W0318 12:36:25.645132 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8862e7b_f638_466a_a1b1_8fa24fef7309.slice/crio-d820875e1db7b16b5486889be1ccc0b111294041fb4f50ec3290720568590b33 WatchSource:0}: Error finding container d820875e1db7b16b5486889be1ccc0b111294041fb4f50ec3290720568590b33: Status 404 returned error can't find the container with id d820875e1db7b16b5486889be1ccc0b111294041fb4f50ec3290720568590b33 Mar 18 12:36:25 crc kubenswrapper[4975]: E0318 12:36:25.663707 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:36:25 crc kubenswrapper[4975]: I0318 12:36:25.731748 4975 generic.go:334] "Generic (PLEG): container finished" podID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerID="27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" exitCode=0 Mar 18 12:36:25 crc kubenswrapper[4975]: I0318 12:36:25.731814 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerDied","Data":"27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff"} Mar 18 12:36:25 crc kubenswrapper[4975]: I0318 12:36:25.731914 4975 scope.go:117] "RemoveContainer" containerID="d846cb3e61bc67fa3212660cebeebcacd3a57cbf2e5bcba7bd344d98d42cef45" Mar 18 12:36:25 crc kubenswrapper[4975]: I0318 12:36:25.732698 4975 scope.go:117] "RemoveContainer" containerID="27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" Mar 18 12:36:25 crc 
kubenswrapper[4975]: E0318 12:36:25.733157 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:36:25 crc kubenswrapper[4975]: I0318 12:36:25.738752 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8862e7b-f638-466a-a1b1-8fa24fef7309","Type":"ContainerStarted","Data":"d820875e1db7b16b5486889be1ccc0b111294041fb4f50ec3290720568590b33"} Mar 18 12:36:26 crc kubenswrapper[4975]: I0318 12:36:26.749744 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8862e7b-f638-466a-a1b1-8fa24fef7309","Type":"ContainerStarted","Data":"a2ec6b75355cff0b12414d356a935a6d88d3be020d844ed1a3a688b4d579039e"} Mar 18 12:36:27 crc kubenswrapper[4975]: I0318 12:36:27.767521 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8862e7b-f638-466a-a1b1-8fa24fef7309","Type":"ContainerStarted","Data":"fa27b19dfc9be706f4c8776b4350a5923206f6870cdb7ddd91f8af3798c0f0e0"} Mar 18 12:36:28 crc kubenswrapper[4975]: I0318 12:36:28.778280 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8862e7b-f638-466a-a1b1-8fa24fef7309","Type":"ContainerStarted","Data":"124d2e2517104363e8f9ab34b0b1d98cd22494286dbe7daa1c67c6b3ec0ba82f"} Mar 18 12:36:28 crc kubenswrapper[4975]: I0318 12:36:28.899154 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 12:36:28 crc kubenswrapper[4975]: I0318 12:36:28.899231 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" 
Mar 18 12:36:29 crc kubenswrapper[4975]: I0318 12:36:29.011700 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 12:36:30 crc kubenswrapper[4975]: I0318 12:36:30.801331 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8862e7b-f638-466a-a1b1-8fa24fef7309","Type":"ContainerStarted","Data":"e8ba2729244676f1858374ecbed1b560718997ed1c67174df2906da1583c0d87"} Mar 18 12:36:30 crc kubenswrapper[4975]: I0318 12:36:30.801789 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 12:36:30 crc kubenswrapper[4975]: I0318 12:36:30.904731 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 12:36:30 crc kubenswrapper[4975]: I0318 12:36:30.908782 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 12:36:30 crc kubenswrapper[4975]: I0318 12:36:30.910370 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 12:36:30 crc kubenswrapper[4975]: I0318 12:36:30.924053 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.755912243 podStartE2EDuration="6.924034472s" podCreationTimestamp="2026-03-18 12:36:24 +0000 UTC" firstStartedPulling="2026-03-18 12:36:25.648529413 +0000 UTC m=+1571.362929992" lastFinishedPulling="2026-03-18 12:36:29.816651642 +0000 UTC m=+1575.531052221" observedRunningTime="2026-03-18 12:36:30.828302035 +0000 UTC m=+1576.542702614" watchObservedRunningTime="2026-03-18 12:36:30.924034472 +0000 UTC m=+1576.638435051" Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.251372 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.251425 4975 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.664449 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.783804 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/191e985b-8564-4dc3-b05d-c6e4fef796a8-config-data\") pod \"191e985b-8564-4dc3-b05d-c6e4fef796a8\" (UID: \"191e985b-8564-4dc3-b05d-c6e4fef796a8\") " Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.783850 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttcs5\" (UniqueName: \"kubernetes.io/projected/191e985b-8564-4dc3-b05d-c6e4fef796a8-kube-api-access-ttcs5\") pod \"191e985b-8564-4dc3-b05d-c6e4fef796a8\" (UID: \"191e985b-8564-4dc3-b05d-c6e4fef796a8\") " Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.783911 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/191e985b-8564-4dc3-b05d-c6e4fef796a8-combined-ca-bundle\") pod \"191e985b-8564-4dc3-b05d-c6e4fef796a8\" (UID: \"191e985b-8564-4dc3-b05d-c6e4fef796a8\") " Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.797793 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/191e985b-8564-4dc3-b05d-c6e4fef796a8-kube-api-access-ttcs5" (OuterVolumeSpecName: "kube-api-access-ttcs5") pod "191e985b-8564-4dc3-b05d-c6e4fef796a8" (UID: "191e985b-8564-4dc3-b05d-c6e4fef796a8"). InnerVolumeSpecName "kube-api-access-ttcs5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.813772 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/191e985b-8564-4dc3-b05d-c6e4fef796a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "191e985b-8564-4dc3-b05d-c6e4fef796a8" (UID: "191e985b-8564-4dc3-b05d-c6e4fef796a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.815429 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/191e985b-8564-4dc3-b05d-c6e4fef796a8-config-data" (OuterVolumeSpecName: "config-data") pod "191e985b-8564-4dc3-b05d-c6e4fef796a8" (UID: "191e985b-8564-4dc3-b05d-c6e4fef796a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.821894 4975 generic.go:334] "Generic (PLEG): container finished" podID="191e985b-8564-4dc3-b05d-c6e4fef796a8" containerID="1d1d6372172db9d73ca66a3f56330445f89ed334db52d142c62b65c78a7e7a11" exitCode=137 Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.822923 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"191e985b-8564-4dc3-b05d-c6e4fef796a8","Type":"ContainerDied","Data":"1d1d6372172db9d73ca66a3f56330445f89ed334db52d142c62b65c78a7e7a11"} Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.822981 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"191e985b-8564-4dc3-b05d-c6e4fef796a8","Type":"ContainerDied","Data":"54f401288b23a7c761a65a6d72f89dc093dd41d565aaa0e58c50933397792d23"} Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.823000 4975 scope.go:117] "RemoveContainer" containerID="1d1d6372172db9d73ca66a3f56330445f89ed334db52d142c62b65c78a7e7a11" Mar 18 12:36:31 
crc kubenswrapper[4975]: I0318 12:36:31.823187 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.829166 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.886434 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/191e985b-8564-4dc3-b05d-c6e4fef796a8-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.886598 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttcs5\" (UniqueName: \"kubernetes.io/projected/191e985b-8564-4dc3-b05d-c6e4fef796a8-kube-api-access-ttcs5\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.886694 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/191e985b-8564-4dc3-b05d-c6e4fef796a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.943283 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.961216 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.961455 4975 scope.go:117] "RemoveContainer" containerID="1d1d6372172db9d73ca66a3f56330445f89ed334db52d142c62b65c78a7e7a11" Mar 18 12:36:31 crc kubenswrapper[4975]: E0318 12:36:31.963768 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d1d6372172db9d73ca66a3f56330445f89ed334db52d142c62b65c78a7e7a11\": container with ID starting with 
1d1d6372172db9d73ca66a3f56330445f89ed334db52d142c62b65c78a7e7a11 not found: ID does not exist" containerID="1d1d6372172db9d73ca66a3f56330445f89ed334db52d142c62b65c78a7e7a11" Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.963826 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1d6372172db9d73ca66a3f56330445f89ed334db52d142c62b65c78a7e7a11"} err="failed to get container status \"1d1d6372172db9d73ca66a3f56330445f89ed334db52d142c62b65c78a7e7a11\": rpc error: code = NotFound desc = could not find container \"1d1d6372172db9d73ca66a3f56330445f89ed334db52d142c62b65c78a7e7a11\": container with ID starting with 1d1d6372172db9d73ca66a3f56330445f89ed334db52d142c62b65c78a7e7a11 not found: ID does not exist" Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.974171 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:36:31 crc kubenswrapper[4975]: E0318 12:36:31.974717 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="191e985b-8564-4dc3-b05d-c6e4fef796a8" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.974735 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="191e985b-8564-4dc3-b05d-c6e4fef796a8" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.974940 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="191e985b-8564-4dc3-b05d-c6e4fef796a8" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.975489 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.978703 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.978729 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.979493 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 18 12:36:31 crc kubenswrapper[4975]: I0318 12:36:31.986720 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:36:32 crc kubenswrapper[4975]: I0318 12:36:32.090806 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5c52261-89b7-4457-a0b6-4380a99ffa2a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f5c52261-89b7-4457-a0b6-4380a99ffa2a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:36:32 crc kubenswrapper[4975]: I0318 12:36:32.090896 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5c52261-89b7-4457-a0b6-4380a99ffa2a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f5c52261-89b7-4457-a0b6-4380a99ffa2a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:36:32 crc kubenswrapper[4975]: I0318 12:36:32.090926 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5c52261-89b7-4457-a0b6-4380a99ffa2a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f5c52261-89b7-4457-a0b6-4380a99ffa2a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 
12:36:32 crc kubenswrapper[4975]: I0318 12:36:32.090997 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62jqn\" (UniqueName: \"kubernetes.io/projected/f5c52261-89b7-4457-a0b6-4380a99ffa2a-kube-api-access-62jqn\") pod \"nova-cell1-novncproxy-0\" (UID: \"f5c52261-89b7-4457-a0b6-4380a99ffa2a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:36:32 crc kubenswrapper[4975]: I0318 12:36:32.091460 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5c52261-89b7-4457-a0b6-4380a99ffa2a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f5c52261-89b7-4457-a0b6-4380a99ffa2a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:36:32 crc kubenswrapper[4975]: I0318 12:36:32.192968 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5c52261-89b7-4457-a0b6-4380a99ffa2a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f5c52261-89b7-4457-a0b6-4380a99ffa2a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:36:32 crc kubenswrapper[4975]: I0318 12:36:32.193097 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5c52261-89b7-4457-a0b6-4380a99ffa2a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f5c52261-89b7-4457-a0b6-4380a99ffa2a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:36:32 crc kubenswrapper[4975]: I0318 12:36:32.193124 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5c52261-89b7-4457-a0b6-4380a99ffa2a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f5c52261-89b7-4457-a0b6-4380a99ffa2a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:36:32 crc kubenswrapper[4975]: I0318 
12:36:32.193163 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5c52261-89b7-4457-a0b6-4380a99ffa2a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f5c52261-89b7-4457-a0b6-4380a99ffa2a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:36:32 crc kubenswrapper[4975]: I0318 12:36:32.193201 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62jqn\" (UniqueName: \"kubernetes.io/projected/f5c52261-89b7-4457-a0b6-4380a99ffa2a-kube-api-access-62jqn\") pod \"nova-cell1-novncproxy-0\" (UID: \"f5c52261-89b7-4457-a0b6-4380a99ffa2a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:36:32 crc kubenswrapper[4975]: I0318 12:36:32.196921 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5c52261-89b7-4457-a0b6-4380a99ffa2a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f5c52261-89b7-4457-a0b6-4380a99ffa2a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:36:32 crc kubenswrapper[4975]: I0318 12:36:32.201355 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5c52261-89b7-4457-a0b6-4380a99ffa2a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f5c52261-89b7-4457-a0b6-4380a99ffa2a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:36:32 crc kubenswrapper[4975]: I0318 12:36:32.205516 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5c52261-89b7-4457-a0b6-4380a99ffa2a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f5c52261-89b7-4457-a0b6-4380a99ffa2a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:36:32 crc kubenswrapper[4975]: I0318 12:36:32.208642 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5c52261-89b7-4457-a0b6-4380a99ffa2a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f5c52261-89b7-4457-a0b6-4380a99ffa2a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:36:32 crc kubenswrapper[4975]: I0318 12:36:32.212541 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62jqn\" (UniqueName: \"kubernetes.io/projected/f5c52261-89b7-4457-a0b6-4380a99ffa2a-kube-api-access-62jqn\") pod \"nova-cell1-novncproxy-0\" (UID: \"f5c52261-89b7-4457-a0b6-4380a99ffa2a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:36:32 crc kubenswrapper[4975]: I0318 12:36:32.304501 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:36:33 crc kubenswrapper[4975]: I0318 12:36:33.037196 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="191e985b-8564-4dc3-b05d-c6e4fef796a8" path="/var/lib/kubelet/pods/191e985b-8564-4dc3-b05d-c6e4fef796a8/volumes" Mar 18 12:36:33 crc kubenswrapper[4975]: I0318 12:36:33.255441 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 12:36:33 crc kubenswrapper[4975]: I0318 12:36:33.257542 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 12:36:33 crc kubenswrapper[4975]: I0318 12:36:33.267435 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 12:36:33 crc kubenswrapper[4975]: I0318 12:36:33.370555 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 12:36:33 crc kubenswrapper[4975]: W0318 12:36:33.379877 4975 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5c52261_89b7_4457_a0b6_4380a99ffa2a.slice/crio-8bbec4f62ed713cea53545059ad621483058b091fb1058845e05a55b0ac74c63 WatchSource:0}: Error finding container 8bbec4f62ed713cea53545059ad621483058b091fb1058845e05a55b0ac74c63: Status 404 returned error can't find the container with id 8bbec4f62ed713cea53545059ad621483058b091fb1058845e05a55b0ac74c63 Mar 18 12:36:33 crc kubenswrapper[4975]: I0318 12:36:33.866305 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f5c52261-89b7-4457-a0b6-4380a99ffa2a","Type":"ContainerStarted","Data":"782bee6e1a702c9ffe8723c670b43205bf7298f341e5b4c93aa133dbb1e02a3a"} Mar 18 12:36:33 crc kubenswrapper[4975]: I0318 12:36:33.866636 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f5c52261-89b7-4457-a0b6-4380a99ffa2a","Type":"ContainerStarted","Data":"8bbec4f62ed713cea53545059ad621483058b091fb1058845e05a55b0ac74c63"} Mar 18 12:36:33 crc kubenswrapper[4975]: I0318 12:36:33.873781 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 12:36:33 crc kubenswrapper[4975]: I0318 12:36:33.891118 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.89109363 podStartE2EDuration="2.89109363s" podCreationTimestamp="2026-03-18 12:36:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:36:33.88673453 +0000 UTC m=+1579.601135119" watchObservedRunningTime="2026-03-18 12:36:33.89109363 +0000 UTC m=+1579.605494209" Mar 18 12:36:34 crc kubenswrapper[4975]: I0318 12:36:34.085539 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-7vq7j"] Mar 18 12:36:34 crc kubenswrapper[4975]: I0318 12:36:34.090394 4975 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" Mar 18 12:36:34 crc kubenswrapper[4975]: I0318 12:36:34.121650 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-7vq7j"] Mar 18 12:36:34 crc kubenswrapper[4975]: I0318 12:36:34.129226 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-7vq7j\" (UID: \"0af5d44d-bf9c-4cd9-9775-30824349df84\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" Mar 18 12:36:34 crc kubenswrapper[4975]: I0318 12:36:34.129259 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-config\") pod \"dnsmasq-dns-89c5cd4d5-7vq7j\" (UID: \"0af5d44d-bf9c-4cd9-9775-30824349df84\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" Mar 18 12:36:34 crc kubenswrapper[4975]: I0318 12:36:34.129312 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfsj6\" (UniqueName: \"kubernetes.io/projected/0af5d44d-bf9c-4cd9-9775-30824349df84-kube-api-access-kfsj6\") pod \"dnsmasq-dns-89c5cd4d5-7vq7j\" (UID: \"0af5d44d-bf9c-4cd9-9775-30824349df84\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" Mar 18 12:36:34 crc kubenswrapper[4975]: I0318 12:36:34.129341 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-7vq7j\" (UID: \"0af5d44d-bf9c-4cd9-9775-30824349df84\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" Mar 18 12:36:34 crc kubenswrapper[4975]: I0318 12:36:34.129379 4975 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-7vq7j\" (UID: \"0af5d44d-bf9c-4cd9-9775-30824349df84\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" Mar 18 12:36:34 crc kubenswrapper[4975]: I0318 12:36:34.129395 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-7vq7j\" (UID: \"0af5d44d-bf9c-4cd9-9775-30824349df84\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" Mar 18 12:36:34 crc kubenswrapper[4975]: I0318 12:36:34.231219 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-7vq7j\" (UID: \"0af5d44d-bf9c-4cd9-9775-30824349df84\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" Mar 18 12:36:34 crc kubenswrapper[4975]: I0318 12:36:34.231271 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-config\") pod \"dnsmasq-dns-89c5cd4d5-7vq7j\" (UID: \"0af5d44d-bf9c-4cd9-9775-30824349df84\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" Mar 18 12:36:34 crc kubenswrapper[4975]: I0318 12:36:34.231322 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfsj6\" (UniqueName: \"kubernetes.io/projected/0af5d44d-bf9c-4cd9-9775-30824349df84-kube-api-access-kfsj6\") pod \"dnsmasq-dns-89c5cd4d5-7vq7j\" (UID: \"0af5d44d-bf9c-4cd9-9775-30824349df84\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" Mar 18 12:36:34 crc kubenswrapper[4975]: I0318 12:36:34.231361 4975 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-7vq7j\" (UID: \"0af5d44d-bf9c-4cd9-9775-30824349df84\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" Mar 18 12:36:34 crc kubenswrapper[4975]: I0318 12:36:34.231396 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-7vq7j\" (UID: \"0af5d44d-bf9c-4cd9-9775-30824349df84\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" Mar 18 12:36:34 crc kubenswrapper[4975]: I0318 12:36:34.231417 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-7vq7j\" (UID: \"0af5d44d-bf9c-4cd9-9775-30824349df84\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" Mar 18 12:36:34 crc kubenswrapper[4975]: I0318 12:36:34.232081 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-7vq7j\" (UID: \"0af5d44d-bf9c-4cd9-9775-30824349df84\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" Mar 18 12:36:34 crc kubenswrapper[4975]: I0318 12:36:34.232131 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-7vq7j\" (UID: \"0af5d44d-bf9c-4cd9-9775-30824349df84\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" Mar 18 12:36:34 crc kubenswrapper[4975]: I0318 12:36:34.232423 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-7vq7j\" (UID: \"0af5d44d-bf9c-4cd9-9775-30824349df84\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" Mar 18 12:36:34 crc kubenswrapper[4975]: I0318 12:36:34.232571 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-7vq7j\" (UID: \"0af5d44d-bf9c-4cd9-9775-30824349df84\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" Mar 18 12:36:34 crc kubenswrapper[4975]: I0318 12:36:34.232643 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-config\") pod \"dnsmasq-dns-89c5cd4d5-7vq7j\" (UID: \"0af5d44d-bf9c-4cd9-9775-30824349df84\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" Mar 18 12:36:34 crc kubenswrapper[4975]: I0318 12:36:34.254788 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfsj6\" (UniqueName: \"kubernetes.io/projected/0af5d44d-bf9c-4cd9-9775-30824349df84-kube-api-access-kfsj6\") pod \"dnsmasq-dns-89c5cd4d5-7vq7j\" (UID: \"0af5d44d-bf9c-4cd9-9775-30824349df84\") " pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" Mar 18 12:36:34 crc kubenswrapper[4975]: I0318 12:36:34.416410 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" Mar 18 12:36:34 crc kubenswrapper[4975]: W0318 12:36:34.906982 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0af5d44d_bf9c_4cd9_9775_30824349df84.slice/crio-3b41fc5d0b9148d807e032790510ce256fac9fc7b56df3762933da034e5b3a1f WatchSource:0}: Error finding container 3b41fc5d0b9148d807e032790510ce256fac9fc7b56df3762933da034e5b3a1f: Status 404 returned error can't find the container with id 3b41fc5d0b9148d807e032790510ce256fac9fc7b56df3762933da034e5b3a1f Mar 18 12:36:34 crc kubenswrapper[4975]: I0318 12:36:34.907443 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-7vq7j"] Mar 18 12:36:35 crc kubenswrapper[4975]: I0318 12:36:35.895214 4975 generic.go:334] "Generic (PLEG): container finished" podID="0af5d44d-bf9c-4cd9-9775-30824349df84" containerID="8535b4a54982e784365789172f26aec03dcd25215182e6f20edf061f0bb417d6" exitCode=0 Mar 18 12:36:35 crc kubenswrapper[4975]: I0318 12:36:35.895279 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" event={"ID":"0af5d44d-bf9c-4cd9-9775-30824349df84","Type":"ContainerDied","Data":"8535b4a54982e784365789172f26aec03dcd25215182e6f20edf061f0bb417d6"} Mar 18 12:36:35 crc kubenswrapper[4975]: I0318 12:36:35.895612 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" event={"ID":"0af5d44d-bf9c-4cd9-9775-30824349df84","Type":"ContainerStarted","Data":"3b41fc5d0b9148d807e032790510ce256fac9fc7b56df3762933da034e5b3a1f"} Mar 18 12:36:36 crc kubenswrapper[4975]: I0318 12:36:36.162998 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:36:36 crc kubenswrapper[4975]: I0318 12:36:36.163750 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="d8862e7b-f638-466a-a1b1-8fa24fef7309" containerName="sg-core" containerID="cri-o://124d2e2517104363e8f9ab34b0b1d98cd22494286dbe7daa1c67c6b3ec0ba82f" gracePeriod=30 Mar 18 12:36:36 crc kubenswrapper[4975]: I0318 12:36:36.163820 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8862e7b-f638-466a-a1b1-8fa24fef7309" containerName="ceilometer-notification-agent" containerID="cri-o://fa27b19dfc9be706f4c8776b4350a5923206f6870cdb7ddd91f8af3798c0f0e0" gracePeriod=30 Mar 18 12:36:36 crc kubenswrapper[4975]: I0318 12:36:36.163705 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8862e7b-f638-466a-a1b1-8fa24fef7309" containerName="ceilometer-central-agent" containerID="cri-o://a2ec6b75355cff0b12414d356a935a6d88d3be020d844ed1a3a688b4d579039e" gracePeriod=30 Mar 18 12:36:36 crc kubenswrapper[4975]: I0318 12:36:36.164084 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8862e7b-f638-466a-a1b1-8fa24fef7309" containerName="proxy-httpd" containerID="cri-o://e8ba2729244676f1858374ecbed1b560718997ed1c67174df2906da1583c0d87" gracePeriod=30 Mar 18 12:36:36 crc kubenswrapper[4975]: I0318 12:36:36.300315 4975 scope.go:117] "RemoveContainer" containerID="cdfad74cb6dd30d81beb5cfd74d9b07938416a0d9d2fcf09ea8dbfd21aa40be4" Mar 18 12:36:36 crc kubenswrapper[4975]: I0318 12:36:36.611784 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:36:36 crc kubenswrapper[4975]: I0318 12:36:36.908435 4975 generic.go:334] "Generic (PLEG): container finished" podID="d8862e7b-f638-466a-a1b1-8fa24fef7309" containerID="e8ba2729244676f1858374ecbed1b560718997ed1c67174df2906da1583c0d87" exitCode=0 Mar 18 12:36:36 crc kubenswrapper[4975]: I0318 12:36:36.908469 4975 generic.go:334] "Generic (PLEG): container finished" podID="d8862e7b-f638-466a-a1b1-8fa24fef7309" 
containerID="124d2e2517104363e8f9ab34b0b1d98cd22494286dbe7daa1c67c6b3ec0ba82f" exitCode=2 Mar 18 12:36:36 crc kubenswrapper[4975]: I0318 12:36:36.908480 4975 generic.go:334] "Generic (PLEG): container finished" podID="d8862e7b-f638-466a-a1b1-8fa24fef7309" containerID="fa27b19dfc9be706f4c8776b4350a5923206f6870cdb7ddd91f8af3798c0f0e0" exitCode=0 Mar 18 12:36:36 crc kubenswrapper[4975]: I0318 12:36:36.908489 4975 generic.go:334] "Generic (PLEG): container finished" podID="d8862e7b-f638-466a-a1b1-8fa24fef7309" containerID="a2ec6b75355cff0b12414d356a935a6d88d3be020d844ed1a3a688b4d579039e" exitCode=0 Mar 18 12:36:36 crc kubenswrapper[4975]: I0318 12:36:36.908541 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8862e7b-f638-466a-a1b1-8fa24fef7309","Type":"ContainerDied","Data":"e8ba2729244676f1858374ecbed1b560718997ed1c67174df2906da1583c0d87"} Mar 18 12:36:36 crc kubenswrapper[4975]: I0318 12:36:36.908571 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8862e7b-f638-466a-a1b1-8fa24fef7309","Type":"ContainerDied","Data":"124d2e2517104363e8f9ab34b0b1d98cd22494286dbe7daa1c67c6b3ec0ba82f"} Mar 18 12:36:36 crc kubenswrapper[4975]: I0318 12:36:36.908585 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8862e7b-f638-466a-a1b1-8fa24fef7309","Type":"ContainerDied","Data":"fa27b19dfc9be706f4c8776b4350a5923206f6870cdb7ddd91f8af3798c0f0e0"} Mar 18 12:36:36 crc kubenswrapper[4975]: I0318 12:36:36.908597 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8862e7b-f638-466a-a1b1-8fa24fef7309","Type":"ContainerDied","Data":"a2ec6b75355cff0b12414d356a935a6d88d3be020d844ed1a3a688b4d579039e"} Mar 18 12:36:36 crc kubenswrapper[4975]: I0318 12:36:36.908609 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d8862e7b-f638-466a-a1b1-8fa24fef7309","Type":"ContainerDied","Data":"d820875e1db7b16b5486889be1ccc0b111294041fb4f50ec3290720568590b33"} Mar 18 12:36:36 crc kubenswrapper[4975]: I0318 12:36:36.908621 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d820875e1db7b16b5486889be1ccc0b111294041fb4f50ec3290720568590b33" Mar 18 12:36:36 crc kubenswrapper[4975]: I0318 12:36:36.911299 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:36:36 crc kubenswrapper[4975]: I0318 12:36:36.914156 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" event={"ID":"0af5d44d-bf9c-4cd9-9775-30824349df84","Type":"ContainerStarted","Data":"c174c09bca5c9841f1c0ab5d5a69891a1ad78d4d328a96c55bda38f79da84298"} Mar 18 12:36:36 crc kubenswrapper[4975]: I0318 12:36:36.914286 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" Mar 18 12:36:36 crc kubenswrapper[4975]: I0318 12:36:36.915254 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bbf0f90d-1780-4665-833b-020aaafbaacd" containerName="nova-api-log" containerID="cri-o://4843ecb1e52f3efe39497b57d5d6637ae88ffd3b5070ca737bc2ed066dedfc48" gracePeriod=30 Mar 18 12:36:36 crc kubenswrapper[4975]: I0318 12:36:36.915393 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bbf0f90d-1780-4665-833b-020aaafbaacd" containerName="nova-api-api" containerID="cri-o://ef72e9d61169a2aaad66521c7e0d13fb51421cabda508c0013d968c503766392" gracePeriod=30 Mar 18 12:36:36 crc kubenswrapper[4975]: I0318 12:36:36.983177 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" podStartSLOduration=2.983157852 podStartE2EDuration="2.983157852s" podCreationTimestamp="2026-03-18 
12:36:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:36:36.975790649 +0000 UTC m=+1582.690191248" watchObservedRunningTime="2026-03-18 12:36:36.983157852 +0000 UTC m=+1582.697558431" Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.101402 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-scripts\") pod \"d8862e7b-f638-466a-a1b1-8fa24fef7309\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.101766 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-sg-core-conf-yaml\") pod \"d8862e7b-f638-466a-a1b1-8fa24fef7309\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.101828 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2tf7\" (UniqueName: \"kubernetes.io/projected/d8862e7b-f638-466a-a1b1-8fa24fef7309-kube-api-access-f2tf7\") pod \"d8862e7b-f638-466a-a1b1-8fa24fef7309\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.101856 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8862e7b-f638-466a-a1b1-8fa24fef7309-log-httpd\") pod \"d8862e7b-f638-466a-a1b1-8fa24fef7309\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.101913 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8862e7b-f638-466a-a1b1-8fa24fef7309-run-httpd\") pod \"d8862e7b-f638-466a-a1b1-8fa24fef7309\" (UID: 
\"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.102153 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-combined-ca-bundle\") pod \"d8862e7b-f638-466a-a1b1-8fa24fef7309\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.102200 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-config-data\") pod \"d8862e7b-f638-466a-a1b1-8fa24fef7309\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.102229 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-ceilometer-tls-certs\") pod \"d8862e7b-f638-466a-a1b1-8fa24fef7309\" (UID: \"d8862e7b-f638-466a-a1b1-8fa24fef7309\") " Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.102625 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8862e7b-f638-466a-a1b1-8fa24fef7309-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d8862e7b-f638-466a-a1b1-8fa24fef7309" (UID: "d8862e7b-f638-466a-a1b1-8fa24fef7309"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.102647 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8862e7b-f638-466a-a1b1-8fa24fef7309-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d8862e7b-f638-466a-a1b1-8fa24fef7309" (UID: "d8862e7b-f638-466a-a1b1-8fa24fef7309"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.102766 4975 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8862e7b-f638-466a-a1b1-8fa24fef7309-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.102785 4975 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8862e7b-f638-466a-a1b1-8fa24fef7309-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.108581 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8862e7b-f638-466a-a1b1-8fa24fef7309-kube-api-access-f2tf7" (OuterVolumeSpecName: "kube-api-access-f2tf7") pod "d8862e7b-f638-466a-a1b1-8fa24fef7309" (UID: "d8862e7b-f638-466a-a1b1-8fa24fef7309"). InnerVolumeSpecName "kube-api-access-f2tf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.111190 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-scripts" (OuterVolumeSpecName: "scripts") pod "d8862e7b-f638-466a-a1b1-8fa24fef7309" (UID: "d8862e7b-f638-466a-a1b1-8fa24fef7309"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.135094 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d8862e7b-f638-466a-a1b1-8fa24fef7309" (UID: "d8862e7b-f638-466a-a1b1-8fa24fef7309"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.159450 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d8862e7b-f638-466a-a1b1-8fa24fef7309" (UID: "d8862e7b-f638-466a-a1b1-8fa24fef7309"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.192155 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8862e7b-f638-466a-a1b1-8fa24fef7309" (UID: "d8862e7b-f638-466a-a1b1-8fa24fef7309"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.205163 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.205200 4975 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.205216 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.205227 4975 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-sg-core-conf-yaml\") on 
node \"crc\" DevicePath \"\"" Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.205238 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2tf7\" (UniqueName: \"kubernetes.io/projected/d8862e7b-f638-466a-a1b1-8fa24fef7309-kube-api-access-f2tf7\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.224961 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-config-data" (OuterVolumeSpecName: "config-data") pod "d8862e7b-f638-466a-a1b1-8fa24fef7309" (UID: "d8862e7b-f638-466a-a1b1-8fa24fef7309"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.305037 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.306226 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8862e7b-f638-466a-a1b1-8fa24fef7309-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.927300 4975 generic.go:334] "Generic (PLEG): container finished" podID="bbf0f90d-1780-4665-833b-020aaafbaacd" containerID="4843ecb1e52f3efe39497b57d5d6637ae88ffd3b5070ca737bc2ed066dedfc48" exitCode=143 Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.927408 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.927833 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bbf0f90d-1780-4665-833b-020aaafbaacd","Type":"ContainerDied","Data":"4843ecb1e52f3efe39497b57d5d6637ae88ffd3b5070ca737bc2ed066dedfc48"} Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.983671 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:36:37 crc kubenswrapper[4975]: I0318 12:36:37.992711 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.007274 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:36:38 crc kubenswrapper[4975]: E0318 12:36:38.007849 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8862e7b-f638-466a-a1b1-8fa24fef7309" containerName="ceilometer-notification-agent" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.007928 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8862e7b-f638-466a-a1b1-8fa24fef7309" containerName="ceilometer-notification-agent" Mar 18 12:36:38 crc kubenswrapper[4975]: E0318 12:36:38.007981 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8862e7b-f638-466a-a1b1-8fa24fef7309" containerName="proxy-httpd" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.008025 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8862e7b-f638-466a-a1b1-8fa24fef7309" containerName="proxy-httpd" Mar 18 12:36:38 crc kubenswrapper[4975]: E0318 12:36:38.008081 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8862e7b-f638-466a-a1b1-8fa24fef7309" containerName="sg-core" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.008127 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8862e7b-f638-466a-a1b1-8fa24fef7309" containerName="sg-core" Mar 18 12:36:38 
crc kubenswrapper[4975]: E0318 12:36:38.008205 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8862e7b-f638-466a-a1b1-8fa24fef7309" containerName="ceilometer-central-agent" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.008251 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8862e7b-f638-466a-a1b1-8fa24fef7309" containerName="ceilometer-central-agent" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.008478 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8862e7b-f638-466a-a1b1-8fa24fef7309" containerName="ceilometer-notification-agent" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.008553 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8862e7b-f638-466a-a1b1-8fa24fef7309" containerName="proxy-httpd" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.008612 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8862e7b-f638-466a-a1b1-8fa24fef7309" containerName="sg-core" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.008660 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8862e7b-f638-466a-a1b1-8fa24fef7309" containerName="ceilometer-central-agent" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.010363 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.012657 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.013073 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.013310 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.023583 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.118429 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1ff504e-305e-4180-ae3b-f1ee98e27726-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a1ff504e-305e-4180-ae3b-f1ee98e27726\") " pod="openstack/ceilometer-0" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.118580 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86hn6\" (UniqueName: \"kubernetes.io/projected/a1ff504e-305e-4180-ae3b-f1ee98e27726-kube-api-access-86hn6\") pod \"ceilometer-0\" (UID: \"a1ff504e-305e-4180-ae3b-f1ee98e27726\") " pod="openstack/ceilometer-0" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.118631 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1ff504e-305e-4180-ae3b-f1ee98e27726-scripts\") pod \"ceilometer-0\" (UID: \"a1ff504e-305e-4180-ae3b-f1ee98e27726\") " pod="openstack/ceilometer-0" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.118659 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1ff504e-305e-4180-ae3b-f1ee98e27726-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a1ff504e-305e-4180-ae3b-f1ee98e27726\") " pod="openstack/ceilometer-0" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.118768 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1ff504e-305e-4180-ae3b-f1ee98e27726-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a1ff504e-305e-4180-ae3b-f1ee98e27726\") " pod="openstack/ceilometer-0" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.118906 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1ff504e-305e-4180-ae3b-f1ee98e27726-log-httpd\") pod \"ceilometer-0\" (UID: \"a1ff504e-305e-4180-ae3b-f1ee98e27726\") " pod="openstack/ceilometer-0" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.119007 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1ff504e-305e-4180-ae3b-f1ee98e27726-run-httpd\") pod \"ceilometer-0\" (UID: \"a1ff504e-305e-4180-ae3b-f1ee98e27726\") " pod="openstack/ceilometer-0" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.119065 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1ff504e-305e-4180-ae3b-f1ee98e27726-config-data\") pod \"ceilometer-0\" (UID: \"a1ff504e-305e-4180-ae3b-f1ee98e27726\") " pod="openstack/ceilometer-0" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.221267 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1ff504e-305e-4180-ae3b-f1ee98e27726-run-httpd\") pod \"ceilometer-0\" (UID: 
\"a1ff504e-305e-4180-ae3b-f1ee98e27726\") " pod="openstack/ceilometer-0" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.221552 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1ff504e-305e-4180-ae3b-f1ee98e27726-config-data\") pod \"ceilometer-0\" (UID: \"a1ff504e-305e-4180-ae3b-f1ee98e27726\") " pod="openstack/ceilometer-0" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.221788 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1ff504e-305e-4180-ae3b-f1ee98e27726-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a1ff504e-305e-4180-ae3b-f1ee98e27726\") " pod="openstack/ceilometer-0" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.221929 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86hn6\" (UniqueName: \"kubernetes.io/projected/a1ff504e-305e-4180-ae3b-f1ee98e27726-kube-api-access-86hn6\") pod \"ceilometer-0\" (UID: \"a1ff504e-305e-4180-ae3b-f1ee98e27726\") " pod="openstack/ceilometer-0" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.222033 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1ff504e-305e-4180-ae3b-f1ee98e27726-scripts\") pod \"ceilometer-0\" (UID: \"a1ff504e-305e-4180-ae3b-f1ee98e27726\") " pod="openstack/ceilometer-0" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.222151 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1ff504e-305e-4180-ae3b-f1ee98e27726-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a1ff504e-305e-4180-ae3b-f1ee98e27726\") " pod="openstack/ceilometer-0" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.222646 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1ff504e-305e-4180-ae3b-f1ee98e27726-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a1ff504e-305e-4180-ae3b-f1ee98e27726\") " pod="openstack/ceilometer-0" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.222800 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1ff504e-305e-4180-ae3b-f1ee98e27726-log-httpd\") pod \"ceilometer-0\" (UID: \"a1ff504e-305e-4180-ae3b-f1ee98e27726\") " pod="openstack/ceilometer-0" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.224245 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1ff504e-305e-4180-ae3b-f1ee98e27726-run-httpd\") pod \"ceilometer-0\" (UID: \"a1ff504e-305e-4180-ae3b-f1ee98e27726\") " pod="openstack/ceilometer-0" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.224240 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1ff504e-305e-4180-ae3b-f1ee98e27726-log-httpd\") pod \"ceilometer-0\" (UID: \"a1ff504e-305e-4180-ae3b-f1ee98e27726\") " pod="openstack/ceilometer-0" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.226536 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1ff504e-305e-4180-ae3b-f1ee98e27726-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a1ff504e-305e-4180-ae3b-f1ee98e27726\") " pod="openstack/ceilometer-0" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.227096 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1ff504e-305e-4180-ae3b-f1ee98e27726-config-data\") pod \"ceilometer-0\" (UID: \"a1ff504e-305e-4180-ae3b-f1ee98e27726\") " pod="openstack/ceilometer-0" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 
12:36:38.227170 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1ff504e-305e-4180-ae3b-f1ee98e27726-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a1ff504e-305e-4180-ae3b-f1ee98e27726\") " pod="openstack/ceilometer-0" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.227591 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1ff504e-305e-4180-ae3b-f1ee98e27726-scripts\") pod \"ceilometer-0\" (UID: \"a1ff504e-305e-4180-ae3b-f1ee98e27726\") " pod="openstack/ceilometer-0" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.239695 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1ff504e-305e-4180-ae3b-f1ee98e27726-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a1ff504e-305e-4180-ae3b-f1ee98e27726\") " pod="openstack/ceilometer-0" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.243356 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86hn6\" (UniqueName: \"kubernetes.io/projected/a1ff504e-305e-4180-ae3b-f1ee98e27726-kube-api-access-86hn6\") pod \"ceilometer-0\" (UID: \"a1ff504e-305e-4180-ae3b-f1ee98e27726\") " pod="openstack/ceilometer-0" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.335554 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.828314 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 12:36:38 crc kubenswrapper[4975]: W0318 12:36:38.835306 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1ff504e_305e_4180_ae3b_f1ee98e27726.slice/crio-f3c1e31e20b6179e702df5a395d95bf0fff9670efb80d6701e32d858ca817314 WatchSource:0}: Error finding container f3c1e31e20b6179e702df5a395d95bf0fff9670efb80d6701e32d858ca817314: Status 404 returned error can't find the container with id f3c1e31e20b6179e702df5a395d95bf0fff9670efb80d6701e32d858ca817314 Mar 18 12:36:38 crc kubenswrapper[4975]: I0318 12:36:38.945225 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1ff504e-305e-4180-ae3b-f1ee98e27726","Type":"ContainerStarted","Data":"f3c1e31e20b6179e702df5a395d95bf0fff9670efb80d6701e32d858ca817314"} Mar 18 12:36:39 crc kubenswrapper[4975]: I0318 12:36:39.026095 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8862e7b-f638-466a-a1b1-8fa24fef7309" path="/var/lib/kubelet/pods/d8862e7b-f638-466a-a1b1-8fa24fef7309/volumes" Mar 18 12:36:39 crc kubenswrapper[4975]: I0318 12:36:39.954999 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1ff504e-305e-4180-ae3b-f1ee98e27726","Type":"ContainerStarted","Data":"f36ff3b947f911531d127f8207bc433c166ac1c39719d6464eb63ed69d6c6dc3"} Mar 18 12:36:40 crc kubenswrapper[4975]: I0318 12:36:40.017458 4975 scope.go:117] "RemoveContainer" containerID="27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" Mar 18 12:36:40 crc kubenswrapper[4975]: E0318 12:36:40.017730 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:36:40 crc kubenswrapper[4975]: I0318 12:36:40.525530 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:36:40 crc kubenswrapper[4975]: I0318 12:36:40.668344 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf0f90d-1780-4665-833b-020aaafbaacd-combined-ca-bundle\") pod \"bbf0f90d-1780-4665-833b-020aaafbaacd\" (UID: \"bbf0f90d-1780-4665-833b-020aaafbaacd\") " Mar 18 12:36:40 crc kubenswrapper[4975]: I0318 12:36:40.668658 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf0f90d-1780-4665-833b-020aaafbaacd-config-data\") pod \"bbf0f90d-1780-4665-833b-020aaafbaacd\" (UID: \"bbf0f90d-1780-4665-833b-020aaafbaacd\") " Mar 18 12:36:40 crc kubenswrapper[4975]: I0318 12:36:40.668834 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbf0f90d-1780-4665-833b-020aaafbaacd-logs\") pod \"bbf0f90d-1780-4665-833b-020aaafbaacd\" (UID: \"bbf0f90d-1780-4665-833b-020aaafbaacd\") " Mar 18 12:36:40 crc kubenswrapper[4975]: I0318 12:36:40.668949 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ql7v\" (UniqueName: \"kubernetes.io/projected/bbf0f90d-1780-4665-833b-020aaafbaacd-kube-api-access-6ql7v\") pod \"bbf0f90d-1780-4665-833b-020aaafbaacd\" (UID: \"bbf0f90d-1780-4665-833b-020aaafbaacd\") " Mar 18 12:36:40 crc kubenswrapper[4975]: I0318 12:36:40.669508 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/bbf0f90d-1780-4665-833b-020aaafbaacd-logs" (OuterVolumeSpecName: "logs") pod "bbf0f90d-1780-4665-833b-020aaafbaacd" (UID: "bbf0f90d-1780-4665-833b-020aaafbaacd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:36:40 crc kubenswrapper[4975]: I0318 12:36:40.674078 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbf0f90d-1780-4665-833b-020aaafbaacd-kube-api-access-6ql7v" (OuterVolumeSpecName: "kube-api-access-6ql7v") pod "bbf0f90d-1780-4665-833b-020aaafbaacd" (UID: "bbf0f90d-1780-4665-833b-020aaafbaacd"). InnerVolumeSpecName "kube-api-access-6ql7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:36:40 crc kubenswrapper[4975]: I0318 12:36:40.695324 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf0f90d-1780-4665-833b-020aaafbaacd-config-data" (OuterVolumeSpecName: "config-data") pod "bbf0f90d-1780-4665-833b-020aaafbaacd" (UID: "bbf0f90d-1780-4665-833b-020aaafbaacd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:40 crc kubenswrapper[4975]: I0318 12:36:40.696167 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf0f90d-1780-4665-833b-020aaafbaacd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbf0f90d-1780-4665-833b-020aaafbaacd" (UID: "bbf0f90d-1780-4665-833b-020aaafbaacd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:40 crc kubenswrapper[4975]: I0318 12:36:40.771507 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ql7v\" (UniqueName: \"kubernetes.io/projected/bbf0f90d-1780-4665-833b-020aaafbaacd-kube-api-access-6ql7v\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:40 crc kubenswrapper[4975]: I0318 12:36:40.771546 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf0f90d-1780-4665-833b-020aaafbaacd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:40 crc kubenswrapper[4975]: I0318 12:36:40.771563 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf0f90d-1780-4665-833b-020aaafbaacd-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:40 crc kubenswrapper[4975]: I0318 12:36:40.771589 4975 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbf0f90d-1780-4665-833b-020aaafbaacd-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:40 crc kubenswrapper[4975]: I0318 12:36:40.971485 4975 generic.go:334] "Generic (PLEG): container finished" podID="bbf0f90d-1780-4665-833b-020aaafbaacd" containerID="ef72e9d61169a2aaad66521c7e0d13fb51421cabda508c0013d968c503766392" exitCode=0 Mar 18 12:36:40 crc kubenswrapper[4975]: I0318 12:36:40.971643 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:36:40 crc kubenswrapper[4975]: I0318 12:36:40.972020 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bbf0f90d-1780-4665-833b-020aaafbaacd","Type":"ContainerDied","Data":"ef72e9d61169a2aaad66521c7e0d13fb51421cabda508c0013d968c503766392"} Mar 18 12:36:40 crc kubenswrapper[4975]: I0318 12:36:40.972077 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bbf0f90d-1780-4665-833b-020aaafbaacd","Type":"ContainerDied","Data":"bff209782cf53492c1e3c6bd2672d2a412c94698989f923af6f1ca63a2a0b0ca"} Mar 18 12:36:40 crc kubenswrapper[4975]: I0318 12:36:40.972101 4975 scope.go:117] "RemoveContainer" containerID="ef72e9d61169a2aaad66521c7e0d13fb51421cabda508c0013d968c503766392" Mar 18 12:36:40 crc kubenswrapper[4975]: I0318 12:36:40.977321 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1ff504e-305e-4180-ae3b-f1ee98e27726","Type":"ContainerStarted","Data":"61b988e8af833be27246cf2a012fe919d2c702cc8d30fcc8c3cc42cfe1ca1c1b"} Mar 18 12:36:40 crc kubenswrapper[4975]: I0318 12:36:40.991737 4975 scope.go:117] "RemoveContainer" containerID="4843ecb1e52f3efe39497b57d5d6637ae88ffd3b5070ca737bc2ed066dedfc48" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.008968 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.029185 4975 scope.go:117] "RemoveContainer" containerID="ef72e9d61169a2aaad66521c7e0d13fb51421cabda508c0013d968c503766392" Mar 18 12:36:41 crc kubenswrapper[4975]: E0318 12:36:41.031319 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef72e9d61169a2aaad66521c7e0d13fb51421cabda508c0013d968c503766392\": container with ID starting with ef72e9d61169a2aaad66521c7e0d13fb51421cabda508c0013d968c503766392 not found: ID does 
not exist" containerID="ef72e9d61169a2aaad66521c7e0d13fb51421cabda508c0013d968c503766392" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.031379 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef72e9d61169a2aaad66521c7e0d13fb51421cabda508c0013d968c503766392"} err="failed to get container status \"ef72e9d61169a2aaad66521c7e0d13fb51421cabda508c0013d968c503766392\": rpc error: code = NotFound desc = could not find container \"ef72e9d61169a2aaad66521c7e0d13fb51421cabda508c0013d968c503766392\": container with ID starting with ef72e9d61169a2aaad66521c7e0d13fb51421cabda508c0013d968c503766392 not found: ID does not exist" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.031413 4975 scope.go:117] "RemoveContainer" containerID="4843ecb1e52f3efe39497b57d5d6637ae88ffd3b5070ca737bc2ed066dedfc48" Mar 18 12:36:41 crc kubenswrapper[4975]: E0318 12:36:41.031758 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4843ecb1e52f3efe39497b57d5d6637ae88ffd3b5070ca737bc2ed066dedfc48\": container with ID starting with 4843ecb1e52f3efe39497b57d5d6637ae88ffd3b5070ca737bc2ed066dedfc48 not found: ID does not exist" containerID="4843ecb1e52f3efe39497b57d5d6637ae88ffd3b5070ca737bc2ed066dedfc48" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.031807 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4843ecb1e52f3efe39497b57d5d6637ae88ffd3b5070ca737bc2ed066dedfc48"} err="failed to get container status \"4843ecb1e52f3efe39497b57d5d6637ae88ffd3b5070ca737bc2ed066dedfc48\": rpc error: code = NotFound desc = could not find container \"4843ecb1e52f3efe39497b57d5d6637ae88ffd3b5070ca737bc2ed066dedfc48\": container with ID starting with 4843ecb1e52f3efe39497b57d5d6637ae88ffd3b5070ca737bc2ed066dedfc48 not found: ID does not exist" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.037518 4975 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.048542 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 12:36:41 crc kubenswrapper[4975]: E0318 12:36:41.049057 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf0f90d-1780-4665-833b-020aaafbaacd" containerName="nova-api-log" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.049081 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf0f90d-1780-4665-833b-020aaafbaacd" containerName="nova-api-log" Mar 18 12:36:41 crc kubenswrapper[4975]: E0318 12:36:41.049129 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf0f90d-1780-4665-833b-020aaafbaacd" containerName="nova-api-api" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.049139 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf0f90d-1780-4665-833b-020aaafbaacd" containerName="nova-api-api" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.049393 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbf0f90d-1780-4665-833b-020aaafbaacd" containerName="nova-api-api" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.049414 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbf0f90d-1780-4665-833b-020aaafbaacd" containerName="nova-api-log" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.050562 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.054318 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.054516 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.054547 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.078917 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.177762 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a08fee-c205-47da-bc62-127fec3be9b7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"37a08fee-c205-47da-bc62-127fec3be9b7\") " pod="openstack/nova-api-0" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.177842 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37a08fee-c205-47da-bc62-127fec3be9b7-config-data\") pod \"nova-api-0\" (UID: \"37a08fee-c205-47da-bc62-127fec3be9b7\") " pod="openstack/nova-api-0" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.177908 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a08fee-c205-47da-bc62-127fec3be9b7-public-tls-certs\") pod \"nova-api-0\" (UID: \"37a08fee-c205-47da-bc62-127fec3be9b7\") " pod="openstack/nova-api-0" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.178119 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/37a08fee-c205-47da-bc62-127fec3be9b7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"37a08fee-c205-47da-bc62-127fec3be9b7\") " pod="openstack/nova-api-0" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.178210 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q94zb\" (UniqueName: \"kubernetes.io/projected/37a08fee-c205-47da-bc62-127fec3be9b7-kube-api-access-q94zb\") pod \"nova-api-0\" (UID: \"37a08fee-c205-47da-bc62-127fec3be9b7\") " pod="openstack/nova-api-0" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.178283 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37a08fee-c205-47da-bc62-127fec3be9b7-logs\") pod \"nova-api-0\" (UID: \"37a08fee-c205-47da-bc62-127fec3be9b7\") " pod="openstack/nova-api-0" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.280007 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a08fee-c205-47da-bc62-127fec3be9b7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"37a08fee-c205-47da-bc62-127fec3be9b7\") " pod="openstack/nova-api-0" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.280366 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q94zb\" (UniqueName: \"kubernetes.io/projected/37a08fee-c205-47da-bc62-127fec3be9b7-kube-api-access-q94zb\") pod \"nova-api-0\" (UID: \"37a08fee-c205-47da-bc62-127fec3be9b7\") " pod="openstack/nova-api-0" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.280436 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37a08fee-c205-47da-bc62-127fec3be9b7-logs\") pod \"nova-api-0\" (UID: \"37a08fee-c205-47da-bc62-127fec3be9b7\") " pod="openstack/nova-api-0" Mar 18 
12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.280467 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a08fee-c205-47da-bc62-127fec3be9b7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"37a08fee-c205-47da-bc62-127fec3be9b7\") " pod="openstack/nova-api-0" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.280490 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37a08fee-c205-47da-bc62-127fec3be9b7-config-data\") pod \"nova-api-0\" (UID: \"37a08fee-c205-47da-bc62-127fec3be9b7\") " pod="openstack/nova-api-0" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.280527 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a08fee-c205-47da-bc62-127fec3be9b7-public-tls-certs\") pod \"nova-api-0\" (UID: \"37a08fee-c205-47da-bc62-127fec3be9b7\") " pod="openstack/nova-api-0" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.281012 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37a08fee-c205-47da-bc62-127fec3be9b7-logs\") pod \"nova-api-0\" (UID: \"37a08fee-c205-47da-bc62-127fec3be9b7\") " pod="openstack/nova-api-0" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.284386 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a08fee-c205-47da-bc62-127fec3be9b7-public-tls-certs\") pod \"nova-api-0\" (UID: \"37a08fee-c205-47da-bc62-127fec3be9b7\") " pod="openstack/nova-api-0" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.284721 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a08fee-c205-47da-bc62-127fec3be9b7-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"37a08fee-c205-47da-bc62-127fec3be9b7\") " pod="openstack/nova-api-0" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.285634 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a08fee-c205-47da-bc62-127fec3be9b7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"37a08fee-c205-47da-bc62-127fec3be9b7\") " pod="openstack/nova-api-0" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.287055 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37a08fee-c205-47da-bc62-127fec3be9b7-config-data\") pod \"nova-api-0\" (UID: \"37a08fee-c205-47da-bc62-127fec3be9b7\") " pod="openstack/nova-api-0" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.305712 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q94zb\" (UniqueName: \"kubernetes.io/projected/37a08fee-c205-47da-bc62-127fec3be9b7-kube-api-access-q94zb\") pod \"nova-api-0\" (UID: \"37a08fee-c205-47da-bc62-127fec3be9b7\") " pod="openstack/nova-api-0" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.382018 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.854529 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.986963 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"37a08fee-c205-47da-bc62-127fec3be9b7","Type":"ContainerStarted","Data":"df6b5ae5e2753158e5656c4539ee17d5df45287376e8f10171ae12546d745187"} Mar 18 12:36:41 crc kubenswrapper[4975]: I0318 12:36:41.992839 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1ff504e-305e-4180-ae3b-f1ee98e27726","Type":"ContainerStarted","Data":"358e1c5174eeb64a0effe40feaf3b1b04ac9b0521ce24b79d50d24b6205e3738"} Mar 18 12:36:42 crc kubenswrapper[4975]: I0318 12:36:42.305565 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:36:42 crc kubenswrapper[4975]: I0318 12:36:42.326686 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:36:43 crc kubenswrapper[4975]: I0318 12:36:43.006738 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"37a08fee-c205-47da-bc62-127fec3be9b7","Type":"ContainerStarted","Data":"ca516b3155c9da471d057489d115c2e6bef88ebe8795fcf12023d49f4b4ba354"} Mar 18 12:36:43 crc kubenswrapper[4975]: I0318 12:36:43.006775 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"37a08fee-c205-47da-bc62-127fec3be9b7","Type":"ContainerStarted","Data":"9d93390abf911311725171f9f519ce9a1a6fbb812a98a7ed5963498942bfa107"} Mar 18 12:36:43 crc kubenswrapper[4975]: I0318 12:36:43.030840 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbf0f90d-1780-4665-833b-020aaafbaacd" path="/var/lib/kubelet/pods/bbf0f90d-1780-4665-833b-020aaafbaacd/volumes" Mar 
18 12:36:43 crc kubenswrapper[4975]: I0318 12:36:43.032235 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 18 12:36:43 crc kubenswrapper[4975]: I0318 12:36:43.032341 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.032322888 podStartE2EDuration="2.032322888s" podCreationTimestamp="2026-03-18 12:36:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:36:43.021836009 +0000 UTC m=+1588.736236588" watchObservedRunningTime="2026-03-18 12:36:43.032322888 +0000 UTC m=+1588.746723467" Mar 18 12:36:43 crc kubenswrapper[4975]: I0318 12:36:43.215097 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-6frd5"] Mar 18 12:36:43 crc kubenswrapper[4975]: I0318 12:36:43.216711 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6frd5" Mar 18 12:36:43 crc kubenswrapper[4975]: I0318 12:36:43.221855 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 18 12:36:43 crc kubenswrapper[4975]: I0318 12:36:43.222087 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 18 12:36:43 crc kubenswrapper[4975]: I0318 12:36:43.224835 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6frd5"] Mar 18 12:36:43 crc kubenswrapper[4975]: I0318 12:36:43.324698 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44e07ebf-56a4-49eb-a2fe-61185a762c7b-config-data\") pod \"nova-cell1-cell-mapping-6frd5\" (UID: \"44e07ebf-56a4-49eb-a2fe-61185a762c7b\") " pod="openstack/nova-cell1-cell-mapping-6frd5" Mar 18 12:36:43 
crc kubenswrapper[4975]: I0318 12:36:43.324740 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmqdd\" (UniqueName: \"kubernetes.io/projected/44e07ebf-56a4-49eb-a2fe-61185a762c7b-kube-api-access-cmqdd\") pod \"nova-cell1-cell-mapping-6frd5\" (UID: \"44e07ebf-56a4-49eb-a2fe-61185a762c7b\") " pod="openstack/nova-cell1-cell-mapping-6frd5" Mar 18 12:36:43 crc kubenswrapper[4975]: I0318 12:36:43.324904 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e07ebf-56a4-49eb-a2fe-61185a762c7b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6frd5\" (UID: \"44e07ebf-56a4-49eb-a2fe-61185a762c7b\") " pod="openstack/nova-cell1-cell-mapping-6frd5" Mar 18 12:36:43 crc kubenswrapper[4975]: I0318 12:36:43.325155 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44e07ebf-56a4-49eb-a2fe-61185a762c7b-scripts\") pod \"nova-cell1-cell-mapping-6frd5\" (UID: \"44e07ebf-56a4-49eb-a2fe-61185a762c7b\") " pod="openstack/nova-cell1-cell-mapping-6frd5" Mar 18 12:36:43 crc kubenswrapper[4975]: I0318 12:36:43.428604 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44e07ebf-56a4-49eb-a2fe-61185a762c7b-scripts\") pod \"nova-cell1-cell-mapping-6frd5\" (UID: \"44e07ebf-56a4-49eb-a2fe-61185a762c7b\") " pod="openstack/nova-cell1-cell-mapping-6frd5" Mar 18 12:36:43 crc kubenswrapper[4975]: I0318 12:36:43.428694 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44e07ebf-56a4-49eb-a2fe-61185a762c7b-config-data\") pod \"nova-cell1-cell-mapping-6frd5\" (UID: \"44e07ebf-56a4-49eb-a2fe-61185a762c7b\") " pod="openstack/nova-cell1-cell-mapping-6frd5" Mar 18 12:36:43 crc 
kubenswrapper[4975]: I0318 12:36:43.428732 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmqdd\" (UniqueName: \"kubernetes.io/projected/44e07ebf-56a4-49eb-a2fe-61185a762c7b-kube-api-access-cmqdd\") pod \"nova-cell1-cell-mapping-6frd5\" (UID: \"44e07ebf-56a4-49eb-a2fe-61185a762c7b\") " pod="openstack/nova-cell1-cell-mapping-6frd5" Mar 18 12:36:43 crc kubenswrapper[4975]: I0318 12:36:43.428835 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e07ebf-56a4-49eb-a2fe-61185a762c7b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6frd5\" (UID: \"44e07ebf-56a4-49eb-a2fe-61185a762c7b\") " pod="openstack/nova-cell1-cell-mapping-6frd5" Mar 18 12:36:43 crc kubenswrapper[4975]: I0318 12:36:43.434409 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e07ebf-56a4-49eb-a2fe-61185a762c7b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6frd5\" (UID: \"44e07ebf-56a4-49eb-a2fe-61185a762c7b\") " pod="openstack/nova-cell1-cell-mapping-6frd5" Mar 18 12:36:43 crc kubenswrapper[4975]: I0318 12:36:43.437664 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44e07ebf-56a4-49eb-a2fe-61185a762c7b-scripts\") pod \"nova-cell1-cell-mapping-6frd5\" (UID: \"44e07ebf-56a4-49eb-a2fe-61185a762c7b\") " pod="openstack/nova-cell1-cell-mapping-6frd5" Mar 18 12:36:43 crc kubenswrapper[4975]: I0318 12:36:43.449614 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44e07ebf-56a4-49eb-a2fe-61185a762c7b-config-data\") pod \"nova-cell1-cell-mapping-6frd5\" (UID: \"44e07ebf-56a4-49eb-a2fe-61185a762c7b\") " pod="openstack/nova-cell1-cell-mapping-6frd5" Mar 18 12:36:43 crc kubenswrapper[4975]: I0318 12:36:43.459453 4975 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmqdd\" (UniqueName: \"kubernetes.io/projected/44e07ebf-56a4-49eb-a2fe-61185a762c7b-kube-api-access-cmqdd\") pod \"nova-cell1-cell-mapping-6frd5\" (UID: \"44e07ebf-56a4-49eb-a2fe-61185a762c7b\") " pod="openstack/nova-cell1-cell-mapping-6frd5" Mar 18 12:36:43 crc kubenswrapper[4975]: I0318 12:36:43.670274 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6frd5" Mar 18 12:36:44 crc kubenswrapper[4975]: I0318 12:36:44.017228 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1ff504e-305e-4180-ae3b-f1ee98e27726","Type":"ContainerStarted","Data":"2b136313b039708e513f2ee9f42847fb448042bc917f354cff090c398865aa73"} Mar 18 12:36:44 crc kubenswrapper[4975]: I0318 12:36:44.017507 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 12:36:44 crc kubenswrapper[4975]: I0318 12:36:44.046167 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.7107962260000003 podStartE2EDuration="7.046150171s" podCreationTimestamp="2026-03-18 12:36:37 +0000 UTC" firstStartedPulling="2026-03-18 12:36:38.838268745 +0000 UTC m=+1584.552669314" lastFinishedPulling="2026-03-18 12:36:43.17362268 +0000 UTC m=+1588.888023259" observedRunningTime="2026-03-18 12:36:44.040752082 +0000 UTC m=+1589.755152661" watchObservedRunningTime="2026-03-18 12:36:44.046150171 +0000 UTC m=+1589.760550750" Mar 18 12:36:44 crc kubenswrapper[4975]: I0318 12:36:44.103418 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6frd5"] Mar 18 12:36:44 crc kubenswrapper[4975]: W0318 12:36:44.107719 4975 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44e07ebf_56a4_49eb_a2fe_61185a762c7b.slice/crio-2cbf775e65c43f86d3a31706d32d1f1bc93f7e84dc15e339f7bd32c2d5c42faf WatchSource:0}: Error finding container 2cbf775e65c43f86d3a31706d32d1f1bc93f7e84dc15e339f7bd32c2d5c42faf: Status 404 returned error can't find the container with id 2cbf775e65c43f86d3a31706d32d1f1bc93f7e84dc15e339f7bd32c2d5c42faf Mar 18 12:36:44 crc kubenswrapper[4975]: I0318 12:36:44.418015 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" Mar 18 12:36:44 crc kubenswrapper[4975]: I0318 12:36:44.484356 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-2zjm2"] Mar 18 12:36:44 crc kubenswrapper[4975]: I0318 12:36:44.484610 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" podUID="e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd" containerName="dnsmasq-dns" containerID="cri-o://98d1dee7a74766150d3a8bb150e66b3661de612b333c11a037de178495b42ace" gracePeriod=10 Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.001373 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.028641 4975 generic.go:334] "Generic (PLEG): container finished" podID="e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd" containerID="98d1dee7a74766150d3a8bb150e66b3661de612b333c11a037de178495b42ace" exitCode=0 Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.028782 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.035007 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" event={"ID":"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd","Type":"ContainerDied","Data":"98d1dee7a74766150d3a8bb150e66b3661de612b333c11a037de178495b42ace"} Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.035067 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-2zjm2" event={"ID":"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd","Type":"ContainerDied","Data":"64ea038b778e65d67ec0d490c320144965a0549968e02253bf9679160a5c75ae"} Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.035097 4975 scope.go:117] "RemoveContainer" containerID="98d1dee7a74766150d3a8bb150e66b3661de612b333c11a037de178495b42ace" Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.035695 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6frd5" event={"ID":"44e07ebf-56a4-49eb-a2fe-61185a762c7b","Type":"ContainerStarted","Data":"166ab5797c69ef8c0c3384dc550fff2a51a45219226508f65e6dc01ac953ab3a"} Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.035813 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6frd5" event={"ID":"44e07ebf-56a4-49eb-a2fe-61185a762c7b","Type":"ContainerStarted","Data":"2cbf775e65c43f86d3a31706d32d1f1bc93f7e84dc15e339f7bd32c2d5c42faf"} Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.059939 4975 scope.go:117] "RemoveContainer" containerID="2e135f359094f15c73b161e76526f3aa998bcdb6c6ce3b283bcbdd3a87ce0ca0" Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.102975 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-6frd5" podStartSLOduration=2.102956508 podStartE2EDuration="2.102956508s" podCreationTimestamp="2026-03-18 12:36:43 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:36:45.102360132 +0000 UTC m=+1590.816760721" watchObservedRunningTime="2026-03-18 12:36:45.102956508 +0000 UTC m=+1590.817357087" Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.110024 4975 scope.go:117] "RemoveContainer" containerID="98d1dee7a74766150d3a8bb150e66b3661de612b333c11a037de178495b42ace" Mar 18 12:36:45 crc kubenswrapper[4975]: E0318 12:36:45.111052 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98d1dee7a74766150d3a8bb150e66b3661de612b333c11a037de178495b42ace\": container with ID starting with 98d1dee7a74766150d3a8bb150e66b3661de612b333c11a037de178495b42ace not found: ID does not exist" containerID="98d1dee7a74766150d3a8bb150e66b3661de612b333c11a037de178495b42ace" Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.111157 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98d1dee7a74766150d3a8bb150e66b3661de612b333c11a037de178495b42ace"} err="failed to get container status \"98d1dee7a74766150d3a8bb150e66b3661de612b333c11a037de178495b42ace\": rpc error: code = NotFound desc = could not find container \"98d1dee7a74766150d3a8bb150e66b3661de612b333c11a037de178495b42ace\": container with ID starting with 98d1dee7a74766150d3a8bb150e66b3661de612b333c11a037de178495b42ace not found: ID does not exist" Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.111248 4975 scope.go:117] "RemoveContainer" containerID="2e135f359094f15c73b161e76526f3aa998bcdb6c6ce3b283bcbdd3a87ce0ca0" Mar 18 12:36:45 crc kubenswrapper[4975]: E0318 12:36:45.113827 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e135f359094f15c73b161e76526f3aa998bcdb6c6ce3b283bcbdd3a87ce0ca0\": container with ID starting with 
2e135f359094f15c73b161e76526f3aa998bcdb6c6ce3b283bcbdd3a87ce0ca0 not found: ID does not exist" containerID="2e135f359094f15c73b161e76526f3aa998bcdb6c6ce3b283bcbdd3a87ce0ca0" Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.114021 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e135f359094f15c73b161e76526f3aa998bcdb6c6ce3b283bcbdd3a87ce0ca0"} err="failed to get container status \"2e135f359094f15c73b161e76526f3aa998bcdb6c6ce3b283bcbdd3a87ce0ca0\": rpc error: code = NotFound desc = could not find container \"2e135f359094f15c73b161e76526f3aa998bcdb6c6ce3b283bcbdd3a87ce0ca0\": container with ID starting with 2e135f359094f15c73b161e76526f3aa998bcdb6c6ce3b283bcbdd3a87ce0ca0 not found: ID does not exist" Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.170609 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-dns-swift-storage-0\") pod \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\" (UID: \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\") " Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.170950 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrn9j\" (UniqueName: \"kubernetes.io/projected/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-kube-api-access-qrn9j\") pod \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\" (UID: \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\") " Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.171053 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-ovsdbserver-sb\") pod \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\" (UID: \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\") " Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.171176 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-dns-svc\") pod \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\" (UID: \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\") " Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.171944 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-config\") pod \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\" (UID: \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\") " Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.172215 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-ovsdbserver-nb\") pod \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\" (UID: \"e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd\") " Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.177652 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-kube-api-access-qrn9j" (OuterVolumeSpecName: "kube-api-access-qrn9j") pod "e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd" (UID: "e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd"). InnerVolumeSpecName "kube-api-access-qrn9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.246717 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd" (UID: "e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.248127 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd" (UID: "e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.252299 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-config" (OuterVolumeSpecName: "config") pod "e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd" (UID: "e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.258222 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd" (UID: "e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.275656 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrn9j\" (UniqueName: \"kubernetes.io/projected/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-kube-api-access-qrn9j\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.275972 4975 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.276051 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.276103 4975 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.276152 4975 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.276430 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd" (UID: "e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.363757 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-2zjm2"] Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.377412 4975 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:45 crc kubenswrapper[4975]: I0318 12:36:45.385227 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-2zjm2"] Mar 18 12:36:47 crc kubenswrapper[4975]: I0318 12:36:47.027023 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd" path="/var/lib/kubelet/pods/e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd/volumes" Mar 18 12:36:50 crc kubenswrapper[4975]: I0318 12:36:50.082474 4975 generic.go:334] "Generic (PLEG): container finished" podID="44e07ebf-56a4-49eb-a2fe-61185a762c7b" containerID="166ab5797c69ef8c0c3384dc550fff2a51a45219226508f65e6dc01ac953ab3a" exitCode=0 Mar 18 12:36:50 crc kubenswrapper[4975]: I0318 12:36:50.082526 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6frd5" event={"ID":"44e07ebf-56a4-49eb-a2fe-61185a762c7b","Type":"ContainerDied","Data":"166ab5797c69ef8c0c3384dc550fff2a51a45219226508f65e6dc01ac953ab3a"} Mar 18 12:36:51 crc kubenswrapper[4975]: I0318 12:36:51.383642 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 12:36:51 crc kubenswrapper[4975]: I0318 12:36:51.383970 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 12:36:51 crc kubenswrapper[4975]: I0318 12:36:51.493613 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6frd5" Mar 18 12:36:51 crc kubenswrapper[4975]: I0318 12:36:51.589000 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e07ebf-56a4-49eb-a2fe-61185a762c7b-combined-ca-bundle\") pod \"44e07ebf-56a4-49eb-a2fe-61185a762c7b\" (UID: \"44e07ebf-56a4-49eb-a2fe-61185a762c7b\") " Mar 18 12:36:51 crc kubenswrapper[4975]: I0318 12:36:51.589212 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44e07ebf-56a4-49eb-a2fe-61185a762c7b-scripts\") pod \"44e07ebf-56a4-49eb-a2fe-61185a762c7b\" (UID: \"44e07ebf-56a4-49eb-a2fe-61185a762c7b\") " Mar 18 12:36:51 crc kubenswrapper[4975]: I0318 12:36:51.589300 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44e07ebf-56a4-49eb-a2fe-61185a762c7b-config-data\") pod \"44e07ebf-56a4-49eb-a2fe-61185a762c7b\" (UID: \"44e07ebf-56a4-49eb-a2fe-61185a762c7b\") " Mar 18 12:36:51 crc kubenswrapper[4975]: I0318 12:36:51.589333 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmqdd\" (UniqueName: \"kubernetes.io/projected/44e07ebf-56a4-49eb-a2fe-61185a762c7b-kube-api-access-cmqdd\") pod \"44e07ebf-56a4-49eb-a2fe-61185a762c7b\" (UID: \"44e07ebf-56a4-49eb-a2fe-61185a762c7b\") " Mar 18 12:36:51 crc kubenswrapper[4975]: I0318 12:36:51.594384 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44e07ebf-56a4-49eb-a2fe-61185a762c7b-kube-api-access-cmqdd" (OuterVolumeSpecName: "kube-api-access-cmqdd") pod "44e07ebf-56a4-49eb-a2fe-61185a762c7b" (UID: "44e07ebf-56a4-49eb-a2fe-61185a762c7b"). InnerVolumeSpecName "kube-api-access-cmqdd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:36:51 crc kubenswrapper[4975]: I0318 12:36:51.594475 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44e07ebf-56a4-49eb-a2fe-61185a762c7b-scripts" (OuterVolumeSpecName: "scripts") pod "44e07ebf-56a4-49eb-a2fe-61185a762c7b" (UID: "44e07ebf-56a4-49eb-a2fe-61185a762c7b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:51 crc kubenswrapper[4975]: I0318 12:36:51.618767 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44e07ebf-56a4-49eb-a2fe-61185a762c7b-config-data" (OuterVolumeSpecName: "config-data") pod "44e07ebf-56a4-49eb-a2fe-61185a762c7b" (UID: "44e07ebf-56a4-49eb-a2fe-61185a762c7b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:51 crc kubenswrapper[4975]: I0318 12:36:51.621537 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44e07ebf-56a4-49eb-a2fe-61185a762c7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44e07ebf-56a4-49eb-a2fe-61185a762c7b" (UID: "44e07ebf-56a4-49eb-a2fe-61185a762c7b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:51 crc kubenswrapper[4975]: I0318 12:36:51.691417 4975 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44e07ebf-56a4-49eb-a2fe-61185a762c7b-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:51 crc kubenswrapper[4975]: I0318 12:36:51.691452 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44e07ebf-56a4-49eb-a2fe-61185a762c7b-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:51 crc kubenswrapper[4975]: I0318 12:36:51.691462 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmqdd\" (UniqueName: \"kubernetes.io/projected/44e07ebf-56a4-49eb-a2fe-61185a762c7b-kube-api-access-cmqdd\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:51 crc kubenswrapper[4975]: I0318 12:36:51.691471 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e07ebf-56a4-49eb-a2fe-61185a762c7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:52 crc kubenswrapper[4975]: I0318 12:36:52.102427 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6frd5" event={"ID":"44e07ebf-56a4-49eb-a2fe-61185a762c7b","Type":"ContainerDied","Data":"2cbf775e65c43f86d3a31706d32d1f1bc93f7e84dc15e339f7bd32c2d5c42faf"} Mar 18 12:36:52 crc kubenswrapper[4975]: I0318 12:36:52.102468 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cbf775e65c43f86d3a31706d32d1f1bc93f7e84dc15e339f7bd32c2d5c42faf" Mar 18 12:36:52 crc kubenswrapper[4975]: I0318 12:36:52.102517 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6frd5" Mar 18 12:36:52 crc kubenswrapper[4975]: I0318 12:36:52.296796 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:36:52 crc kubenswrapper[4975]: I0318 12:36:52.297031 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="37a08fee-c205-47da-bc62-127fec3be9b7" containerName="nova-api-log" containerID="cri-o://9d93390abf911311725171f9f519ce9a1a6fbb812a98a7ed5963498942bfa107" gracePeriod=30 Mar 18 12:36:52 crc kubenswrapper[4975]: I0318 12:36:52.297408 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="37a08fee-c205-47da-bc62-127fec3be9b7" containerName="nova-api-api" containerID="cri-o://ca516b3155c9da471d057489d115c2e6bef88ebe8795fcf12023d49f4b4ba354" gracePeriod=30 Mar 18 12:36:52 crc kubenswrapper[4975]: I0318 12:36:52.310524 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="37a08fee-c205-47da-bc62-127fec3be9b7" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.212:8774/\": EOF" Mar 18 12:36:52 crc kubenswrapper[4975]: I0318 12:36:52.310549 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="37a08fee-c205-47da-bc62-127fec3be9b7" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.212:8774/\": EOF" Mar 18 12:36:52 crc kubenswrapper[4975]: I0318 12:36:52.318045 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:36:52 crc kubenswrapper[4975]: I0318 12:36:52.318257 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d" containerName="nova-scheduler-scheduler" containerID="cri-o://f38549335eef235d6b275539bdbfb2e1791de8d5bb1c3039dd77d107588cbd6d" gracePeriod=30 Mar 
18 12:36:52 crc kubenswrapper[4975]: I0318 12:36:52.350853 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:36:52 crc kubenswrapper[4975]: I0318 12:36:52.351261 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="790e3570-2044-4dea-b894-5222e6d3d2e9" containerName="nova-metadata-log" containerID="cri-o://b3807fde9eac2ceff5338dd8ffff1452492c8ee2dff35186198ee6e05ee13616" gracePeriod=30 Mar 18 12:36:52 crc kubenswrapper[4975]: I0318 12:36:52.351728 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="790e3570-2044-4dea-b894-5222e6d3d2e9" containerName="nova-metadata-metadata" containerID="cri-o://06acc6bac297a959033832efd94492f6989840385401550230eda0a70307a721" gracePeriod=30 Mar 18 12:36:53 crc kubenswrapper[4975]: I0318 12:36:53.022346 4975 scope.go:117] "RemoveContainer" containerID="27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" Mar 18 12:36:53 crc kubenswrapper[4975]: E0318 12:36:53.023402 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:36:53 crc kubenswrapper[4975]: I0318 12:36:53.113799 4975 generic.go:334] "Generic (PLEG): container finished" podID="37a08fee-c205-47da-bc62-127fec3be9b7" containerID="9d93390abf911311725171f9f519ce9a1a6fbb812a98a7ed5963498942bfa107" exitCode=143 Mar 18 12:36:53 crc kubenswrapper[4975]: I0318 12:36:53.113931 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"37a08fee-c205-47da-bc62-127fec3be9b7","Type":"ContainerDied","Data":"9d93390abf911311725171f9f519ce9a1a6fbb812a98a7ed5963498942bfa107"} Mar 18 12:36:53 crc kubenswrapper[4975]: I0318 12:36:53.147544 4975 generic.go:334] "Generic (PLEG): container finished" podID="790e3570-2044-4dea-b894-5222e6d3d2e9" containerID="b3807fde9eac2ceff5338dd8ffff1452492c8ee2dff35186198ee6e05ee13616" exitCode=143 Mar 18 12:36:53 crc kubenswrapper[4975]: I0318 12:36:53.147584 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"790e3570-2044-4dea-b894-5222e6d3d2e9","Type":"ContainerDied","Data":"b3807fde9eac2ceff5338dd8ffff1452492c8ee2dff35186198ee6e05ee13616"} Mar 18 12:36:53 crc kubenswrapper[4975]: I0318 12:36:53.613229 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 12:36:53 crc kubenswrapper[4975]: I0318 12:36:53.753690 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jr9d\" (UniqueName: \"kubernetes.io/projected/1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d-kube-api-access-5jr9d\") pod \"1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d\" (UID: \"1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d\") " Mar 18 12:36:53 crc kubenswrapper[4975]: I0318 12:36:53.755601 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d-combined-ca-bundle\") pod \"1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d\" (UID: \"1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d\") " Mar 18 12:36:53 crc kubenswrapper[4975]: I0318 12:36:53.756195 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d-config-data\") pod \"1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d\" (UID: \"1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d\") " Mar 18 12:36:53 crc kubenswrapper[4975]: 
I0318 12:36:53.777096 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d-kube-api-access-5jr9d" (OuterVolumeSpecName: "kube-api-access-5jr9d") pod "1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d" (UID: "1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d"). InnerVolumeSpecName "kube-api-access-5jr9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:36:53 crc kubenswrapper[4975]: I0318 12:36:53.797676 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d-config-data" (OuterVolumeSpecName: "config-data") pod "1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d" (UID: "1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:53 crc kubenswrapper[4975]: I0318 12:36:53.799174 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d" (UID: "1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:53 crc kubenswrapper[4975]: I0318 12:36:53.867283 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:53 crc kubenswrapper[4975]: I0318 12:36:53.867667 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:53 crc kubenswrapper[4975]: I0318 12:36:53.867742 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jr9d\" (UniqueName: \"kubernetes.io/projected/1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d-kube-api-access-5jr9d\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.157151 4975 generic.go:334] "Generic (PLEG): container finished" podID="1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d" containerID="f38549335eef235d6b275539bdbfb2e1791de8d5bb1c3039dd77d107588cbd6d" exitCode=0 Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.157213 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d","Type":"ContainerDied","Data":"f38549335eef235d6b275539bdbfb2e1791de8d5bb1c3039dd77d107588cbd6d"} Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.157260 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d","Type":"ContainerDied","Data":"bc57be8c0e2afef01d315330ad15cf21b7d2ba9202238b117de4a08c9bfb23e7"} Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.157277 4975 scope.go:117] "RemoveContainer" containerID="f38549335eef235d6b275539bdbfb2e1791de8d5bb1c3039dd77d107588cbd6d" Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.158644 4975 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.185310 4975 scope.go:117] "RemoveContainer" containerID="f38549335eef235d6b275539bdbfb2e1791de8d5bb1c3039dd77d107588cbd6d" Mar 18 12:36:54 crc kubenswrapper[4975]: E0318 12:36:54.186278 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f38549335eef235d6b275539bdbfb2e1791de8d5bb1c3039dd77d107588cbd6d\": container with ID starting with f38549335eef235d6b275539bdbfb2e1791de8d5bb1c3039dd77d107588cbd6d not found: ID does not exist" containerID="f38549335eef235d6b275539bdbfb2e1791de8d5bb1c3039dd77d107588cbd6d" Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.186326 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f38549335eef235d6b275539bdbfb2e1791de8d5bb1c3039dd77d107588cbd6d"} err="failed to get container status \"f38549335eef235d6b275539bdbfb2e1791de8d5bb1c3039dd77d107588cbd6d\": rpc error: code = NotFound desc = could not find container \"f38549335eef235d6b275539bdbfb2e1791de8d5bb1c3039dd77d107588cbd6d\": container with ID starting with f38549335eef235d6b275539bdbfb2e1791de8d5bb1c3039dd77d107588cbd6d not found: ID does not exist" Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.199257 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.220483 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.236976 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:36:54 crc kubenswrapper[4975]: E0318 12:36:54.237367 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44e07ebf-56a4-49eb-a2fe-61185a762c7b" containerName="nova-manage" Mar 18 12:36:54 crc 
kubenswrapper[4975]: I0318 12:36:54.237379 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="44e07ebf-56a4-49eb-a2fe-61185a762c7b" containerName="nova-manage" Mar 18 12:36:54 crc kubenswrapper[4975]: E0318 12:36:54.237394 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d" containerName="nova-scheduler-scheduler" Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.237401 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d" containerName="nova-scheduler-scheduler" Mar 18 12:36:54 crc kubenswrapper[4975]: E0318 12:36:54.237418 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd" containerName="dnsmasq-dns" Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.237423 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd" containerName="dnsmasq-dns" Mar 18 12:36:54 crc kubenswrapper[4975]: E0318 12:36:54.237442 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd" containerName="init" Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.237448 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd" containerName="init" Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.237724 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d" containerName="nova-scheduler-scheduler" Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.237751 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="e022d6ac-a91d-4e0c-9fd0-5a98fb9288dd" containerName="dnsmasq-dns" Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.237762 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="44e07ebf-56a4-49eb-a2fe-61185a762c7b" containerName="nova-manage" Mar 18 12:36:54 crc kubenswrapper[4975]: 
I0318 12:36:54.238774 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.244483 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.248785 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.377277 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf116ab-237e-4381-849a-ce0619c3ee09-config-data\") pod \"nova-scheduler-0\" (UID: \"abf116ab-237e-4381-849a-ce0619c3ee09\") " pod="openstack/nova-scheduler-0" Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.377353 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf116ab-237e-4381-849a-ce0619c3ee09-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"abf116ab-237e-4381-849a-ce0619c3ee09\") " pod="openstack/nova-scheduler-0" Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.377450 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl84d\" (UniqueName: \"kubernetes.io/projected/abf116ab-237e-4381-849a-ce0619c3ee09-kube-api-access-sl84d\") pod \"nova-scheduler-0\" (UID: \"abf116ab-237e-4381-849a-ce0619c3ee09\") " pod="openstack/nova-scheduler-0" Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.479894 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf116ab-237e-4381-849a-ce0619c3ee09-config-data\") pod \"nova-scheduler-0\" (UID: \"abf116ab-237e-4381-849a-ce0619c3ee09\") " pod="openstack/nova-scheduler-0" Mar 18 12:36:54 crc 
kubenswrapper[4975]: I0318 12:36:54.479983 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf116ab-237e-4381-849a-ce0619c3ee09-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"abf116ab-237e-4381-849a-ce0619c3ee09\") " pod="openstack/nova-scheduler-0" Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.480028 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl84d\" (UniqueName: \"kubernetes.io/projected/abf116ab-237e-4381-849a-ce0619c3ee09-kube-api-access-sl84d\") pod \"nova-scheduler-0\" (UID: \"abf116ab-237e-4381-849a-ce0619c3ee09\") " pod="openstack/nova-scheduler-0" Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.484767 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf116ab-237e-4381-849a-ce0619c3ee09-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"abf116ab-237e-4381-849a-ce0619c3ee09\") " pod="openstack/nova-scheduler-0" Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.485401 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf116ab-237e-4381-849a-ce0619c3ee09-config-data\") pod \"nova-scheduler-0\" (UID: \"abf116ab-237e-4381-849a-ce0619c3ee09\") " pod="openstack/nova-scheduler-0" Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.497577 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl84d\" (UniqueName: \"kubernetes.io/projected/abf116ab-237e-4381-849a-ce0619c3ee09-kube-api-access-sl84d\") pod \"nova-scheduler-0\" (UID: \"abf116ab-237e-4381-849a-ce0619c3ee09\") " pod="openstack/nova-scheduler-0" Mar 18 12:36:54 crc kubenswrapper[4975]: I0318 12:36:54.565717 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 12:36:55 crc kubenswrapper[4975]: W0318 12:36:55.024930 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabf116ab_237e_4381_849a_ce0619c3ee09.slice/crio-c31d6ed0f5876d1e4c947254091fdad2ce116292f8e75dff41895f7517b27b52 WatchSource:0}: Error finding container c31d6ed0f5876d1e4c947254091fdad2ce116292f8e75dff41895f7517b27b52: Status 404 returned error can't find the container with id c31d6ed0f5876d1e4c947254091fdad2ce116292f8e75dff41895f7517b27b52 Mar 18 12:36:55 crc kubenswrapper[4975]: I0318 12:36:55.031522 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d" path="/var/lib/kubelet/pods/1c9cc2f1-6af5-46fa-8d2b-9a73dd90c46d/volumes" Mar 18 12:36:55 crc kubenswrapper[4975]: I0318 12:36:55.032952 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 12:36:55 crc kubenswrapper[4975]: I0318 12:36:55.167325 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"abf116ab-237e-4381-849a-ce0619c3ee09","Type":"ContainerStarted","Data":"c31d6ed0f5876d1e4c947254091fdad2ce116292f8e75dff41895f7517b27b52"} Mar 18 12:36:56 crc kubenswrapper[4975]: I0318 12:36:56.181832 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"abf116ab-237e-4381-849a-ce0619c3ee09","Type":"ContainerStarted","Data":"ea365e3401a876cdb8d6213d5f6e0f12bf4f402d261e89acc0c4ccff35c5a91a"} Mar 18 12:36:56 crc kubenswrapper[4975]: I0318 12:36:56.216219 4975 generic.go:334] "Generic (PLEG): container finished" podID="790e3570-2044-4dea-b894-5222e6d3d2e9" containerID="06acc6bac297a959033832efd94492f6989840385401550230eda0a70307a721" exitCode=0 Mar 18 12:36:56 crc kubenswrapper[4975]: I0318 12:36:56.216280 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"790e3570-2044-4dea-b894-5222e6d3d2e9","Type":"ContainerDied","Data":"06acc6bac297a959033832efd94492f6989840385401550230eda0a70307a721"} Mar 18 12:36:56 crc kubenswrapper[4975]: I0318 12:36:56.221584 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.221565808 podStartE2EDuration="2.221565808s" podCreationTimestamp="2026-03-18 12:36:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:36:56.21655073 +0000 UTC m=+1601.930951319" watchObservedRunningTime="2026-03-18 12:36:56.221565808 +0000 UTC m=+1601.935966387" Mar 18 12:36:56 crc kubenswrapper[4975]: I0318 12:36:56.506656 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:36:56 crc kubenswrapper[4975]: I0318 12:36:56.676733 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/790e3570-2044-4dea-b894-5222e6d3d2e9-config-data\") pod \"790e3570-2044-4dea-b894-5222e6d3d2e9\" (UID: \"790e3570-2044-4dea-b894-5222e6d3d2e9\") " Mar 18 12:36:56 crc kubenswrapper[4975]: I0318 12:36:56.677177 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/790e3570-2044-4dea-b894-5222e6d3d2e9-logs\") pod \"790e3570-2044-4dea-b894-5222e6d3d2e9\" (UID: \"790e3570-2044-4dea-b894-5222e6d3d2e9\") " Mar 18 12:36:56 crc kubenswrapper[4975]: I0318 12:36:56.677221 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/790e3570-2044-4dea-b894-5222e6d3d2e9-nova-metadata-tls-certs\") pod \"790e3570-2044-4dea-b894-5222e6d3d2e9\" (UID: \"790e3570-2044-4dea-b894-5222e6d3d2e9\") " Mar 18 12:36:56 crc 
kubenswrapper[4975]: I0318 12:36:56.677255 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/790e3570-2044-4dea-b894-5222e6d3d2e9-combined-ca-bundle\") pod \"790e3570-2044-4dea-b894-5222e6d3d2e9\" (UID: \"790e3570-2044-4dea-b894-5222e6d3d2e9\") " Mar 18 12:36:56 crc kubenswrapper[4975]: I0318 12:36:56.677288 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmczl\" (UniqueName: \"kubernetes.io/projected/790e3570-2044-4dea-b894-5222e6d3d2e9-kube-api-access-rmczl\") pod \"790e3570-2044-4dea-b894-5222e6d3d2e9\" (UID: \"790e3570-2044-4dea-b894-5222e6d3d2e9\") " Mar 18 12:36:56 crc kubenswrapper[4975]: I0318 12:36:56.677521 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/790e3570-2044-4dea-b894-5222e6d3d2e9-logs" (OuterVolumeSpecName: "logs") pod "790e3570-2044-4dea-b894-5222e6d3d2e9" (UID: "790e3570-2044-4dea-b894-5222e6d3d2e9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:36:56 crc kubenswrapper[4975]: I0318 12:36:56.677860 4975 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/790e3570-2044-4dea-b894-5222e6d3d2e9-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:56 crc kubenswrapper[4975]: I0318 12:36:56.684313 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/790e3570-2044-4dea-b894-5222e6d3d2e9-kube-api-access-rmczl" (OuterVolumeSpecName: "kube-api-access-rmczl") pod "790e3570-2044-4dea-b894-5222e6d3d2e9" (UID: "790e3570-2044-4dea-b894-5222e6d3d2e9"). InnerVolumeSpecName "kube-api-access-rmczl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:36:56 crc kubenswrapper[4975]: I0318 12:36:56.708206 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/790e3570-2044-4dea-b894-5222e6d3d2e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "790e3570-2044-4dea-b894-5222e6d3d2e9" (UID: "790e3570-2044-4dea-b894-5222e6d3d2e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:56 crc kubenswrapper[4975]: I0318 12:36:56.722333 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/790e3570-2044-4dea-b894-5222e6d3d2e9-config-data" (OuterVolumeSpecName: "config-data") pod "790e3570-2044-4dea-b894-5222e6d3d2e9" (UID: "790e3570-2044-4dea-b894-5222e6d3d2e9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:56 crc kubenswrapper[4975]: I0318 12:36:56.775137 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/790e3570-2044-4dea-b894-5222e6d3d2e9-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "790e3570-2044-4dea-b894-5222e6d3d2e9" (UID: "790e3570-2044-4dea-b894-5222e6d3d2e9"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:56 crc kubenswrapper[4975]: I0318 12:36:56.780719 4975 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/790e3570-2044-4dea-b894-5222e6d3d2e9-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:56 crc kubenswrapper[4975]: I0318 12:36:56.780763 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/790e3570-2044-4dea-b894-5222e6d3d2e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:56 crc kubenswrapper[4975]: I0318 12:36:56.780776 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmczl\" (UniqueName: \"kubernetes.io/projected/790e3570-2044-4dea-b894-5222e6d3d2e9-kube-api-access-rmczl\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:56 crc kubenswrapper[4975]: I0318 12:36:56.780788 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/790e3570-2044-4dea-b894-5222e6d3d2e9-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.229935 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.230614 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"790e3570-2044-4dea-b894-5222e6d3d2e9","Type":"ContainerDied","Data":"cca40968b1a608371c8dc90139e095d952c4e0f8f2008aba370817e169bb1c99"} Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.230651 4975 scope.go:117] "RemoveContainer" containerID="06acc6bac297a959033832efd94492f6989840385401550230eda0a70307a721" Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.256373 4975 scope.go:117] "RemoveContainer" containerID="b3807fde9eac2ceff5338dd8ffff1452492c8ee2dff35186198ee6e05ee13616" Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.262453 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.271963 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.289742 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:36:57 crc kubenswrapper[4975]: E0318 12:36:57.290231 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="790e3570-2044-4dea-b894-5222e6d3d2e9" containerName="nova-metadata-log" Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.290256 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="790e3570-2044-4dea-b894-5222e6d3d2e9" containerName="nova-metadata-log" Mar 18 12:36:57 crc kubenswrapper[4975]: E0318 12:36:57.290291 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="790e3570-2044-4dea-b894-5222e6d3d2e9" containerName="nova-metadata-metadata" Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.290299 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="790e3570-2044-4dea-b894-5222e6d3d2e9" containerName="nova-metadata-metadata" Mar 18 12:36:57 crc 
kubenswrapper[4975]: I0318 12:36:57.290583 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="790e3570-2044-4dea-b894-5222e6d3d2e9" containerName="nova-metadata-metadata" Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.290613 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="790e3570-2044-4dea-b894-5222e6d3d2e9" containerName="nova-metadata-log" Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.291704 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.294758 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.294919 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.295606 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wrqn\" (UniqueName: \"kubernetes.io/projected/764dae90-1710-4d6b-bee7-67a26f5133a7-kube-api-access-7wrqn\") pod \"nova-metadata-0\" (UID: \"764dae90-1710-4d6b-bee7-67a26f5133a7\") " pod="openstack/nova-metadata-0" Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.295659 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/764dae90-1710-4d6b-bee7-67a26f5133a7-config-data\") pod \"nova-metadata-0\" (UID: \"764dae90-1710-4d6b-bee7-67a26f5133a7\") " pod="openstack/nova-metadata-0" Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.295699 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/764dae90-1710-4d6b-bee7-67a26f5133a7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"764dae90-1710-4d6b-bee7-67a26f5133a7\") " pod="openstack/nova-metadata-0" Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.295720 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/764dae90-1710-4d6b-bee7-67a26f5133a7-logs\") pod \"nova-metadata-0\" (UID: \"764dae90-1710-4d6b-bee7-67a26f5133a7\") " pod="openstack/nova-metadata-0" Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.295743 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/764dae90-1710-4d6b-bee7-67a26f5133a7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"764dae90-1710-4d6b-bee7-67a26f5133a7\") " pod="openstack/nova-metadata-0" Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.302714 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.398301 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wrqn\" (UniqueName: \"kubernetes.io/projected/764dae90-1710-4d6b-bee7-67a26f5133a7-kube-api-access-7wrqn\") pod \"nova-metadata-0\" (UID: \"764dae90-1710-4d6b-bee7-67a26f5133a7\") " pod="openstack/nova-metadata-0" Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.398600 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/764dae90-1710-4d6b-bee7-67a26f5133a7-config-data\") pod \"nova-metadata-0\" (UID: \"764dae90-1710-4d6b-bee7-67a26f5133a7\") " pod="openstack/nova-metadata-0" Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.398840 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/764dae90-1710-4d6b-bee7-67a26f5133a7-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"764dae90-1710-4d6b-bee7-67a26f5133a7\") " pod="openstack/nova-metadata-0" Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.398971 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/764dae90-1710-4d6b-bee7-67a26f5133a7-logs\") pod \"nova-metadata-0\" (UID: \"764dae90-1710-4d6b-bee7-67a26f5133a7\") " pod="openstack/nova-metadata-0" Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.399110 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/764dae90-1710-4d6b-bee7-67a26f5133a7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"764dae90-1710-4d6b-bee7-67a26f5133a7\") " pod="openstack/nova-metadata-0" Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.399420 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/764dae90-1710-4d6b-bee7-67a26f5133a7-logs\") pod \"nova-metadata-0\" (UID: \"764dae90-1710-4d6b-bee7-67a26f5133a7\") " pod="openstack/nova-metadata-0" Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.402368 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/764dae90-1710-4d6b-bee7-67a26f5133a7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"764dae90-1710-4d6b-bee7-67a26f5133a7\") " pod="openstack/nova-metadata-0" Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.406122 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/764dae90-1710-4d6b-bee7-67a26f5133a7-config-data\") pod \"nova-metadata-0\" (UID: \"764dae90-1710-4d6b-bee7-67a26f5133a7\") " pod="openstack/nova-metadata-0" Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.411885 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/764dae90-1710-4d6b-bee7-67a26f5133a7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"764dae90-1710-4d6b-bee7-67a26f5133a7\") " pod="openstack/nova-metadata-0" Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.415835 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wrqn\" (UniqueName: \"kubernetes.io/projected/764dae90-1710-4d6b-bee7-67a26f5133a7-kube-api-access-7wrqn\") pod \"nova-metadata-0\" (UID: \"764dae90-1710-4d6b-bee7-67a26f5133a7\") " pod="openstack/nova-metadata-0" Mar 18 12:36:57 crc kubenswrapper[4975]: I0318 12:36:57.631323 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 12:36:58 crc kubenswrapper[4975]: I0318 12:36:58.118624 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 12:36:58 crc kubenswrapper[4975]: I0318 12:36:58.248715 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"764dae90-1710-4d6b-bee7-67a26f5133a7","Type":"ContainerStarted","Data":"b5996eaa3fc8753ebc0dd9142637e49b4788762d9e7bffc6991b30cdf0f1a731"} Mar 18 12:36:58 crc kubenswrapper[4975]: I0318 12:36:58.251091 4975 generic.go:334] "Generic (PLEG): container finished" podID="37a08fee-c205-47da-bc62-127fec3be9b7" containerID="ca516b3155c9da471d057489d115c2e6bef88ebe8795fcf12023d49f4b4ba354" exitCode=0 Mar 18 12:36:58 crc kubenswrapper[4975]: I0318 12:36:58.251147 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"37a08fee-c205-47da-bc62-127fec3be9b7","Type":"ContainerDied","Data":"ca516b3155c9da471d057489d115c2e6bef88ebe8795fcf12023d49f4b4ba354"} Mar 18 12:36:58 crc kubenswrapper[4975]: I0318 12:36:58.251176 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"37a08fee-c205-47da-bc62-127fec3be9b7","Type":"ContainerDied","Data":"df6b5ae5e2753158e5656c4539ee17d5df45287376e8f10171ae12546d745187"} Mar 18 12:36:58 crc kubenswrapper[4975]: I0318 12:36:58.251187 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df6b5ae5e2753158e5656c4539ee17d5df45287376e8f10171ae12546d745187" Mar 18 12:36:58 crc kubenswrapper[4975]: I0318 12:36:58.353136 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:36:58 crc kubenswrapper[4975]: I0318 12:36:58.517903 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q94zb\" (UniqueName: \"kubernetes.io/projected/37a08fee-c205-47da-bc62-127fec3be9b7-kube-api-access-q94zb\") pod \"37a08fee-c205-47da-bc62-127fec3be9b7\" (UID: \"37a08fee-c205-47da-bc62-127fec3be9b7\") " Mar 18 12:36:58 crc kubenswrapper[4975]: I0318 12:36:58.518237 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a08fee-c205-47da-bc62-127fec3be9b7-combined-ca-bundle\") pod \"37a08fee-c205-47da-bc62-127fec3be9b7\" (UID: \"37a08fee-c205-47da-bc62-127fec3be9b7\") " Mar 18 12:36:58 crc kubenswrapper[4975]: I0318 12:36:58.518300 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37a08fee-c205-47da-bc62-127fec3be9b7-logs\") pod \"37a08fee-c205-47da-bc62-127fec3be9b7\" (UID: \"37a08fee-c205-47da-bc62-127fec3be9b7\") " Mar 18 12:36:58 crc kubenswrapper[4975]: I0318 12:36:58.518360 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37a08fee-c205-47da-bc62-127fec3be9b7-config-data\") pod \"37a08fee-c205-47da-bc62-127fec3be9b7\" (UID: \"37a08fee-c205-47da-bc62-127fec3be9b7\") " Mar 18 12:36:58 crc kubenswrapper[4975]: I0318 
12:36:58.518376 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a08fee-c205-47da-bc62-127fec3be9b7-public-tls-certs\") pod \"37a08fee-c205-47da-bc62-127fec3be9b7\" (UID: \"37a08fee-c205-47da-bc62-127fec3be9b7\") " Mar 18 12:36:58 crc kubenswrapper[4975]: I0318 12:36:58.518406 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a08fee-c205-47da-bc62-127fec3be9b7-internal-tls-certs\") pod \"37a08fee-c205-47da-bc62-127fec3be9b7\" (UID: \"37a08fee-c205-47da-bc62-127fec3be9b7\") " Mar 18 12:36:58 crc kubenswrapper[4975]: I0318 12:36:58.519446 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37a08fee-c205-47da-bc62-127fec3be9b7-logs" (OuterVolumeSpecName: "logs") pod "37a08fee-c205-47da-bc62-127fec3be9b7" (UID: "37a08fee-c205-47da-bc62-127fec3be9b7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:36:58 crc kubenswrapper[4975]: I0318 12:36:58.522771 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37a08fee-c205-47da-bc62-127fec3be9b7-kube-api-access-q94zb" (OuterVolumeSpecName: "kube-api-access-q94zb") pod "37a08fee-c205-47da-bc62-127fec3be9b7" (UID: "37a08fee-c205-47da-bc62-127fec3be9b7"). InnerVolumeSpecName "kube-api-access-q94zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:36:58 crc kubenswrapper[4975]: I0318 12:36:58.543310 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37a08fee-c205-47da-bc62-127fec3be9b7-config-data" (OuterVolumeSpecName: "config-data") pod "37a08fee-c205-47da-bc62-127fec3be9b7" (UID: "37a08fee-c205-47da-bc62-127fec3be9b7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:58 crc kubenswrapper[4975]: I0318 12:36:58.544631 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37a08fee-c205-47da-bc62-127fec3be9b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37a08fee-c205-47da-bc62-127fec3be9b7" (UID: "37a08fee-c205-47da-bc62-127fec3be9b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:58 crc kubenswrapper[4975]: I0318 12:36:58.583046 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37a08fee-c205-47da-bc62-127fec3be9b7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "37a08fee-c205-47da-bc62-127fec3be9b7" (UID: "37a08fee-c205-47da-bc62-127fec3be9b7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:58 crc kubenswrapper[4975]: I0318 12:36:58.594584 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37a08fee-c205-47da-bc62-127fec3be9b7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "37a08fee-c205-47da-bc62-127fec3be9b7" (UID: "37a08fee-c205-47da-bc62-127fec3be9b7"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:36:58 crc kubenswrapper[4975]: I0318 12:36:58.620292 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q94zb\" (UniqueName: \"kubernetes.io/projected/37a08fee-c205-47da-bc62-127fec3be9b7-kube-api-access-q94zb\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:58 crc kubenswrapper[4975]: I0318 12:36:58.620332 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a08fee-c205-47da-bc62-127fec3be9b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:58 crc kubenswrapper[4975]: I0318 12:36:58.620344 4975 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37a08fee-c205-47da-bc62-127fec3be9b7-logs\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:58 crc kubenswrapper[4975]: I0318 12:36:58.620356 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37a08fee-c205-47da-bc62-127fec3be9b7-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:58 crc kubenswrapper[4975]: I0318 12:36:58.620368 4975 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a08fee-c205-47da-bc62-127fec3be9b7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:58 crc kubenswrapper[4975]: I0318 12:36:58.620378 4975 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a08fee-c205-47da-bc62-127fec3be9b7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.027701 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="790e3570-2044-4dea-b894-5222e6d3d2e9" path="/var/lib/kubelet/pods/790e3570-2044-4dea-b894-5222e6d3d2e9/volumes" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.273203 4975 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.274745 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"764dae90-1710-4d6b-bee7-67a26f5133a7","Type":"ContainerStarted","Data":"ef915acff4ed0e7259b89ced23f7076ec48e9f72da5015d74193c9171544d3fd"} Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.275338 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"764dae90-1710-4d6b-bee7-67a26f5133a7","Type":"ContainerStarted","Data":"33a42712eebeb7bd93501bdebb1f783b409f7ddcfd2414774d238921840ebe28"} Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.300108 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.300084577 podStartE2EDuration="2.300084577s" podCreationTimestamp="2026-03-18 12:36:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:36:59.294269156 +0000 UTC m=+1605.008669735" watchObservedRunningTime="2026-03-18 12:36:59.300084577 +0000 UTC m=+1605.014485156" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.317269 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.339267 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.349734 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 12:36:59 crc kubenswrapper[4975]: E0318 12:36:59.350192 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37a08fee-c205-47da-bc62-127fec3be9b7" containerName="nova-api-api" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.350209 4975 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="37a08fee-c205-47da-bc62-127fec3be9b7" containerName="nova-api-api" Mar 18 12:36:59 crc kubenswrapper[4975]: E0318 12:36:59.350238 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37a08fee-c205-47da-bc62-127fec3be9b7" containerName="nova-api-log" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.350244 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a08fee-c205-47da-bc62-127fec3be9b7" containerName="nova-api-log" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.350403 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="37a08fee-c205-47da-bc62-127fec3be9b7" containerName="nova-api-log" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.350424 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="37a08fee-c205-47da-bc62-127fec3be9b7" containerName="nova-api-api" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.351403 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.355293 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.355576 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.355683 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.409903 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.437181 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84723d07-c2d7-457f-8627-3420d0a1d3ae-config-data\") pod \"nova-api-0\" (UID: 
\"84723d07-c2d7-457f-8627-3420d0a1d3ae\") " pod="openstack/nova-api-0" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.437275 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84723d07-c2d7-457f-8627-3420d0a1d3ae-logs\") pod \"nova-api-0\" (UID: \"84723d07-c2d7-457f-8627-3420d0a1d3ae\") " pod="openstack/nova-api-0" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.437327 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jglmw\" (UniqueName: \"kubernetes.io/projected/84723d07-c2d7-457f-8627-3420d0a1d3ae-kube-api-access-jglmw\") pod \"nova-api-0\" (UID: \"84723d07-c2d7-457f-8627-3420d0a1d3ae\") " pod="openstack/nova-api-0" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.437359 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84723d07-c2d7-457f-8627-3420d0a1d3ae-internal-tls-certs\") pod \"nova-api-0\" (UID: \"84723d07-c2d7-457f-8627-3420d0a1d3ae\") " pod="openstack/nova-api-0" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.437389 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84723d07-c2d7-457f-8627-3420d0a1d3ae-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"84723d07-c2d7-457f-8627-3420d0a1d3ae\") " pod="openstack/nova-api-0" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.437421 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84723d07-c2d7-457f-8627-3420d0a1d3ae-public-tls-certs\") pod \"nova-api-0\" (UID: \"84723d07-c2d7-457f-8627-3420d0a1d3ae\") " pod="openstack/nova-api-0" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.538982 
4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84723d07-c2d7-457f-8627-3420d0a1d3ae-logs\") pod \"nova-api-0\" (UID: \"84723d07-c2d7-457f-8627-3420d0a1d3ae\") " pod="openstack/nova-api-0" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.539074 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jglmw\" (UniqueName: \"kubernetes.io/projected/84723d07-c2d7-457f-8627-3420d0a1d3ae-kube-api-access-jglmw\") pod \"nova-api-0\" (UID: \"84723d07-c2d7-457f-8627-3420d0a1d3ae\") " pod="openstack/nova-api-0" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.539114 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84723d07-c2d7-457f-8627-3420d0a1d3ae-internal-tls-certs\") pod \"nova-api-0\" (UID: \"84723d07-c2d7-457f-8627-3420d0a1d3ae\") " pod="openstack/nova-api-0" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.539149 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84723d07-c2d7-457f-8627-3420d0a1d3ae-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"84723d07-c2d7-457f-8627-3420d0a1d3ae\") " pod="openstack/nova-api-0" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.539186 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84723d07-c2d7-457f-8627-3420d0a1d3ae-public-tls-certs\") pod \"nova-api-0\" (UID: \"84723d07-c2d7-457f-8627-3420d0a1d3ae\") " pod="openstack/nova-api-0" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.539272 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84723d07-c2d7-457f-8627-3420d0a1d3ae-config-data\") pod \"nova-api-0\" (UID: 
\"84723d07-c2d7-457f-8627-3420d0a1d3ae\") " pod="openstack/nova-api-0" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.539390 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84723d07-c2d7-457f-8627-3420d0a1d3ae-logs\") pod \"nova-api-0\" (UID: \"84723d07-c2d7-457f-8627-3420d0a1d3ae\") " pod="openstack/nova-api-0" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.548684 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84723d07-c2d7-457f-8627-3420d0a1d3ae-internal-tls-certs\") pod \"nova-api-0\" (UID: \"84723d07-c2d7-457f-8627-3420d0a1d3ae\") " pod="openstack/nova-api-0" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.548705 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84723d07-c2d7-457f-8627-3420d0a1d3ae-public-tls-certs\") pod \"nova-api-0\" (UID: \"84723d07-c2d7-457f-8627-3420d0a1d3ae\") " pod="openstack/nova-api-0" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.549273 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84723d07-c2d7-457f-8627-3420d0a1d3ae-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"84723d07-c2d7-457f-8627-3420d0a1d3ae\") " pod="openstack/nova-api-0" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.550563 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84723d07-c2d7-457f-8627-3420d0a1d3ae-config-data\") pod \"nova-api-0\" (UID: \"84723d07-c2d7-457f-8627-3420d0a1d3ae\") " pod="openstack/nova-api-0" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.554403 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jglmw\" (UniqueName: 
\"kubernetes.io/projected/84723d07-c2d7-457f-8627-3420d0a1d3ae-kube-api-access-jglmw\") pod \"nova-api-0\" (UID: \"84723d07-c2d7-457f-8627-3420d0a1d3ae\") " pod="openstack/nova-api-0" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.566200 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 12:36:59 crc kubenswrapper[4975]: I0318 12:36:59.732847 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 12:37:00 crc kubenswrapper[4975]: W0318 12:37:00.178454 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84723d07_c2d7_457f_8627_3420d0a1d3ae.slice/crio-8ceef02f1b9d1d8cb74f40bb958aa784466471f84e2c8ae34a4df66b8fb39bfd WatchSource:0}: Error finding container 8ceef02f1b9d1d8cb74f40bb958aa784466471f84e2c8ae34a4df66b8fb39bfd: Status 404 returned error can't find the container with id 8ceef02f1b9d1d8cb74f40bb958aa784466471f84e2c8ae34a4df66b8fb39bfd Mar 18 12:37:00 crc kubenswrapper[4975]: I0318 12:37:00.181702 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 12:37:00 crc kubenswrapper[4975]: I0318 12:37:00.386343 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fd4tz"] Mar 18 12:37:00 crc kubenswrapper[4975]: I0318 12:37:00.389510 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"84723d07-c2d7-457f-8627-3420d0a1d3ae","Type":"ContainerStarted","Data":"8ceef02f1b9d1d8cb74f40bb958aa784466471f84e2c8ae34a4df66b8fb39bfd"} Mar 18 12:37:00 crc kubenswrapper[4975]: I0318 12:37:00.389637 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fd4tz" Mar 18 12:37:00 crc kubenswrapper[4975]: I0318 12:37:00.423624 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fd4tz"] Mar 18 12:37:00 crc kubenswrapper[4975]: I0318 12:37:00.459499 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7psn6\" (UniqueName: \"kubernetes.io/projected/dbf23fd2-a15c-4793-8c35-e6a15826b510-kube-api-access-7psn6\") pod \"community-operators-fd4tz\" (UID: \"dbf23fd2-a15c-4793-8c35-e6a15826b510\") " pod="openshift-marketplace/community-operators-fd4tz" Mar 18 12:37:00 crc kubenswrapper[4975]: I0318 12:37:00.459567 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbf23fd2-a15c-4793-8c35-e6a15826b510-utilities\") pod \"community-operators-fd4tz\" (UID: \"dbf23fd2-a15c-4793-8c35-e6a15826b510\") " pod="openshift-marketplace/community-operators-fd4tz" Mar 18 12:37:00 crc kubenswrapper[4975]: I0318 12:37:00.459597 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbf23fd2-a15c-4793-8c35-e6a15826b510-catalog-content\") pod \"community-operators-fd4tz\" (UID: \"dbf23fd2-a15c-4793-8c35-e6a15826b510\") " pod="openshift-marketplace/community-operators-fd4tz" Mar 18 12:37:00 crc kubenswrapper[4975]: I0318 12:37:00.561244 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7psn6\" (UniqueName: \"kubernetes.io/projected/dbf23fd2-a15c-4793-8c35-e6a15826b510-kube-api-access-7psn6\") pod \"community-operators-fd4tz\" (UID: \"dbf23fd2-a15c-4793-8c35-e6a15826b510\") " pod="openshift-marketplace/community-operators-fd4tz" Mar 18 12:37:00 crc kubenswrapper[4975]: I0318 12:37:00.561335 4975 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbf23fd2-a15c-4793-8c35-e6a15826b510-utilities\") pod \"community-operators-fd4tz\" (UID: \"dbf23fd2-a15c-4793-8c35-e6a15826b510\") " pod="openshift-marketplace/community-operators-fd4tz" Mar 18 12:37:00 crc kubenswrapper[4975]: I0318 12:37:00.561363 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbf23fd2-a15c-4793-8c35-e6a15826b510-catalog-content\") pod \"community-operators-fd4tz\" (UID: \"dbf23fd2-a15c-4793-8c35-e6a15826b510\") " pod="openshift-marketplace/community-operators-fd4tz" Mar 18 12:37:00 crc kubenswrapper[4975]: I0318 12:37:00.561808 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbf23fd2-a15c-4793-8c35-e6a15826b510-catalog-content\") pod \"community-operators-fd4tz\" (UID: \"dbf23fd2-a15c-4793-8c35-e6a15826b510\") " pod="openshift-marketplace/community-operators-fd4tz" Mar 18 12:37:00 crc kubenswrapper[4975]: I0318 12:37:00.562052 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbf23fd2-a15c-4793-8c35-e6a15826b510-utilities\") pod \"community-operators-fd4tz\" (UID: \"dbf23fd2-a15c-4793-8c35-e6a15826b510\") " pod="openshift-marketplace/community-operators-fd4tz" Mar 18 12:37:00 crc kubenswrapper[4975]: I0318 12:37:00.579703 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7psn6\" (UniqueName: \"kubernetes.io/projected/dbf23fd2-a15c-4793-8c35-e6a15826b510-kube-api-access-7psn6\") pod \"community-operators-fd4tz\" (UID: \"dbf23fd2-a15c-4793-8c35-e6a15826b510\") " pod="openshift-marketplace/community-operators-fd4tz" Mar 18 12:37:00 crc kubenswrapper[4975]: I0318 12:37:00.708751 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fd4tz" Mar 18 12:37:01 crc kubenswrapper[4975]: I0318 12:37:01.029261 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37a08fee-c205-47da-bc62-127fec3be9b7" path="/var/lib/kubelet/pods/37a08fee-c205-47da-bc62-127fec3be9b7/volumes" Mar 18 12:37:01 crc kubenswrapper[4975]: I0318 12:37:01.203026 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fd4tz"] Mar 18 12:37:01 crc kubenswrapper[4975]: I0318 12:37:01.414540 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fd4tz" event={"ID":"dbf23fd2-a15c-4793-8c35-e6a15826b510","Type":"ContainerStarted","Data":"053c65210389cae104b4c88362c52ff7ebe1e205cefe503282492e3cee0a0072"} Mar 18 12:37:01 crc kubenswrapper[4975]: I0318 12:37:01.425558 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"84723d07-c2d7-457f-8627-3420d0a1d3ae","Type":"ContainerStarted","Data":"3ad1c9e56a730774da326c3689c2a115e0b68a41fafb762737022a9bbd2ffccb"} Mar 18 12:37:01 crc kubenswrapper[4975]: I0318 12:37:01.425807 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"84723d07-c2d7-457f-8627-3420d0a1d3ae","Type":"ContainerStarted","Data":"2cd03459ff1857919333a57e6f4a6e9d7c186d39916aadab9086ffd032d2c045"} Mar 18 12:37:02 crc kubenswrapper[4975]: I0318 12:37:02.435683 4975 generic.go:334] "Generic (PLEG): container finished" podID="dbf23fd2-a15c-4793-8c35-e6a15826b510" containerID="61a8acbc0949166032112fa538ec1505506948ae17b8bed11f1924cc8627bcda" exitCode=0 Mar 18 12:37:02 crc kubenswrapper[4975]: I0318 12:37:02.435785 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fd4tz" event={"ID":"dbf23fd2-a15c-4793-8c35-e6a15826b510","Type":"ContainerDied","Data":"61a8acbc0949166032112fa538ec1505506948ae17b8bed11f1924cc8627bcda"} 
Mar 18 12:37:02 crc kubenswrapper[4975]: I0318 12:37:02.456193 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.456160671 podStartE2EDuration="3.456160671s" podCreationTimestamp="2026-03-18 12:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:37:01.444671953 +0000 UTC m=+1607.159072532" watchObservedRunningTime="2026-03-18 12:37:02.456160671 +0000 UTC m=+1608.170561250" Mar 18 12:37:03 crc kubenswrapper[4975]: I0318 12:37:03.449573 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fd4tz" event={"ID":"dbf23fd2-a15c-4793-8c35-e6a15826b510","Type":"ContainerStarted","Data":"bfd212b468d0ef82dd8375d26be880db68e7902b2bf53887b55dedc8f9e5e174"} Mar 18 12:37:04 crc kubenswrapper[4975]: I0318 12:37:04.462297 4975 generic.go:334] "Generic (PLEG): container finished" podID="dbf23fd2-a15c-4793-8c35-e6a15826b510" containerID="bfd212b468d0ef82dd8375d26be880db68e7902b2bf53887b55dedc8f9e5e174" exitCode=0 Mar 18 12:37:04 crc kubenswrapper[4975]: I0318 12:37:04.462479 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fd4tz" event={"ID":"dbf23fd2-a15c-4793-8c35-e6a15826b510","Type":"ContainerDied","Data":"bfd212b468d0ef82dd8375d26be880db68e7902b2bf53887b55dedc8f9e5e174"} Mar 18 12:37:04 crc kubenswrapper[4975]: I0318 12:37:04.566227 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 12:37:04 crc kubenswrapper[4975]: I0318 12:37:04.591190 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 12:37:05 crc kubenswrapper[4975]: I0318 12:37:05.477466 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fd4tz" 
event={"ID":"dbf23fd2-a15c-4793-8c35-e6a15826b510","Type":"ContainerStarted","Data":"d6b1d4329bb3c539cfd0d292697e8ee51c14b4013fd01b2f8e4e7e066992dfbd"} Mar 18 12:37:05 crc kubenswrapper[4975]: I0318 12:37:05.499914 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fd4tz" podStartSLOduration=2.882794842 podStartE2EDuration="5.499880372s" podCreationTimestamp="2026-03-18 12:37:00 +0000 UTC" firstStartedPulling="2026-03-18 12:37:02.437577279 +0000 UTC m=+1608.151977858" lastFinishedPulling="2026-03-18 12:37:05.054662809 +0000 UTC m=+1610.769063388" observedRunningTime="2026-03-18 12:37:05.495777208 +0000 UTC m=+1611.210177787" watchObservedRunningTime="2026-03-18 12:37:05.499880372 +0000 UTC m=+1611.214280951" Mar 18 12:37:05 crc kubenswrapper[4975]: I0318 12:37:05.507514 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 12:37:06 crc kubenswrapper[4975]: I0318 12:37:06.017037 4975 scope.go:117] "RemoveContainer" containerID="27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" Mar 18 12:37:06 crc kubenswrapper[4975]: E0318 12:37:06.017313 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:37:07 crc kubenswrapper[4975]: I0318 12:37:07.633238 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 12:37:07 crc kubenswrapper[4975]: I0318 12:37:07.633606 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 12:37:08 crc 
kubenswrapper[4975]: I0318 12:37:08.367370 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 12:37:08 crc kubenswrapper[4975]: I0318 12:37:08.645090 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="764dae90-1710-4d6b-bee7-67a26f5133a7" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 12:37:08 crc kubenswrapper[4975]: I0318 12:37:08.645092 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="764dae90-1710-4d6b-bee7-67a26f5133a7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 12:37:09 crc kubenswrapper[4975]: I0318 12:37:09.733597 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 12:37:09 crc kubenswrapper[4975]: I0318 12:37:09.733907 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 12:37:10 crc kubenswrapper[4975]: I0318 12:37:10.709574 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fd4tz" Mar 18 12:37:10 crc kubenswrapper[4975]: I0318 12:37:10.710135 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fd4tz" Mar 18 12:37:10 crc kubenswrapper[4975]: I0318 12:37:10.747128 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="84723d07-c2d7-457f-8627-3420d0a1d3ae" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.216:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 12:37:10 crc 
kubenswrapper[4975]: I0318 12:37:10.747149 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="84723d07-c2d7-457f-8627-3420d0a1d3ae" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.216:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 12:37:10 crc kubenswrapper[4975]: I0318 12:37:10.765166 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fd4tz" Mar 18 12:37:11 crc kubenswrapper[4975]: I0318 12:37:11.607555 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fd4tz" Mar 18 12:37:11 crc kubenswrapper[4975]: I0318 12:37:11.667362 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fd4tz"] Mar 18 12:37:13 crc kubenswrapper[4975]: I0318 12:37:13.546995 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fd4tz" podUID="dbf23fd2-a15c-4793-8c35-e6a15826b510" containerName="registry-server" containerID="cri-o://d6b1d4329bb3c539cfd0d292697e8ee51c14b4013fd01b2f8e4e7e066992dfbd" gracePeriod=2 Mar 18 12:37:14 crc kubenswrapper[4975]: I0318 12:37:14.016419 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fd4tz" Mar 18 12:37:14 crc kubenswrapper[4975]: I0318 12:37:14.028391 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7psn6\" (UniqueName: \"kubernetes.io/projected/dbf23fd2-a15c-4793-8c35-e6a15826b510-kube-api-access-7psn6\") pod \"dbf23fd2-a15c-4793-8c35-e6a15826b510\" (UID: \"dbf23fd2-a15c-4793-8c35-e6a15826b510\") " Mar 18 12:37:14 crc kubenswrapper[4975]: I0318 12:37:14.028525 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbf23fd2-a15c-4793-8c35-e6a15826b510-utilities\") pod \"dbf23fd2-a15c-4793-8c35-e6a15826b510\" (UID: \"dbf23fd2-a15c-4793-8c35-e6a15826b510\") " Mar 18 12:37:14 crc kubenswrapper[4975]: I0318 12:37:14.028585 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbf23fd2-a15c-4793-8c35-e6a15826b510-catalog-content\") pod \"dbf23fd2-a15c-4793-8c35-e6a15826b510\" (UID: \"dbf23fd2-a15c-4793-8c35-e6a15826b510\") " Mar 18 12:37:14 crc kubenswrapper[4975]: I0318 12:37:14.029010 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbf23fd2-a15c-4793-8c35-e6a15826b510-utilities" (OuterVolumeSpecName: "utilities") pod "dbf23fd2-a15c-4793-8c35-e6a15826b510" (UID: "dbf23fd2-a15c-4793-8c35-e6a15826b510"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:37:14 crc kubenswrapper[4975]: I0318 12:37:14.029313 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbf23fd2-a15c-4793-8c35-e6a15826b510-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:14 crc kubenswrapper[4975]: I0318 12:37:14.070431 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbf23fd2-a15c-4793-8c35-e6a15826b510-kube-api-access-7psn6" (OuterVolumeSpecName: "kube-api-access-7psn6") pod "dbf23fd2-a15c-4793-8c35-e6a15826b510" (UID: "dbf23fd2-a15c-4793-8c35-e6a15826b510"). InnerVolumeSpecName "kube-api-access-7psn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:37:14 crc kubenswrapper[4975]: I0318 12:37:14.075907 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbf23fd2-a15c-4793-8c35-e6a15826b510-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dbf23fd2-a15c-4793-8c35-e6a15826b510" (UID: "dbf23fd2-a15c-4793-8c35-e6a15826b510"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:37:14 crc kubenswrapper[4975]: I0318 12:37:14.130993 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7psn6\" (UniqueName: \"kubernetes.io/projected/dbf23fd2-a15c-4793-8c35-e6a15826b510-kube-api-access-7psn6\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:14 crc kubenswrapper[4975]: I0318 12:37:14.131072 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbf23fd2-a15c-4793-8c35-e6a15826b510-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:14 crc kubenswrapper[4975]: I0318 12:37:14.561413 4975 generic.go:334] "Generic (PLEG): container finished" podID="dbf23fd2-a15c-4793-8c35-e6a15826b510" containerID="d6b1d4329bb3c539cfd0d292697e8ee51c14b4013fd01b2f8e4e7e066992dfbd" exitCode=0 Mar 18 12:37:14 crc kubenswrapper[4975]: I0318 12:37:14.561476 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fd4tz" event={"ID":"dbf23fd2-a15c-4793-8c35-e6a15826b510","Type":"ContainerDied","Data":"d6b1d4329bb3c539cfd0d292697e8ee51c14b4013fd01b2f8e4e7e066992dfbd"} Mar 18 12:37:14 crc kubenswrapper[4975]: I0318 12:37:14.561658 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fd4tz" event={"ID":"dbf23fd2-a15c-4793-8c35-e6a15826b510","Type":"ContainerDied","Data":"053c65210389cae104b4c88362c52ff7ebe1e205cefe503282492e3cee0a0072"} Mar 18 12:37:14 crc kubenswrapper[4975]: I0318 12:37:14.561681 4975 scope.go:117] "RemoveContainer" containerID="d6b1d4329bb3c539cfd0d292697e8ee51c14b4013fd01b2f8e4e7e066992dfbd" Mar 18 12:37:14 crc kubenswrapper[4975]: I0318 12:37:14.561571 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fd4tz" Mar 18 12:37:14 crc kubenswrapper[4975]: I0318 12:37:14.595862 4975 scope.go:117] "RemoveContainer" containerID="bfd212b468d0ef82dd8375d26be880db68e7902b2bf53887b55dedc8f9e5e174" Mar 18 12:37:14 crc kubenswrapper[4975]: I0318 12:37:14.634913 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fd4tz"] Mar 18 12:37:14 crc kubenswrapper[4975]: I0318 12:37:14.636555 4975 scope.go:117] "RemoveContainer" containerID="61a8acbc0949166032112fa538ec1505506948ae17b8bed11f1924cc8627bcda" Mar 18 12:37:14 crc kubenswrapper[4975]: I0318 12:37:14.648359 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fd4tz"] Mar 18 12:37:14 crc kubenswrapper[4975]: I0318 12:37:14.706324 4975 scope.go:117] "RemoveContainer" containerID="d6b1d4329bb3c539cfd0d292697e8ee51c14b4013fd01b2f8e4e7e066992dfbd" Mar 18 12:37:14 crc kubenswrapper[4975]: E0318 12:37:14.707003 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6b1d4329bb3c539cfd0d292697e8ee51c14b4013fd01b2f8e4e7e066992dfbd\": container with ID starting with d6b1d4329bb3c539cfd0d292697e8ee51c14b4013fd01b2f8e4e7e066992dfbd not found: ID does not exist" containerID="d6b1d4329bb3c539cfd0d292697e8ee51c14b4013fd01b2f8e4e7e066992dfbd" Mar 18 12:37:14 crc kubenswrapper[4975]: I0318 12:37:14.707052 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6b1d4329bb3c539cfd0d292697e8ee51c14b4013fd01b2f8e4e7e066992dfbd"} err="failed to get container status \"d6b1d4329bb3c539cfd0d292697e8ee51c14b4013fd01b2f8e4e7e066992dfbd\": rpc error: code = NotFound desc = could not find container \"d6b1d4329bb3c539cfd0d292697e8ee51c14b4013fd01b2f8e4e7e066992dfbd\": container with ID starting with d6b1d4329bb3c539cfd0d292697e8ee51c14b4013fd01b2f8e4e7e066992dfbd not 
found: ID does not exist" Mar 18 12:37:14 crc kubenswrapper[4975]: I0318 12:37:14.707080 4975 scope.go:117] "RemoveContainer" containerID="bfd212b468d0ef82dd8375d26be880db68e7902b2bf53887b55dedc8f9e5e174" Mar 18 12:37:14 crc kubenswrapper[4975]: E0318 12:37:14.707560 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfd212b468d0ef82dd8375d26be880db68e7902b2bf53887b55dedc8f9e5e174\": container with ID starting with bfd212b468d0ef82dd8375d26be880db68e7902b2bf53887b55dedc8f9e5e174 not found: ID does not exist" containerID="bfd212b468d0ef82dd8375d26be880db68e7902b2bf53887b55dedc8f9e5e174" Mar 18 12:37:14 crc kubenswrapper[4975]: I0318 12:37:14.707627 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd212b468d0ef82dd8375d26be880db68e7902b2bf53887b55dedc8f9e5e174"} err="failed to get container status \"bfd212b468d0ef82dd8375d26be880db68e7902b2bf53887b55dedc8f9e5e174\": rpc error: code = NotFound desc = could not find container \"bfd212b468d0ef82dd8375d26be880db68e7902b2bf53887b55dedc8f9e5e174\": container with ID starting with bfd212b468d0ef82dd8375d26be880db68e7902b2bf53887b55dedc8f9e5e174 not found: ID does not exist" Mar 18 12:37:14 crc kubenswrapper[4975]: I0318 12:37:14.707670 4975 scope.go:117] "RemoveContainer" containerID="61a8acbc0949166032112fa538ec1505506948ae17b8bed11f1924cc8627bcda" Mar 18 12:37:14 crc kubenswrapper[4975]: E0318 12:37:14.708021 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61a8acbc0949166032112fa538ec1505506948ae17b8bed11f1924cc8627bcda\": container with ID starting with 61a8acbc0949166032112fa538ec1505506948ae17b8bed11f1924cc8627bcda not found: ID does not exist" containerID="61a8acbc0949166032112fa538ec1505506948ae17b8bed11f1924cc8627bcda" Mar 18 12:37:14 crc kubenswrapper[4975]: I0318 12:37:14.708057 4975 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61a8acbc0949166032112fa538ec1505506948ae17b8bed11f1924cc8627bcda"} err="failed to get container status \"61a8acbc0949166032112fa538ec1505506948ae17b8bed11f1924cc8627bcda\": rpc error: code = NotFound desc = could not find container \"61a8acbc0949166032112fa538ec1505506948ae17b8bed11f1924cc8627bcda\": container with ID starting with 61a8acbc0949166032112fa538ec1505506948ae17b8bed11f1924cc8627bcda not found: ID does not exist" Mar 18 12:37:15 crc kubenswrapper[4975]: I0318 12:37:15.035216 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbf23fd2-a15c-4793-8c35-e6a15826b510" path="/var/lib/kubelet/pods/dbf23fd2-a15c-4793-8c35-e6a15826b510/volumes" Mar 18 12:37:15 crc kubenswrapper[4975]: I0318 12:37:15.632387 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 12:37:15 crc kubenswrapper[4975]: I0318 12:37:15.632564 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 12:37:17 crc kubenswrapper[4975]: I0318 12:37:17.639835 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 12:37:17 crc kubenswrapper[4975]: I0318 12:37:17.642933 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 12:37:17 crc kubenswrapper[4975]: I0318 12:37:17.648326 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 12:37:17 crc kubenswrapper[4975]: I0318 12:37:17.732946 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 12:37:17 crc kubenswrapper[4975]: I0318 12:37:17.733366 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 12:37:18 crc kubenswrapper[4975]: I0318 12:37:18.604203 4975 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 12:37:19 crc kubenswrapper[4975]: I0318 12:37:19.016521 4975 scope.go:117] "RemoveContainer" containerID="27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" Mar 18 12:37:19 crc kubenswrapper[4975]: E0318 12:37:19.016797 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:37:19 crc kubenswrapper[4975]: I0318 12:37:19.739628 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 12:37:19 crc kubenswrapper[4975]: I0318 12:37:19.741946 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 12:37:19 crc kubenswrapper[4975]: I0318 12:37:19.749218 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 12:37:20 crc kubenswrapper[4975]: I0318 12:37:20.623686 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 12:37:29 crc kubenswrapper[4975]: I0318 12:37:29.556772 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 12:37:31 crc kubenswrapper[4975]: I0318 12:37:31.016808 4975 scope.go:117] "RemoveContainer" containerID="27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" Mar 18 12:37:31 crc kubenswrapper[4975]: E0318 12:37:31.017425 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:37:31 crc kubenswrapper[4975]: I0318 12:37:31.173602 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 12:37:33 crc kubenswrapper[4975]: I0318 12:37:33.884117 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="c5f80aec-91ca-4aec-91b0-8e26f87ef0c5" containerName="rabbitmq" containerID="cri-o://5e61d8a65d5b622b8755f020d3edfd8731deb2b2bcf19a4b6760afd8bc529b60" gracePeriod=604796 Mar 18 12:37:35 crc kubenswrapper[4975]: I0318 12:37:35.310662 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="a68f98b5-0226-4b20-a767-ead5e0af066e" containerName="rabbitmq" containerID="cri-o://5f6d0b95c66b9613d5726ca8079d04f6ce79946cd7d41024b803239fe09b32a1" gracePeriod=604796 Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.335239 4975 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="a68f98b5-0226-4b20-a767-ead5e0af066e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.522834 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.685082 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-erlang-cookie-secret\") pod \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.685465 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-rabbitmq-confd\") pod \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.685603 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.685706 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-server-conf\") pod \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.685826 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-rabbitmq-plugins\") pod \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.685974 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w8tw\" (UniqueName: 
\"kubernetes.io/projected/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-kube-api-access-7w8tw\") pod \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.686075 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-plugins-conf\") pod \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.686221 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-rabbitmq-erlang-cookie\") pod \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.686341 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-config-data\") pod \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.686443 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-pod-info\") pod \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.686543 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-rabbitmq-tls\") pod \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\" (UID: \"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5\") " Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 
12:37:40.686485 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c5f80aec-91ca-4aec-91b0-8e26f87ef0c5" (UID: "c5f80aec-91ca-4aec-91b0-8e26f87ef0c5"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.686748 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c5f80aec-91ca-4aec-91b0-8e26f87ef0c5" (UID: "c5f80aec-91ca-4aec-91b0-8e26f87ef0c5"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.686903 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c5f80aec-91ca-4aec-91b0-8e26f87ef0c5" (UID: "c5f80aec-91ca-4aec-91b0-8e26f87ef0c5"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.688303 4975 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.688336 4975 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.688350 4975 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.691830 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "c5f80aec-91ca-4aec-91b0-8e26f87ef0c5" (UID: "c5f80aec-91ca-4aec-91b0-8e26f87ef0c5"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.691884 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-pod-info" (OuterVolumeSpecName: "pod-info") pod "c5f80aec-91ca-4aec-91b0-8e26f87ef0c5" (UID: "c5f80aec-91ca-4aec-91b0-8e26f87ef0c5"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.691988 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c5f80aec-91ca-4aec-91b0-8e26f87ef0c5" (UID: "c5f80aec-91ca-4aec-91b0-8e26f87ef0c5"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.696070 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-kube-api-access-7w8tw" (OuterVolumeSpecName: "kube-api-access-7w8tw") pod "c5f80aec-91ca-4aec-91b0-8e26f87ef0c5" (UID: "c5f80aec-91ca-4aec-91b0-8e26f87ef0c5"). InnerVolumeSpecName "kube-api-access-7w8tw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.696129 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c5f80aec-91ca-4aec-91b0-8e26f87ef0c5" (UID: "c5f80aec-91ca-4aec-91b0-8e26f87ef0c5"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.715767 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-config-data" (OuterVolumeSpecName: "config-data") pod "c5f80aec-91ca-4aec-91b0-8e26f87ef0c5" (UID: "c5f80aec-91ca-4aec-91b0-8e26f87ef0c5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.747516 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-server-conf" (OuterVolumeSpecName: "server-conf") pod "c5f80aec-91ca-4aec-91b0-8e26f87ef0c5" (UID: "c5f80aec-91ca-4aec-91b0-8e26f87ef0c5"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.789508 4975 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.789537 4975 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.789558 4975 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.789568 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w8tw\" (UniqueName: \"kubernetes.io/projected/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-kube-api-access-7w8tw\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.789578 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.789588 4975 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.789600 4975 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.804626 4975 generic.go:334] "Generic (PLEG): container finished" podID="c5f80aec-91ca-4aec-91b0-8e26f87ef0c5" containerID="5e61d8a65d5b622b8755f020d3edfd8731deb2b2bcf19a4b6760afd8bc529b60" exitCode=0 Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.804673 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5","Type":"ContainerDied","Data":"5e61d8a65d5b622b8755f020d3edfd8731deb2b2bcf19a4b6760afd8bc529b60"} Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.804703 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c5f80aec-91ca-4aec-91b0-8e26f87ef0c5","Type":"ContainerDied","Data":"5e5f7b4ae1b56fb6fd09f9d532cbfe95e79fa5c50ac23f84a724403ed2e0d0fe"} Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.804719 4975 scope.go:117] "RemoveContainer" containerID="5e61d8a65d5b622b8755f020d3edfd8731deb2b2bcf19a4b6760afd8bc529b60" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.804877 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.811010 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c5f80aec-91ca-4aec-91b0-8e26f87ef0c5" (UID: "c5f80aec-91ca-4aec-91b0-8e26f87ef0c5"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.813215 4975 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.841849 4975 scope.go:117] "RemoveContainer" containerID="593d8e21a04405a96f7601c6d9e04bc7ca7920d74e754470dec6f28b650c85f9" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.864763 4975 scope.go:117] "RemoveContainer" containerID="5e61d8a65d5b622b8755f020d3edfd8731deb2b2bcf19a4b6760afd8bc529b60" Mar 18 12:37:40 crc kubenswrapper[4975]: E0318 12:37:40.865374 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e61d8a65d5b622b8755f020d3edfd8731deb2b2bcf19a4b6760afd8bc529b60\": container with ID starting with 5e61d8a65d5b622b8755f020d3edfd8731deb2b2bcf19a4b6760afd8bc529b60 not found: ID does not exist" containerID="5e61d8a65d5b622b8755f020d3edfd8731deb2b2bcf19a4b6760afd8bc529b60" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.865427 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e61d8a65d5b622b8755f020d3edfd8731deb2b2bcf19a4b6760afd8bc529b60"} err="failed to get container status \"5e61d8a65d5b622b8755f020d3edfd8731deb2b2bcf19a4b6760afd8bc529b60\": rpc error: code = NotFound desc = could not find container \"5e61d8a65d5b622b8755f020d3edfd8731deb2b2bcf19a4b6760afd8bc529b60\": container with ID starting with 5e61d8a65d5b622b8755f020d3edfd8731deb2b2bcf19a4b6760afd8bc529b60 not found: ID does not exist" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.865457 4975 scope.go:117] "RemoveContainer" containerID="593d8e21a04405a96f7601c6d9e04bc7ca7920d74e754470dec6f28b650c85f9" Mar 18 12:37:40 crc kubenswrapper[4975]: E0318 12:37:40.865781 4975 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"593d8e21a04405a96f7601c6d9e04bc7ca7920d74e754470dec6f28b650c85f9\": container with ID starting with 593d8e21a04405a96f7601c6d9e04bc7ca7920d74e754470dec6f28b650c85f9 not found: ID does not exist" containerID="593d8e21a04405a96f7601c6d9e04bc7ca7920d74e754470dec6f28b650c85f9" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.865808 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"593d8e21a04405a96f7601c6d9e04bc7ca7920d74e754470dec6f28b650c85f9"} err="failed to get container status \"593d8e21a04405a96f7601c6d9e04bc7ca7920d74e754470dec6f28b650c85f9\": rpc error: code = NotFound desc = could not find container \"593d8e21a04405a96f7601c6d9e04bc7ca7920d74e754470dec6f28b650c85f9\": container with ID starting with 593d8e21a04405a96f7601c6d9e04bc7ca7920d74e754470dec6f28b650c85f9 not found: ID does not exist" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.891267 4975 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:40 crc kubenswrapper[4975]: I0318 12:37:40.891300 4975 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.134667 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.147192 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.156793 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 12:37:41 crc kubenswrapper[4975]: E0318 12:37:41.157279 4975 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="dbf23fd2-a15c-4793-8c35-e6a15826b510" containerName="registry-server" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.159526 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf23fd2-a15c-4793-8c35-e6a15826b510" containerName="registry-server" Mar 18 12:37:41 crc kubenswrapper[4975]: E0318 12:37:41.159546 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5f80aec-91ca-4aec-91b0-8e26f87ef0c5" containerName="setup-container" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.159554 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f80aec-91ca-4aec-91b0-8e26f87ef0c5" containerName="setup-container" Mar 18 12:37:41 crc kubenswrapper[4975]: E0318 12:37:41.159575 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5f80aec-91ca-4aec-91b0-8e26f87ef0c5" containerName="rabbitmq" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.159582 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f80aec-91ca-4aec-91b0-8e26f87ef0c5" containerName="rabbitmq" Mar 18 12:37:41 crc kubenswrapper[4975]: E0318 12:37:41.159592 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf23fd2-a15c-4793-8c35-e6a15826b510" containerName="extract-utilities" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.159599 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf23fd2-a15c-4793-8c35-e6a15826b510" containerName="extract-utilities" Mar 18 12:37:41 crc kubenswrapper[4975]: E0318 12:37:41.159608 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf23fd2-a15c-4793-8c35-e6a15826b510" containerName="extract-content" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.159614 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf23fd2-a15c-4793-8c35-e6a15826b510" containerName="extract-content" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.159826 4975 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="dbf23fd2-a15c-4793-8c35-e6a15826b510" containerName="registry-server" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.159848 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5f80aec-91ca-4aec-91b0-8e26f87ef0c5" containerName="rabbitmq" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.160816 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.167606 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.167674 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.167608 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.168081 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.168401 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.168512 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.168634 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-lmpq8" Mar 18 12:37:41 crc kubenswrapper[4975]: E0318 12:37:41.180462 4975 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5f80aec_91ca_4aec_91b0_8e26f87ef0c5.slice/crio-5e5f7b4ae1b56fb6fd09f9d532cbfe95e79fa5c50ac23f84a724403ed2e0d0fe\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5f80aec_91ca_4aec_91b0_8e26f87ef0c5.slice\": RecentStats: unable to find data in memory cache]" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.192627 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.296330 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.296624 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.296658 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp8nh\" (UniqueName: \"kubernetes.io/projected/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-kube-api-access-qp8nh\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.296685 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.296712 4975 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.296740 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.296756 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.296800 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.296823 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-config-data\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.296838 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" 
(UniqueName: \"kubernetes.io/configmap/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.296857 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.398484 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.398534 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-config-data\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.398556 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.398579 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.398660 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.398676 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.398700 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp8nh\" (UniqueName: \"kubernetes.io/projected/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-kube-api-access-qp8nh\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.398723 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.398744 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.398770 4975 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.398786 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.399257 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.399522 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.399534 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-config-data\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.400080 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-rabbitmq-erlang-cookie\") 
pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.400428 4975 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.400688 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.403895 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.403940 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.411689 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.415196 4975 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.421657 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp8nh\" (UniqueName: \"kubernetes.io/projected/df91a8b9-ed19-4f64-9d3a-2c93bae6916a-kube-api-access-qp8nh\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.441386 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"df91a8b9-ed19-4f64-9d3a-2c93bae6916a\") " pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.515860 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.829369 4975 generic.go:334] "Generic (PLEG): container finished" podID="a68f98b5-0226-4b20-a767-ead5e0af066e" containerID="5f6d0b95c66b9613d5726ca8079d04f6ce79946cd7d41024b803239fe09b32a1" exitCode=0 Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.829468 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a68f98b5-0226-4b20-a767-ead5e0af066e","Type":"ContainerDied","Data":"5f6d0b95c66b9613d5726ca8079d04f6ce79946cd7d41024b803239fe09b32a1"} Mar 18 12:37:41 crc kubenswrapper[4975]: I0318 12:37:41.911914 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.011117 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a68f98b5-0226-4b20-a767-ead5e0af066e-server-conf\") pod \"a68f98b5-0226-4b20-a767-ead5e0af066e\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.011185 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a68f98b5-0226-4b20-a767-ead5e0af066e-erlang-cookie-secret\") pod \"a68f98b5-0226-4b20-a767-ead5e0af066e\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.011326 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"a68f98b5-0226-4b20-a767-ead5e0af066e\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.011354 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a68f98b5-0226-4b20-a767-ead5e0af066e-rabbitmq-tls\") pod \"a68f98b5-0226-4b20-a767-ead5e0af066e\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.011377 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a68f98b5-0226-4b20-a767-ead5e0af066e-plugins-conf\") pod \"a68f98b5-0226-4b20-a767-ead5e0af066e\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.011397 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/a68f98b5-0226-4b20-a767-ead5e0af066e-rabbitmq-plugins\") pod \"a68f98b5-0226-4b20-a767-ead5e0af066e\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.011471 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a68f98b5-0226-4b20-a767-ead5e0af066e-rabbitmq-confd\") pod \"a68f98b5-0226-4b20-a767-ead5e0af066e\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.011495 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjsnm\" (UniqueName: \"kubernetes.io/projected/a68f98b5-0226-4b20-a767-ead5e0af066e-kube-api-access-jjsnm\") pod \"a68f98b5-0226-4b20-a767-ead5e0af066e\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.011539 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a68f98b5-0226-4b20-a767-ead5e0af066e-rabbitmq-erlang-cookie\") pod \"a68f98b5-0226-4b20-a767-ead5e0af066e\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.011574 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a68f98b5-0226-4b20-a767-ead5e0af066e-pod-info\") pod \"a68f98b5-0226-4b20-a767-ead5e0af066e\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.011602 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a68f98b5-0226-4b20-a767-ead5e0af066e-config-data\") pod \"a68f98b5-0226-4b20-a767-ead5e0af066e\" (UID: \"a68f98b5-0226-4b20-a767-ead5e0af066e\") " Mar 18 12:37:42 crc 
kubenswrapper[4975]: I0318 12:37:42.012252 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a68f98b5-0226-4b20-a767-ead5e0af066e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a68f98b5-0226-4b20-a767-ead5e0af066e" (UID: "a68f98b5-0226-4b20-a767-ead5e0af066e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.012325 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a68f98b5-0226-4b20-a767-ead5e0af066e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a68f98b5-0226-4b20-a767-ead5e0af066e" (UID: "a68f98b5-0226-4b20-a767-ead5e0af066e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.012639 4975 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a68f98b5-0226-4b20-a767-ead5e0af066e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.012662 4975 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a68f98b5-0226-4b20-a767-ead5e0af066e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.012645 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a68f98b5-0226-4b20-a767-ead5e0af066e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a68f98b5-0226-4b20-a767-ead5e0af066e" (UID: "a68f98b5-0226-4b20-a767-ead5e0af066e"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.026665 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a68f98b5-0226-4b20-a767-ead5e0af066e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a68f98b5-0226-4b20-a767-ead5e0af066e" (UID: "a68f98b5-0226-4b20-a767-ead5e0af066e"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.029167 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a68f98b5-0226-4b20-a767-ead5e0af066e-kube-api-access-jjsnm" (OuterVolumeSpecName: "kube-api-access-jjsnm") pod "a68f98b5-0226-4b20-a767-ead5e0af066e" (UID: "a68f98b5-0226-4b20-a767-ead5e0af066e"). InnerVolumeSpecName "kube-api-access-jjsnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.029270 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a68f98b5-0226-4b20-a767-ead5e0af066e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a68f98b5-0226-4b20-a767-ead5e0af066e" (UID: "a68f98b5-0226-4b20-a767-ead5e0af066e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.034501 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a68f98b5-0226-4b20-a767-ead5e0af066e-pod-info" (OuterVolumeSpecName: "pod-info") pod "a68f98b5-0226-4b20-a767-ead5e0af066e" (UID: "a68f98b5-0226-4b20-a767-ead5e0af066e"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.038095 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "a68f98b5-0226-4b20-a767-ead5e0af066e" (UID: "a68f98b5-0226-4b20-a767-ead5e0af066e"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.057855 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a68f98b5-0226-4b20-a767-ead5e0af066e-config-data" (OuterVolumeSpecName: "config-data") pod "a68f98b5-0226-4b20-a767-ead5e0af066e" (UID: "a68f98b5-0226-4b20-a767-ead5e0af066e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.114481 4975 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.114532 4975 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a68f98b5-0226-4b20-a767-ead5e0af066e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.114546 4975 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a68f98b5-0226-4b20-a767-ead5e0af066e-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.114559 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjsnm\" (UniqueName: \"kubernetes.io/projected/a68f98b5-0226-4b20-a767-ead5e0af066e-kube-api-access-jjsnm\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:42 crc 
kubenswrapper[4975]: I0318 12:37:42.114571 4975 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a68f98b5-0226-4b20-a767-ead5e0af066e-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.114581 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a68f98b5-0226-4b20-a767-ead5e0af066e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.114594 4975 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a68f98b5-0226-4b20-a767-ead5e0af066e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.179023 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a68f98b5-0226-4b20-a767-ead5e0af066e-server-conf" (OuterVolumeSpecName: "server-conf") pod "a68f98b5-0226-4b20-a767-ead5e0af066e" (UID: "a68f98b5-0226-4b20-a767-ead5e0af066e"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.217632 4975 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a68f98b5-0226-4b20-a767-ead5e0af066e-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.239418 4975 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.256903 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.287426 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a68f98b5-0226-4b20-a767-ead5e0af066e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a68f98b5-0226-4b20-a767-ead5e0af066e" (UID: "a68f98b5-0226-4b20-a767-ead5e0af066e"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.319546 4975 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.319573 4975 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a68f98b5-0226-4b20-a767-ead5e0af066e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.839679 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a68f98b5-0226-4b20-a767-ead5e0af066e","Type":"ContainerDied","Data":"8ee5cab4f90f50e8148cb16309270981684ed5a8a7d49698c77cf7a6b2fc321a"} Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.840058 4975 scope.go:117] "RemoveContainer" containerID="5f6d0b95c66b9613d5726ca8079d04f6ce79946cd7d41024b803239fe09b32a1" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.839718 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.842382 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"df91a8b9-ed19-4f64-9d3a-2c93bae6916a","Type":"ContainerStarted","Data":"e0dfec74c0afec9a6d730725ba91ac4431b433a6382bc4e844d7dd9bb82292b6"} Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.861118 4975 scope.go:117] "RemoveContainer" containerID="f117753601c23acff04c84a90faed4c89539cd6b2706e3826b8124d9fc1ce0a0" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.875167 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.883458 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.904187 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 12:37:42 crc kubenswrapper[4975]: E0318 12:37:42.904566 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a68f98b5-0226-4b20-a767-ead5e0af066e" containerName="rabbitmq" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.904584 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="a68f98b5-0226-4b20-a767-ead5e0af066e" containerName="rabbitmq" Mar 18 12:37:42 crc kubenswrapper[4975]: E0318 12:37:42.904611 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a68f98b5-0226-4b20-a767-ead5e0af066e" containerName="setup-container" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.904617 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="a68f98b5-0226-4b20-a767-ead5e0af066e" containerName="setup-container" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.904789 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="a68f98b5-0226-4b20-a767-ead5e0af066e" containerName="rabbitmq" 
Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.905767 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.908731 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.909700 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.909954 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.910095 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.910272 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.910488 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.912314 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-hmhxg" Mar 18 12:37:42 crc kubenswrapper[4975]: I0318 12:37:42.933287 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.026888 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a68f98b5-0226-4b20-a767-ead5e0af066e" path="/var/lib/kubelet/pods/a68f98b5-0226-4b20-a767-ead5e0af066e/volumes" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.027668 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5f80aec-91ca-4aec-91b0-8e26f87ef0c5" 
path="/var/lib/kubelet/pods/c5f80aec-91ca-4aec-91b0-8e26f87ef0c5/volumes" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.031271 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3f543aea-ed0e-412b-8f30-bc585ce1793e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.031321 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69t6j\" (UniqueName: \"kubernetes.io/projected/3f543aea-ed0e-412b-8f30-bc585ce1793e-kube-api-access-69t6j\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.031361 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3f543aea-ed0e-412b-8f30-bc585ce1793e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.031408 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3f543aea-ed0e-412b-8f30-bc585ce1793e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.031450 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3f543aea-ed0e-412b-8f30-bc585ce1793e-rabbitmq-plugins\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.031507 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3f543aea-ed0e-412b-8f30-bc585ce1793e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.031534 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3f543aea-ed0e-412b-8f30-bc585ce1793e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.031624 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3f543aea-ed0e-412b-8f30-bc585ce1793e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.031692 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3f543aea-ed0e-412b-8f30-bc585ce1793e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.031732 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3f543aea-ed0e-412b-8f30-bc585ce1793e-server-conf\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.031768 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.133844 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3f543aea-ed0e-412b-8f30-bc585ce1793e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.133904 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3f543aea-ed0e-412b-8f30-bc585ce1793e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.133939 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3f543aea-ed0e-412b-8f30-bc585ce1793e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.133966 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 
12:37:43.133997 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3f543aea-ed0e-412b-8f30-bc585ce1793e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.134209 4975 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.134947 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3f543aea-ed0e-412b-8f30-bc585ce1793e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.134991 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69t6j\" (UniqueName: \"kubernetes.io/projected/3f543aea-ed0e-412b-8f30-bc585ce1793e-kube-api-access-69t6j\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.135022 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3f543aea-ed0e-412b-8f30-bc585ce1793e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.135063 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3f543aea-ed0e-412b-8f30-bc585ce1793e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.135679 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3f543aea-ed0e-412b-8f30-bc585ce1793e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.135734 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3f543aea-ed0e-412b-8f30-bc585ce1793e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.135760 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3f543aea-ed0e-412b-8f30-bc585ce1793e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.135622 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3f543aea-ed0e-412b-8f30-bc585ce1793e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.135224 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/3f543aea-ed0e-412b-8f30-bc585ce1793e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.136634 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3f543aea-ed0e-412b-8f30-bc585ce1793e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.135224 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3f543aea-ed0e-412b-8f30-bc585ce1793e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.164089 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3f543aea-ed0e-412b-8f30-bc585ce1793e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.164226 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3f543aea-ed0e-412b-8f30-bc585ce1793e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.164348 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69t6j\" (UniqueName: \"kubernetes.io/projected/3f543aea-ed0e-412b-8f30-bc585ce1793e-kube-api-access-69t6j\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.164628 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3f543aea-ed0e-412b-8f30-bc585ce1793e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.164688 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3f543aea-ed0e-412b-8f30-bc585ce1793e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.212719 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f543aea-ed0e-412b-8f30-bc585ce1793e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.225778 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.693851 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.853894 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3f543aea-ed0e-412b-8f30-bc585ce1793e","Type":"ContainerStarted","Data":"e707844316f6ad344ad70afe83bb422e850611e3722e18a76942dddd2e759473"} Mar 18 12:37:43 crc kubenswrapper[4975]: I0318 12:37:43.855188 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"df91a8b9-ed19-4f64-9d3a-2c93bae6916a","Type":"ContainerStarted","Data":"f98d88918cfa30eb8ca5c77f7ba6dc4f18cfe114cf565586a334def5a70dd18c"} Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.018018 4975 scope.go:117] "RemoveContainer" containerID="27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" Mar 18 12:37:44 crc kubenswrapper[4975]: E0318 12:37:44.018623 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.037502 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-sk68k"] Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.039333 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.041470 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.051069 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-sk68k"] Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.157737 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-sk68k\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.157812 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-sk68k\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.157834 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kjnk\" (UniqueName: \"kubernetes.io/projected/19617669-144c-4501-b1f2-16bafb19a3ae-kube-api-access-5kjnk\") pod \"dnsmasq-dns-79bd4cc8c9-sk68k\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.158196 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-config\") pod \"dnsmasq-dns-79bd4cc8c9-sk68k\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " 
pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.158293 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-sk68k\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.158370 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-sk68k\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.158470 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-sk68k\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.261573 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-config\") pod \"dnsmasq-dns-79bd4cc8c9-sk68k\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.261880 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-sk68k\" (UID: 
\"19617669-144c-4501-b1f2-16bafb19a3ae\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.262045 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-sk68k\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.262222 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-sk68k\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.262427 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-sk68k\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.262502 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kjnk\" (UniqueName: \"kubernetes.io/projected/19617669-144c-4501-b1f2-16bafb19a3ae-kube-api-access-5kjnk\") pod \"dnsmasq-dns-79bd4cc8c9-sk68k\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.262469 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-config\") pod \"dnsmasq-dns-79bd4cc8c9-sk68k\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " 
pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.262576 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-sk68k\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.262814 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-sk68k\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.262965 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-sk68k\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.263097 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-sk68k\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.263151 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-sk68k\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:37:44 crc 
kubenswrapper[4975]: I0318 12:37:44.263345 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-sk68k\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.277541 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kjnk\" (UniqueName: \"kubernetes.io/projected/19617669-144c-4501-b1f2-16bafb19a3ae-kube-api-access-5kjnk\") pod \"dnsmasq-dns-79bd4cc8c9-sk68k\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.355295 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:37:44 crc kubenswrapper[4975]: I0318 12:37:44.773162 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-sk68k"] Mar 18 12:37:44 crc kubenswrapper[4975]: W0318 12:37:44.867256 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19617669_144c_4501_b1f2_16bafb19a3ae.slice/crio-7fd50accf02fe09fac93a4d0d6e0f4f3caaee23f7efe69e302a6b266cc86dd59 WatchSource:0}: Error finding container 7fd50accf02fe09fac93a4d0d6e0f4f3caaee23f7efe69e302a6b266cc86dd59: Status 404 returned error can't find the container with id 7fd50accf02fe09fac93a4d0d6e0f4f3caaee23f7efe69e302a6b266cc86dd59 Mar 18 12:37:45 crc kubenswrapper[4975]: I0318 12:37:45.874744 4975 generic.go:334] "Generic (PLEG): container finished" podID="19617669-144c-4501-b1f2-16bafb19a3ae" containerID="94edc27d04bef47354175082c520d6efbfe7dd10d0bfb3f73061afbec16c4d69" exitCode=0 Mar 18 12:37:45 crc kubenswrapper[4975]: I0318 12:37:45.874887 4975 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" event={"ID":"19617669-144c-4501-b1f2-16bafb19a3ae","Type":"ContainerDied","Data":"94edc27d04bef47354175082c520d6efbfe7dd10d0bfb3f73061afbec16c4d69"} Mar 18 12:37:45 crc kubenswrapper[4975]: I0318 12:37:45.874924 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" event={"ID":"19617669-144c-4501-b1f2-16bafb19a3ae","Type":"ContainerStarted","Data":"7fd50accf02fe09fac93a4d0d6e0f4f3caaee23f7efe69e302a6b266cc86dd59"} Mar 18 12:37:45 crc kubenswrapper[4975]: I0318 12:37:45.878739 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3f543aea-ed0e-412b-8f30-bc585ce1793e","Type":"ContainerStarted","Data":"c73c14f59f0e0447743bffeb0ddf35d1e5d90d9a4ab7ab0b511a64a66b391a33"} Mar 18 12:37:46 crc kubenswrapper[4975]: I0318 12:37:46.892259 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" event={"ID":"19617669-144c-4501-b1f2-16bafb19a3ae","Type":"ContainerStarted","Data":"93280219e1c0d8cbaf991db317c5b608e09a8ce64958122bbf97f12592c75b93"} Mar 18 12:37:46 crc kubenswrapper[4975]: I0318 12:37:46.916735 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" podStartSLOduration=2.916713598 podStartE2EDuration="2.916713598s" podCreationTimestamp="2026-03-18 12:37:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:37:46.912550734 +0000 UTC m=+1652.626951323" watchObservedRunningTime="2026-03-18 12:37:46.916713598 +0000 UTC m=+1652.631114177" Mar 18 12:37:47 crc kubenswrapper[4975]: I0318 12:37:47.900906 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.357174 4975 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.456541 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-7vq7j"] Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.456816 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" podUID="0af5d44d-bf9c-4cd9-9775-30824349df84" containerName="dnsmasq-dns" containerID="cri-o://c174c09bca5c9841f1c0ab5d5a69891a1ad78d4d328a96c55bda38f79da84298" gracePeriod=10 Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.675156 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54ffdb7d8c-wrsq2"] Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.677663 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.711012 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54ffdb7d8c-wrsq2"] Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.860693 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79fdn\" (UniqueName: \"kubernetes.io/projected/09a27227-8777-4ddc-b4a0-ca2c7f8e66bf-kube-api-access-79fdn\") pod \"dnsmasq-dns-54ffdb7d8c-wrsq2\" (UID: \"09a27227-8777-4ddc-b4a0-ca2c7f8e66bf\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.860765 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09a27227-8777-4ddc-b4a0-ca2c7f8e66bf-dns-svc\") pod \"dnsmasq-dns-54ffdb7d8c-wrsq2\" (UID: \"09a27227-8777-4ddc-b4a0-ca2c7f8e66bf\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.860802 4975 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09a27227-8777-4ddc-b4a0-ca2c7f8e66bf-ovsdbserver-sb\") pod \"dnsmasq-dns-54ffdb7d8c-wrsq2\" (UID: \"09a27227-8777-4ddc-b4a0-ca2c7f8e66bf\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.860849 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09a27227-8777-4ddc-b4a0-ca2c7f8e66bf-config\") pod \"dnsmasq-dns-54ffdb7d8c-wrsq2\" (UID: \"09a27227-8777-4ddc-b4a0-ca2c7f8e66bf\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.860929 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/09a27227-8777-4ddc-b4a0-ca2c7f8e66bf-openstack-edpm-ipam\") pod \"dnsmasq-dns-54ffdb7d8c-wrsq2\" (UID: \"09a27227-8777-4ddc-b4a0-ca2c7f8e66bf\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.860979 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/09a27227-8777-4ddc-b4a0-ca2c7f8e66bf-dns-swift-storage-0\") pod \"dnsmasq-dns-54ffdb7d8c-wrsq2\" (UID: \"09a27227-8777-4ddc-b4a0-ca2c7f8e66bf\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.861038 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09a27227-8777-4ddc-b4a0-ca2c7f8e66bf-ovsdbserver-nb\") pod \"dnsmasq-dns-54ffdb7d8c-wrsq2\" (UID: \"09a27227-8777-4ddc-b4a0-ca2c7f8e66bf\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" Mar 18 12:37:54 crc 
kubenswrapper[4975]: I0318 12:37:54.963096 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/09a27227-8777-4ddc-b4a0-ca2c7f8e66bf-dns-swift-storage-0\") pod \"dnsmasq-dns-54ffdb7d8c-wrsq2\" (UID: \"09a27227-8777-4ddc-b4a0-ca2c7f8e66bf\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.963173 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09a27227-8777-4ddc-b4a0-ca2c7f8e66bf-ovsdbserver-nb\") pod \"dnsmasq-dns-54ffdb7d8c-wrsq2\" (UID: \"09a27227-8777-4ddc-b4a0-ca2c7f8e66bf\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.963228 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79fdn\" (UniqueName: \"kubernetes.io/projected/09a27227-8777-4ddc-b4a0-ca2c7f8e66bf-kube-api-access-79fdn\") pod \"dnsmasq-dns-54ffdb7d8c-wrsq2\" (UID: \"09a27227-8777-4ddc-b4a0-ca2c7f8e66bf\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.963257 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09a27227-8777-4ddc-b4a0-ca2c7f8e66bf-dns-svc\") pod \"dnsmasq-dns-54ffdb7d8c-wrsq2\" (UID: \"09a27227-8777-4ddc-b4a0-ca2c7f8e66bf\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.963278 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09a27227-8777-4ddc-b4a0-ca2c7f8e66bf-ovsdbserver-sb\") pod \"dnsmasq-dns-54ffdb7d8c-wrsq2\" (UID: \"09a27227-8777-4ddc-b4a0-ca2c7f8e66bf\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.963344 
4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09a27227-8777-4ddc-b4a0-ca2c7f8e66bf-config\") pod \"dnsmasq-dns-54ffdb7d8c-wrsq2\" (UID: \"09a27227-8777-4ddc-b4a0-ca2c7f8e66bf\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.963399 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/09a27227-8777-4ddc-b4a0-ca2c7f8e66bf-openstack-edpm-ipam\") pod \"dnsmasq-dns-54ffdb7d8c-wrsq2\" (UID: \"09a27227-8777-4ddc-b4a0-ca2c7f8e66bf\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.964336 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/09a27227-8777-4ddc-b4a0-ca2c7f8e66bf-dns-swift-storage-0\") pod \"dnsmasq-dns-54ffdb7d8c-wrsq2\" (UID: \"09a27227-8777-4ddc-b4a0-ca2c7f8e66bf\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.964489 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/09a27227-8777-4ddc-b4a0-ca2c7f8e66bf-openstack-edpm-ipam\") pod \"dnsmasq-dns-54ffdb7d8c-wrsq2\" (UID: \"09a27227-8777-4ddc-b4a0-ca2c7f8e66bf\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.964802 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/09a27227-8777-4ddc-b4a0-ca2c7f8e66bf-ovsdbserver-sb\") pod \"dnsmasq-dns-54ffdb7d8c-wrsq2\" (UID: \"09a27227-8777-4ddc-b4a0-ca2c7f8e66bf\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.965110 4975 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09a27227-8777-4ddc-b4a0-ca2c7f8e66bf-ovsdbserver-nb\") pod \"dnsmasq-dns-54ffdb7d8c-wrsq2\" (UID: \"09a27227-8777-4ddc-b4a0-ca2c7f8e66bf\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.965190 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09a27227-8777-4ddc-b4a0-ca2c7f8e66bf-config\") pod \"dnsmasq-dns-54ffdb7d8c-wrsq2\" (UID: \"09a27227-8777-4ddc-b4a0-ca2c7f8e66bf\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.965428 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09a27227-8777-4ddc-b4a0-ca2c7f8e66bf-dns-svc\") pod \"dnsmasq-dns-54ffdb7d8c-wrsq2\" (UID: \"09a27227-8777-4ddc-b4a0-ca2c7f8e66bf\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.967151 4975 generic.go:334] "Generic (PLEG): container finished" podID="0af5d44d-bf9c-4cd9-9775-30824349df84" containerID="c174c09bca5c9841f1c0ab5d5a69891a1ad78d4d328a96c55bda38f79da84298" exitCode=0 Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.967226 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" event={"ID":"0af5d44d-bf9c-4cd9-9775-30824349df84","Type":"ContainerDied","Data":"c174c09bca5c9841f1c0ab5d5a69891a1ad78d4d328a96c55bda38f79da84298"} Mar 18 12:37:54 crc kubenswrapper[4975]: I0318 12:37:54.987893 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79fdn\" (UniqueName: \"kubernetes.io/projected/09a27227-8777-4ddc-b4a0-ca2c7f8e66bf-kube-api-access-79fdn\") pod \"dnsmasq-dns-54ffdb7d8c-wrsq2\" (UID: \"09a27227-8777-4ddc-b4a0-ca2c7f8e66bf\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 
12:37:55.009791 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 12:37:55.138004 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 12:37:55.270288 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-ovsdbserver-sb\") pod \"0af5d44d-bf9c-4cd9-9775-30824349df84\" (UID: \"0af5d44d-bf9c-4cd9-9775-30824349df84\") " Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 12:37:55.270360 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-ovsdbserver-nb\") pod \"0af5d44d-bf9c-4cd9-9775-30824349df84\" (UID: \"0af5d44d-bf9c-4cd9-9775-30824349df84\") " Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 12:37:55.270447 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-dns-svc\") pod \"0af5d44d-bf9c-4cd9-9775-30824349df84\" (UID: \"0af5d44d-bf9c-4cd9-9775-30824349df84\") " Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 12:37:55.270539 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-dns-swift-storage-0\") pod \"0af5d44d-bf9c-4cd9-9775-30824349df84\" (UID: \"0af5d44d-bf9c-4cd9-9775-30824349df84\") " Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 12:37:55.270564 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-config\") pod 
\"0af5d44d-bf9c-4cd9-9775-30824349df84\" (UID: \"0af5d44d-bf9c-4cd9-9775-30824349df84\") " Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 12:37:55.270607 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfsj6\" (UniqueName: \"kubernetes.io/projected/0af5d44d-bf9c-4cd9-9775-30824349df84-kube-api-access-kfsj6\") pod \"0af5d44d-bf9c-4cd9-9775-30824349df84\" (UID: \"0af5d44d-bf9c-4cd9-9775-30824349df84\") " Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 12:37:55.275690 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0af5d44d-bf9c-4cd9-9775-30824349df84-kube-api-access-kfsj6" (OuterVolumeSpecName: "kube-api-access-kfsj6") pod "0af5d44d-bf9c-4cd9-9775-30824349df84" (UID: "0af5d44d-bf9c-4cd9-9775-30824349df84"). InnerVolumeSpecName "kube-api-access-kfsj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 12:37:55.334443 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0af5d44d-bf9c-4cd9-9775-30824349df84" (UID: "0af5d44d-bf9c-4cd9-9775-30824349df84"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 12:37:55.334510 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0af5d44d-bf9c-4cd9-9775-30824349df84" (UID: "0af5d44d-bf9c-4cd9-9775-30824349df84"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 12:37:55.337048 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0af5d44d-bf9c-4cd9-9775-30824349df84" (UID: "0af5d44d-bf9c-4cd9-9775-30824349df84"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 12:37:55.346932 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-config" (OuterVolumeSpecName: "config") pod "0af5d44d-bf9c-4cd9-9775-30824349df84" (UID: "0af5d44d-bf9c-4cd9-9775-30824349df84"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 12:37:55.359424 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0af5d44d-bf9c-4cd9-9775-30824349df84" (UID: "0af5d44d-bf9c-4cd9-9775-30824349df84"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 12:37:55.373501 4975 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 12:37:55.373541 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 12:37:55.373554 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfsj6\" (UniqueName: \"kubernetes.io/projected/0af5d44d-bf9c-4cd9-9775-30824349df84-kube-api-access-kfsj6\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 12:37:55.373567 4975 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 12:37:55.373576 4975 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 12:37:55.373584 4975 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0af5d44d-bf9c-4cd9-9775-30824349df84-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 12:37:55 crc kubenswrapper[4975]: W0318 12:37:55.509314 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09a27227_8777_4ddc_b4a0_ca2c7f8e66bf.slice/crio-b152bdd601e6b6bea87b3a4cb2b704d09cef7be97a657bd567ccdc04eb0a7e9c WatchSource:0}: Error finding 
container b152bdd601e6b6bea87b3a4cb2b704d09cef7be97a657bd567ccdc04eb0a7e9c: Status 404 returned error can't find the container with id b152bdd601e6b6bea87b3a4cb2b704d09cef7be97a657bd567ccdc04eb0a7e9c Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 12:37:55.511906 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54ffdb7d8c-wrsq2"] Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 12:37:55.978852 4975 generic.go:334] "Generic (PLEG): container finished" podID="09a27227-8777-4ddc-b4a0-ca2c7f8e66bf" containerID="05f8cb55e6d1ab3cd69220f3e06001208ea95864ee9ed2943833d592bf8eb0f6" exitCode=0 Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 12:37:55.979015 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" event={"ID":"09a27227-8777-4ddc-b4a0-ca2c7f8e66bf","Type":"ContainerDied","Data":"05f8cb55e6d1ab3cd69220f3e06001208ea95864ee9ed2943833d592bf8eb0f6"} Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 12:37:55.979055 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" event={"ID":"09a27227-8777-4ddc-b4a0-ca2c7f8e66bf","Type":"ContainerStarted","Data":"b152bdd601e6b6bea87b3a4cb2b704d09cef7be97a657bd567ccdc04eb0a7e9c"} Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 12:37:55.982844 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" event={"ID":"0af5d44d-bf9c-4cd9-9775-30824349df84","Type":"ContainerDied","Data":"3b41fc5d0b9148d807e032790510ce256fac9fc7b56df3762933da034e5b3a1f"} Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 12:37:55.982996 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-7vq7j" Mar 18 12:37:55 crc kubenswrapper[4975]: I0318 12:37:55.983007 4975 scope.go:117] "RemoveContainer" containerID="c174c09bca5c9841f1c0ab5d5a69891a1ad78d4d328a96c55bda38f79da84298" Mar 18 12:37:56 crc kubenswrapper[4975]: I0318 12:37:56.149637 4975 scope.go:117] "RemoveContainer" containerID="8535b4a54982e784365789172f26aec03dcd25215182e6f20edf061f0bb417d6" Mar 18 12:37:56 crc kubenswrapper[4975]: I0318 12:37:56.178267 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-7vq7j"] Mar 18 12:37:56 crc kubenswrapper[4975]: I0318 12:37:56.191052 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-7vq7j"] Mar 18 12:37:56 crc kubenswrapper[4975]: I0318 12:37:56.995678 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" event={"ID":"09a27227-8777-4ddc-b4a0-ca2c7f8e66bf","Type":"ContainerStarted","Data":"149025c00513c5be1e8a86195f27023064a158414945afb1b81990fe338ef810"} Mar 18 12:37:56 crc kubenswrapper[4975]: I0318 12:37:56.995905 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" Mar 18 12:37:57 crc kubenswrapper[4975]: I0318 12:37:57.018555 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" podStartSLOduration=3.018535273 podStartE2EDuration="3.018535273s" podCreationTimestamp="2026-03-18 12:37:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:37:57.01442625 +0000 UTC m=+1662.728826879" watchObservedRunningTime="2026-03-18 12:37:57.018535273 +0000 UTC m=+1662.732935852" Mar 18 12:37:57 crc kubenswrapper[4975]: I0318 12:37:57.029734 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0af5d44d-bf9c-4cd9-9775-30824349df84" 
path="/var/lib/kubelet/pods/0af5d44d-bf9c-4cd9-9775-30824349df84/volumes" Mar 18 12:37:59 crc kubenswrapper[4975]: I0318 12:37:59.016193 4975 scope.go:117] "RemoveContainer" containerID="27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" Mar 18 12:37:59 crc kubenswrapper[4975]: E0318 12:37:59.017603 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:38:00 crc kubenswrapper[4975]: I0318 12:38:00.147440 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563958-jnnzc"] Mar 18 12:38:00 crc kubenswrapper[4975]: E0318 12:38:00.147839 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0af5d44d-bf9c-4cd9-9775-30824349df84" containerName="dnsmasq-dns" Mar 18 12:38:00 crc kubenswrapper[4975]: I0318 12:38:00.147851 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="0af5d44d-bf9c-4cd9-9775-30824349df84" containerName="dnsmasq-dns" Mar 18 12:38:00 crc kubenswrapper[4975]: E0318 12:38:00.147886 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0af5d44d-bf9c-4cd9-9775-30824349df84" containerName="init" Mar 18 12:38:00 crc kubenswrapper[4975]: I0318 12:38:00.147892 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="0af5d44d-bf9c-4cd9-9775-30824349df84" containerName="init" Mar 18 12:38:00 crc kubenswrapper[4975]: I0318 12:38:00.148113 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="0af5d44d-bf9c-4cd9-9775-30824349df84" containerName="dnsmasq-dns" Mar 18 12:38:00 crc kubenswrapper[4975]: I0318 12:38:00.148695 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563958-jnnzc" Mar 18 12:38:00 crc kubenswrapper[4975]: I0318 12:38:00.159998 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 12:38:00 crc kubenswrapper[4975]: I0318 12:38:00.162165 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:38:00 crc kubenswrapper[4975]: I0318 12:38:00.164185 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:38:00 crc kubenswrapper[4975]: I0318 12:38:00.171212 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563958-jnnzc"] Mar 18 12:38:00 crc kubenswrapper[4975]: I0318 12:38:00.190032 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jd8b\" (UniqueName: \"kubernetes.io/projected/3711f736-ca1c-46c8-950a-4472f3dbc6b9-kube-api-access-5jd8b\") pod \"auto-csr-approver-29563958-jnnzc\" (UID: \"3711f736-ca1c-46c8-950a-4472f3dbc6b9\") " pod="openshift-infra/auto-csr-approver-29563958-jnnzc" Mar 18 12:38:00 crc kubenswrapper[4975]: I0318 12:38:00.291912 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jd8b\" (UniqueName: \"kubernetes.io/projected/3711f736-ca1c-46c8-950a-4472f3dbc6b9-kube-api-access-5jd8b\") pod \"auto-csr-approver-29563958-jnnzc\" (UID: \"3711f736-ca1c-46c8-950a-4472f3dbc6b9\") " pod="openshift-infra/auto-csr-approver-29563958-jnnzc" Mar 18 12:38:00 crc kubenswrapper[4975]: I0318 12:38:00.313025 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jd8b\" (UniqueName: \"kubernetes.io/projected/3711f736-ca1c-46c8-950a-4472f3dbc6b9-kube-api-access-5jd8b\") pod \"auto-csr-approver-29563958-jnnzc\" (UID: \"3711f736-ca1c-46c8-950a-4472f3dbc6b9\") " 
pod="openshift-infra/auto-csr-approver-29563958-jnnzc" Mar 18 12:38:00 crc kubenswrapper[4975]: I0318 12:38:00.480684 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563958-jnnzc" Mar 18 12:38:00 crc kubenswrapper[4975]: I0318 12:38:00.956699 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563958-jnnzc"] Mar 18 12:38:01 crc kubenswrapper[4975]: I0318 12:38:01.033607 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563958-jnnzc" event={"ID":"3711f736-ca1c-46c8-950a-4472f3dbc6b9","Type":"ContainerStarted","Data":"2fd84a1e981cb0b26aa493e2fc545ca7cc626eec4b4e598b13c6a86d62e05362"} Mar 18 12:38:03 crc kubenswrapper[4975]: I0318 12:38:03.052429 4975 generic.go:334] "Generic (PLEG): container finished" podID="3711f736-ca1c-46c8-950a-4472f3dbc6b9" containerID="3188344a6de3445fb78743c98a82e24f09976897da649a4069a44cbccedaec81" exitCode=0 Mar 18 12:38:03 crc kubenswrapper[4975]: I0318 12:38:03.052577 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563958-jnnzc" event={"ID":"3711f736-ca1c-46c8-950a-4472f3dbc6b9","Type":"ContainerDied","Data":"3188344a6de3445fb78743c98a82e24f09976897da649a4069a44cbccedaec81"} Mar 18 12:38:04 crc kubenswrapper[4975]: I0318 12:38:04.423151 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563958-jnnzc" Mar 18 12:38:04 crc kubenswrapper[4975]: I0318 12:38:04.574095 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jd8b\" (UniqueName: \"kubernetes.io/projected/3711f736-ca1c-46c8-950a-4472f3dbc6b9-kube-api-access-5jd8b\") pod \"3711f736-ca1c-46c8-950a-4472f3dbc6b9\" (UID: \"3711f736-ca1c-46c8-950a-4472f3dbc6b9\") " Mar 18 12:38:04 crc kubenswrapper[4975]: I0318 12:38:04.582829 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3711f736-ca1c-46c8-950a-4472f3dbc6b9-kube-api-access-5jd8b" (OuterVolumeSpecName: "kube-api-access-5jd8b") pod "3711f736-ca1c-46c8-950a-4472f3dbc6b9" (UID: "3711f736-ca1c-46c8-950a-4472f3dbc6b9"). InnerVolumeSpecName "kube-api-access-5jd8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:38:04 crc kubenswrapper[4975]: I0318 12:38:04.676898 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jd8b\" (UniqueName: \"kubernetes.io/projected/3711f736-ca1c-46c8-950a-4472f3dbc6b9-kube-api-access-5jd8b\") on node \"crc\" DevicePath \"\"" Mar 18 12:38:05 crc kubenswrapper[4975]: I0318 12:38:05.011036 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54ffdb7d8c-wrsq2" Mar 18 12:38:05 crc kubenswrapper[4975]: I0318 12:38:05.074018 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-sk68k"] Mar 18 12:38:05 crc kubenswrapper[4975]: I0318 12:38:05.074285 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" podUID="19617669-144c-4501-b1f2-16bafb19a3ae" containerName="dnsmasq-dns" containerID="cri-o://93280219e1c0d8cbaf991db317c5b608e09a8ce64958122bbf97f12592c75b93" gracePeriod=10 Mar 18 12:38:05 crc kubenswrapper[4975]: I0318 12:38:05.116661 4975 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563958-jnnzc" event={"ID":"3711f736-ca1c-46c8-950a-4472f3dbc6b9","Type":"ContainerDied","Data":"2fd84a1e981cb0b26aa493e2fc545ca7cc626eec4b4e598b13c6a86d62e05362"} Mar 18 12:38:05 crc kubenswrapper[4975]: I0318 12:38:05.116716 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fd84a1e981cb0b26aa493e2fc545ca7cc626eec4b4e598b13c6a86d62e05362" Mar 18 12:38:05 crc kubenswrapper[4975]: I0318 12:38:05.116781 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563958-jnnzc" Mar 18 12:38:05 crc kubenswrapper[4975]: I0318 12:38:05.502884 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563952-vwzs4"] Mar 18 12:38:05 crc kubenswrapper[4975]: I0318 12:38:05.511653 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563952-vwzs4"] Mar 18 12:38:05 crc kubenswrapper[4975]: I0318 12:38:05.602529 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:38:05 crc kubenswrapper[4975]: I0318 12:38:05.797486 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kjnk\" (UniqueName: \"kubernetes.io/projected/19617669-144c-4501-b1f2-16bafb19a3ae-kube-api-access-5kjnk\") pod \"19617669-144c-4501-b1f2-16bafb19a3ae\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " Mar 18 12:38:05 crc kubenswrapper[4975]: I0318 12:38:05.797662 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-ovsdbserver-sb\") pod \"19617669-144c-4501-b1f2-16bafb19a3ae\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " Mar 18 12:38:05 crc kubenswrapper[4975]: I0318 12:38:05.797716 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-dns-swift-storage-0\") pod \"19617669-144c-4501-b1f2-16bafb19a3ae\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " Mar 18 12:38:05 crc kubenswrapper[4975]: I0318 12:38:05.797795 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-ovsdbserver-nb\") pod \"19617669-144c-4501-b1f2-16bafb19a3ae\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " Mar 18 12:38:05 crc kubenswrapper[4975]: I0318 12:38:05.797952 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-config\") pod \"19617669-144c-4501-b1f2-16bafb19a3ae\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " Mar 18 12:38:05 crc kubenswrapper[4975]: I0318 12:38:05.797985 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-dns-svc\") pod \"19617669-144c-4501-b1f2-16bafb19a3ae\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " Mar 18 12:38:05 crc kubenswrapper[4975]: I0318 12:38:05.798036 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-openstack-edpm-ipam\") pod \"19617669-144c-4501-b1f2-16bafb19a3ae\" (UID: \"19617669-144c-4501-b1f2-16bafb19a3ae\") " Mar 18 12:38:05 crc kubenswrapper[4975]: I0318 12:38:05.804153 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19617669-144c-4501-b1f2-16bafb19a3ae-kube-api-access-5kjnk" (OuterVolumeSpecName: "kube-api-access-5kjnk") pod "19617669-144c-4501-b1f2-16bafb19a3ae" (UID: "19617669-144c-4501-b1f2-16bafb19a3ae"). InnerVolumeSpecName "kube-api-access-5kjnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:38:05 crc kubenswrapper[4975]: I0318 12:38:05.900078 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kjnk\" (UniqueName: \"kubernetes.io/projected/19617669-144c-4501-b1f2-16bafb19a3ae-kube-api-access-5kjnk\") on node \"crc\" DevicePath \"\"" Mar 18 12:38:05 crc kubenswrapper[4975]: I0318 12:38:05.982465 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "19617669-144c-4501-b1f2-16bafb19a3ae" (UID: "19617669-144c-4501-b1f2-16bafb19a3ae"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:38:05 crc kubenswrapper[4975]: I0318 12:38:05.984100 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-config" (OuterVolumeSpecName: "config") pod "19617669-144c-4501-b1f2-16bafb19a3ae" (UID: "19617669-144c-4501-b1f2-16bafb19a3ae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:38:06 crc kubenswrapper[4975]: I0318 12:38:06.001984 4975 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-config\") on node \"crc\" DevicePath \"\"" Mar 18 12:38:06 crc kubenswrapper[4975]: I0318 12:38:06.002012 4975 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:38:06 crc kubenswrapper[4975]: I0318 12:38:06.007589 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "19617669-144c-4501-b1f2-16bafb19a3ae" (UID: "19617669-144c-4501-b1f2-16bafb19a3ae"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:38:06 crc kubenswrapper[4975]: I0318 12:38:06.014639 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "19617669-144c-4501-b1f2-16bafb19a3ae" (UID: "19617669-144c-4501-b1f2-16bafb19a3ae"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:38:06 crc kubenswrapper[4975]: I0318 12:38:06.022475 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "19617669-144c-4501-b1f2-16bafb19a3ae" (UID: "19617669-144c-4501-b1f2-16bafb19a3ae"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:38:06 crc kubenswrapper[4975]: I0318 12:38:06.025062 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "19617669-144c-4501-b1f2-16bafb19a3ae" (UID: "19617669-144c-4501-b1f2-16bafb19a3ae"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:38:06 crc kubenswrapper[4975]: I0318 12:38:06.103255 4975 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 12:38:06 crc kubenswrapper[4975]: I0318 12:38:06.103281 4975 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:38:06 crc kubenswrapper[4975]: I0318 12:38:06.103290 4975 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 12:38:06 crc kubenswrapper[4975]: I0318 12:38:06.103300 4975 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19617669-144c-4501-b1f2-16bafb19a3ae-dns-svc\") on node 
\"crc\" DevicePath \"\"" Mar 18 12:38:06 crc kubenswrapper[4975]: I0318 12:38:06.132030 4975 generic.go:334] "Generic (PLEG): container finished" podID="19617669-144c-4501-b1f2-16bafb19a3ae" containerID="93280219e1c0d8cbaf991db317c5b608e09a8ce64958122bbf97f12592c75b93" exitCode=0 Mar 18 12:38:06 crc kubenswrapper[4975]: I0318 12:38:06.132086 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" event={"ID":"19617669-144c-4501-b1f2-16bafb19a3ae","Type":"ContainerDied","Data":"93280219e1c0d8cbaf991db317c5b608e09a8ce64958122bbf97f12592c75b93"} Mar 18 12:38:06 crc kubenswrapper[4975]: I0318 12:38:06.132169 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" event={"ID":"19617669-144c-4501-b1f2-16bafb19a3ae","Type":"ContainerDied","Data":"7fd50accf02fe09fac93a4d0d6e0f4f3caaee23f7efe69e302a6b266cc86dd59"} Mar 18 12:38:06 crc kubenswrapper[4975]: I0318 12:38:06.132193 4975 scope.go:117] "RemoveContainer" containerID="93280219e1c0d8cbaf991db317c5b608e09a8ce64958122bbf97f12592c75b93" Mar 18 12:38:06 crc kubenswrapper[4975]: I0318 12:38:06.132115 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-sk68k" Mar 18 12:38:06 crc kubenswrapper[4975]: I0318 12:38:06.169457 4975 scope.go:117] "RemoveContainer" containerID="94edc27d04bef47354175082c520d6efbfe7dd10d0bfb3f73061afbec16c4d69" Mar 18 12:38:06 crc kubenswrapper[4975]: I0318 12:38:06.174923 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-sk68k"] Mar 18 12:38:06 crc kubenswrapper[4975]: I0318 12:38:06.191320 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-sk68k"] Mar 18 12:38:06 crc kubenswrapper[4975]: I0318 12:38:06.202617 4975 scope.go:117] "RemoveContainer" containerID="93280219e1c0d8cbaf991db317c5b608e09a8ce64958122bbf97f12592c75b93" Mar 18 12:38:06 crc kubenswrapper[4975]: E0318 12:38:06.203518 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93280219e1c0d8cbaf991db317c5b608e09a8ce64958122bbf97f12592c75b93\": container with ID starting with 93280219e1c0d8cbaf991db317c5b608e09a8ce64958122bbf97f12592c75b93 not found: ID does not exist" containerID="93280219e1c0d8cbaf991db317c5b608e09a8ce64958122bbf97f12592c75b93" Mar 18 12:38:06 crc kubenswrapper[4975]: I0318 12:38:06.203672 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93280219e1c0d8cbaf991db317c5b608e09a8ce64958122bbf97f12592c75b93"} err="failed to get container status \"93280219e1c0d8cbaf991db317c5b608e09a8ce64958122bbf97f12592c75b93\": rpc error: code = NotFound desc = could not find container \"93280219e1c0d8cbaf991db317c5b608e09a8ce64958122bbf97f12592c75b93\": container with ID starting with 93280219e1c0d8cbaf991db317c5b608e09a8ce64958122bbf97f12592c75b93 not found: ID does not exist" Mar 18 12:38:06 crc kubenswrapper[4975]: I0318 12:38:06.203735 4975 scope.go:117] "RemoveContainer" containerID="94edc27d04bef47354175082c520d6efbfe7dd10d0bfb3f73061afbec16c4d69" Mar 18 
12:38:06 crc kubenswrapper[4975]: E0318 12:38:06.204160 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94edc27d04bef47354175082c520d6efbfe7dd10d0bfb3f73061afbec16c4d69\": container with ID starting with 94edc27d04bef47354175082c520d6efbfe7dd10d0bfb3f73061afbec16c4d69 not found: ID does not exist" containerID="94edc27d04bef47354175082c520d6efbfe7dd10d0bfb3f73061afbec16c4d69" Mar 18 12:38:06 crc kubenswrapper[4975]: I0318 12:38:06.204341 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94edc27d04bef47354175082c520d6efbfe7dd10d0bfb3f73061afbec16c4d69"} err="failed to get container status \"94edc27d04bef47354175082c520d6efbfe7dd10d0bfb3f73061afbec16c4d69\": rpc error: code = NotFound desc = could not find container \"94edc27d04bef47354175082c520d6efbfe7dd10d0bfb3f73061afbec16c4d69\": container with ID starting with 94edc27d04bef47354175082c520d6efbfe7dd10d0bfb3f73061afbec16c4d69 not found: ID does not exist" Mar 18 12:38:07 crc kubenswrapper[4975]: I0318 12:38:07.040049 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19617669-144c-4501-b1f2-16bafb19a3ae" path="/var/lib/kubelet/pods/19617669-144c-4501-b1f2-16bafb19a3ae/volumes" Mar 18 12:38:07 crc kubenswrapper[4975]: I0318 12:38:07.042652 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23a3a6b1-d7fd-4d05-a075-19aa9585b87d" path="/var/lib/kubelet/pods/23a3a6b1-d7fd-4d05-a075-19aa9585b87d/volumes" Mar 18 12:38:13 crc kubenswrapper[4975]: I0318 12:38:13.016920 4975 scope.go:117] "RemoveContainer" containerID="27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" Mar 18 12:38:13 crc kubenswrapper[4975]: E0318 12:38:13.017664 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:38:16 crc kubenswrapper[4975]: I0318 12:38:16.230325 4975 generic.go:334] "Generic (PLEG): container finished" podID="df91a8b9-ed19-4f64-9d3a-2c93bae6916a" containerID="f98d88918cfa30eb8ca5c77f7ba6dc4f18cfe114cf565586a334def5a70dd18c" exitCode=0 Mar 18 12:38:16 crc kubenswrapper[4975]: I0318 12:38:16.230429 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"df91a8b9-ed19-4f64-9d3a-2c93bae6916a","Type":"ContainerDied","Data":"f98d88918cfa30eb8ca5c77f7ba6dc4f18cfe114cf565586a334def5a70dd18c"} Mar 18 12:38:17 crc kubenswrapper[4975]: I0318 12:38:17.242067 4975 generic.go:334] "Generic (PLEG): container finished" podID="3f543aea-ed0e-412b-8f30-bc585ce1793e" containerID="c73c14f59f0e0447743bffeb0ddf35d1e5d90d9a4ab7ab0b511a64a66b391a33" exitCode=0 Mar 18 12:38:17 crc kubenswrapper[4975]: I0318 12:38:17.242158 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3f543aea-ed0e-412b-8f30-bc585ce1793e","Type":"ContainerDied","Data":"c73c14f59f0e0447743bffeb0ddf35d1e5d90d9a4ab7ab0b511a64a66b391a33"} Mar 18 12:38:17 crc kubenswrapper[4975]: I0318 12:38:17.246531 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"df91a8b9-ed19-4f64-9d3a-2c93bae6916a","Type":"ContainerStarted","Data":"74df327bf2082f6e30db8b47e26e71146c68d7b0a79076ba0ba885c9c63809bf"} Mar 18 12:38:17 crc kubenswrapper[4975]: I0318 12:38:17.246917 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 12:38:17 crc kubenswrapper[4975]: I0318 12:38:17.297571 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/rabbitmq-server-0" podStartSLOduration=36.297530142 podStartE2EDuration="36.297530142s" podCreationTimestamp="2026-03-18 12:37:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:38:17.29340493 +0000 UTC m=+1683.007805519" watchObservedRunningTime="2026-03-18 12:38:17.297530142 +0000 UTC m=+1683.011930721" Mar 18 12:38:18 crc kubenswrapper[4975]: I0318 12:38:18.257697 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3f543aea-ed0e-412b-8f30-bc585ce1793e","Type":"ContainerStarted","Data":"4ce1b1eb020e34dbab64d4e6d6d277d133a30e8c31ceb64439d017771f64f80b"} Mar 18 12:38:18 crc kubenswrapper[4975]: I0318 12:38:18.258339 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:38:22 crc kubenswrapper[4975]: I0318 12:38:22.872494 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.872475797999996 podStartE2EDuration="40.872475798s" podCreationTimestamp="2026-03-18 12:37:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 12:38:18.28967668 +0000 UTC m=+1684.004077269" watchObservedRunningTime="2026-03-18 12:38:22.872475798 +0000 UTC m=+1688.586876377" Mar 18 12:38:22 crc kubenswrapper[4975]: I0318 12:38:22.880751 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h9c7v"] Mar 18 12:38:22 crc kubenswrapper[4975]: E0318 12:38:22.881213 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3711f736-ca1c-46c8-950a-4472f3dbc6b9" containerName="oc" Mar 18 12:38:22 crc kubenswrapper[4975]: I0318 12:38:22.881231 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="3711f736-ca1c-46c8-950a-4472f3dbc6b9" 
containerName="oc" Mar 18 12:38:22 crc kubenswrapper[4975]: E0318 12:38:22.881262 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19617669-144c-4501-b1f2-16bafb19a3ae" containerName="init" Mar 18 12:38:22 crc kubenswrapper[4975]: I0318 12:38:22.881270 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="19617669-144c-4501-b1f2-16bafb19a3ae" containerName="init" Mar 18 12:38:22 crc kubenswrapper[4975]: E0318 12:38:22.881281 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19617669-144c-4501-b1f2-16bafb19a3ae" containerName="dnsmasq-dns" Mar 18 12:38:22 crc kubenswrapper[4975]: I0318 12:38:22.881287 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="19617669-144c-4501-b1f2-16bafb19a3ae" containerName="dnsmasq-dns" Mar 18 12:38:22 crc kubenswrapper[4975]: I0318 12:38:22.881471 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="3711f736-ca1c-46c8-950a-4472f3dbc6b9" containerName="oc" Mar 18 12:38:22 crc kubenswrapper[4975]: I0318 12:38:22.881486 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="19617669-144c-4501-b1f2-16bafb19a3ae" containerName="dnsmasq-dns" Mar 18 12:38:22 crc kubenswrapper[4975]: I0318 12:38:22.882977 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h9c7v" Mar 18 12:38:22 crc kubenswrapper[4975]: I0318 12:38:22.905217 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h9c7v"] Mar 18 12:38:23 crc kubenswrapper[4975]: I0318 12:38:23.053386 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/386cf621-0ac9-43ef-bacc-c00c1b276e63-catalog-content\") pod \"redhat-marketplace-h9c7v\" (UID: \"386cf621-0ac9-43ef-bacc-c00c1b276e63\") " pod="openshift-marketplace/redhat-marketplace-h9c7v" Mar 18 12:38:23 crc kubenswrapper[4975]: I0318 12:38:23.053718 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/386cf621-0ac9-43ef-bacc-c00c1b276e63-utilities\") pod \"redhat-marketplace-h9c7v\" (UID: \"386cf621-0ac9-43ef-bacc-c00c1b276e63\") " pod="openshift-marketplace/redhat-marketplace-h9c7v" Mar 18 12:38:23 crc kubenswrapper[4975]: I0318 12:38:23.053928 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsfm4\" (UniqueName: \"kubernetes.io/projected/386cf621-0ac9-43ef-bacc-c00c1b276e63-kube-api-access-dsfm4\") pod \"redhat-marketplace-h9c7v\" (UID: \"386cf621-0ac9-43ef-bacc-c00c1b276e63\") " pod="openshift-marketplace/redhat-marketplace-h9c7v" Mar 18 12:38:23 crc kubenswrapper[4975]: I0318 12:38:23.155742 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/386cf621-0ac9-43ef-bacc-c00c1b276e63-utilities\") pod \"redhat-marketplace-h9c7v\" (UID: \"386cf621-0ac9-43ef-bacc-c00c1b276e63\") " pod="openshift-marketplace/redhat-marketplace-h9c7v" Mar 18 12:38:23 crc kubenswrapper[4975]: I0318 12:38:23.155878 4975 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-dsfm4\" (UniqueName: \"kubernetes.io/projected/386cf621-0ac9-43ef-bacc-c00c1b276e63-kube-api-access-dsfm4\") pod \"redhat-marketplace-h9c7v\" (UID: \"386cf621-0ac9-43ef-bacc-c00c1b276e63\") " pod="openshift-marketplace/redhat-marketplace-h9c7v" Mar 18 12:38:23 crc kubenswrapper[4975]: I0318 12:38:23.155942 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/386cf621-0ac9-43ef-bacc-c00c1b276e63-catalog-content\") pod \"redhat-marketplace-h9c7v\" (UID: \"386cf621-0ac9-43ef-bacc-c00c1b276e63\") " pod="openshift-marketplace/redhat-marketplace-h9c7v" Mar 18 12:38:23 crc kubenswrapper[4975]: I0318 12:38:23.156341 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/386cf621-0ac9-43ef-bacc-c00c1b276e63-utilities\") pod \"redhat-marketplace-h9c7v\" (UID: \"386cf621-0ac9-43ef-bacc-c00c1b276e63\") " pod="openshift-marketplace/redhat-marketplace-h9c7v" Mar 18 12:38:23 crc kubenswrapper[4975]: I0318 12:38:23.156409 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/386cf621-0ac9-43ef-bacc-c00c1b276e63-catalog-content\") pod \"redhat-marketplace-h9c7v\" (UID: \"386cf621-0ac9-43ef-bacc-c00c1b276e63\") " pod="openshift-marketplace/redhat-marketplace-h9c7v" Mar 18 12:38:23 crc kubenswrapper[4975]: I0318 12:38:23.178200 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsfm4\" (UniqueName: \"kubernetes.io/projected/386cf621-0ac9-43ef-bacc-c00c1b276e63-kube-api-access-dsfm4\") pod \"redhat-marketplace-h9c7v\" (UID: \"386cf621-0ac9-43ef-bacc-c00c1b276e63\") " pod="openshift-marketplace/redhat-marketplace-h9c7v" Mar 18 12:38:23 crc kubenswrapper[4975]: I0318 12:38:23.201331 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h9c7v" Mar 18 12:38:23 crc kubenswrapper[4975]: I0318 12:38:23.693932 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h9c7v"] Mar 18 12:38:24 crc kubenswrapper[4975]: I0318 12:38:24.017699 4975 scope.go:117] "RemoveContainer" containerID="27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" Mar 18 12:38:24 crc kubenswrapper[4975]: E0318 12:38:24.018248 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:38:24 crc kubenswrapper[4975]: I0318 12:38:24.308725 4975 generic.go:334] "Generic (PLEG): container finished" podID="386cf621-0ac9-43ef-bacc-c00c1b276e63" containerID="9c1a41f265c9a2517e72e9d570d19c9a98e33696d8dc8b449752652e6435e3b2" exitCode=0 Mar 18 12:38:24 crc kubenswrapper[4975]: I0318 12:38:24.308771 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h9c7v" event={"ID":"386cf621-0ac9-43ef-bacc-c00c1b276e63","Type":"ContainerDied","Data":"9c1a41f265c9a2517e72e9d570d19c9a98e33696d8dc8b449752652e6435e3b2"} Mar 18 12:38:24 crc kubenswrapper[4975]: I0318 12:38:24.308821 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h9c7v" event={"ID":"386cf621-0ac9-43ef-bacc-c00c1b276e63","Type":"ContainerStarted","Data":"73c3d6ddbf1a7e9d5b676099dccbfdef6e650df48139a56784edef5e14b81722"} Mar 18 12:38:25 crc kubenswrapper[4975]: I0318 12:38:25.321523 4975 generic.go:334] "Generic (PLEG): container finished" podID="386cf621-0ac9-43ef-bacc-c00c1b276e63" 
containerID="37259ac03af6ade97903e8d39053b118ef191958f4cee9e6b9982a5c997b8680" exitCode=0 Mar 18 12:38:25 crc kubenswrapper[4975]: I0318 12:38:25.321598 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h9c7v" event={"ID":"386cf621-0ac9-43ef-bacc-c00c1b276e63","Type":"ContainerDied","Data":"37259ac03af6ade97903e8d39053b118ef191958f4cee9e6b9982a5c997b8680"} Mar 18 12:38:26 crc kubenswrapper[4975]: I0318 12:38:26.331462 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h9c7v" event={"ID":"386cf621-0ac9-43ef-bacc-c00c1b276e63","Type":"ContainerStarted","Data":"062fd6d64945ab2df4f6de499514ccaf610d97252d794d93ff2d200182f49dd5"} Mar 18 12:38:26 crc kubenswrapper[4975]: I0318 12:38:26.351583 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h9c7v" podStartSLOduration=2.7465066240000002 podStartE2EDuration="4.351557934s" podCreationTimestamp="2026-03-18 12:38:22 +0000 UTC" firstStartedPulling="2026-03-18 12:38:24.310456405 +0000 UTC m=+1690.024856984" lastFinishedPulling="2026-03-18 12:38:25.915507715 +0000 UTC m=+1691.629908294" observedRunningTime="2026-03-18 12:38:26.34482691 +0000 UTC m=+1692.059227489" watchObservedRunningTime="2026-03-18 12:38:26.351557934 +0000 UTC m=+1692.065958513" Mar 18 12:38:29 crc kubenswrapper[4975]: I0318 12:38:29.615184 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd"] Mar 18 12:38:29 crc kubenswrapper[4975]: I0318 12:38:29.616591 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd" Mar 18 12:38:29 crc kubenswrapper[4975]: I0318 12:38:29.618715 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:38:29 crc kubenswrapper[4975]: I0318 12:38:29.619204 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-77rz6" Mar 18 12:38:29 crc kubenswrapper[4975]: I0318 12:38:29.619379 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:38:29 crc kubenswrapper[4975]: I0318 12:38:29.619414 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:38:29 crc kubenswrapper[4975]: I0318 12:38:29.635235 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd"] Mar 18 12:38:29 crc kubenswrapper[4975]: I0318 12:38:29.783515 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdsqz\" (UniqueName: \"kubernetes.io/projected/43e10e84-5a96-4e1f-add7-c6a1c177b5ce-kube-api-access-kdsqz\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd\" (UID: \"43e10e84-5a96-4e1f-add7-c6a1c177b5ce\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd" Mar 18 12:38:29 crc kubenswrapper[4975]: I0318 12:38:29.783601 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e10e84-5a96-4e1f-add7-c6a1c177b5ce-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd\" (UID: \"43e10e84-5a96-4e1f-add7-c6a1c177b5ce\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd" Mar 18 12:38:29 crc kubenswrapper[4975]: 
I0318 12:38:29.783678 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43e10e84-5a96-4e1f-add7-c6a1c177b5ce-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd\" (UID: \"43e10e84-5a96-4e1f-add7-c6a1c177b5ce\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd" Mar 18 12:38:29 crc kubenswrapper[4975]: I0318 12:38:29.783709 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43e10e84-5a96-4e1f-add7-c6a1c177b5ce-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd\" (UID: \"43e10e84-5a96-4e1f-add7-c6a1c177b5ce\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd" Mar 18 12:38:29 crc kubenswrapper[4975]: I0318 12:38:29.885420 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdsqz\" (UniqueName: \"kubernetes.io/projected/43e10e84-5a96-4e1f-add7-c6a1c177b5ce-kube-api-access-kdsqz\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd\" (UID: \"43e10e84-5a96-4e1f-add7-c6a1c177b5ce\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd" Mar 18 12:38:29 crc kubenswrapper[4975]: I0318 12:38:29.885508 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e10e84-5a96-4e1f-add7-c6a1c177b5ce-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd\" (UID: \"43e10e84-5a96-4e1f-add7-c6a1c177b5ce\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd" Mar 18 12:38:29 crc kubenswrapper[4975]: I0318 12:38:29.885556 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/43e10e84-5a96-4e1f-add7-c6a1c177b5ce-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd\" (UID: \"43e10e84-5a96-4e1f-add7-c6a1c177b5ce\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd" Mar 18 12:38:29 crc kubenswrapper[4975]: I0318 12:38:29.885588 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43e10e84-5a96-4e1f-add7-c6a1c177b5ce-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd\" (UID: \"43e10e84-5a96-4e1f-add7-c6a1c177b5ce\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd" Mar 18 12:38:29 crc kubenswrapper[4975]: I0318 12:38:29.891347 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e10e84-5a96-4e1f-add7-c6a1c177b5ce-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd\" (UID: \"43e10e84-5a96-4e1f-add7-c6a1c177b5ce\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd" Mar 18 12:38:29 crc kubenswrapper[4975]: I0318 12:38:29.894398 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43e10e84-5a96-4e1f-add7-c6a1c177b5ce-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd\" (UID: \"43e10e84-5a96-4e1f-add7-c6a1c177b5ce\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd" Mar 18 12:38:29 crc kubenswrapper[4975]: I0318 12:38:29.901318 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43e10e84-5a96-4e1f-add7-c6a1c177b5ce-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd\" (UID: \"43e10e84-5a96-4e1f-add7-c6a1c177b5ce\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd" Mar 18 12:38:29 crc kubenswrapper[4975]: I0318 12:38:29.901402 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdsqz\" (UniqueName: \"kubernetes.io/projected/43e10e84-5a96-4e1f-add7-c6a1c177b5ce-kube-api-access-kdsqz\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd\" (UID: \"43e10e84-5a96-4e1f-add7-c6a1c177b5ce\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd" Mar 18 12:38:29 crc kubenswrapper[4975]: I0318 12:38:29.936391 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd" Mar 18 12:38:30 crc kubenswrapper[4975]: W0318 12:38:30.464778 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43e10e84_5a96_4e1f_add7_c6a1c177b5ce.slice/crio-f6fb8398edf51d44422805b79165e32e7221cdd8c712d2c8af80305d7b6ce2f4 WatchSource:0}: Error finding container f6fb8398edf51d44422805b79165e32e7221cdd8c712d2c8af80305d7b6ce2f4: Status 404 returned error can't find the container with id f6fb8398edf51d44422805b79165e32e7221cdd8c712d2c8af80305d7b6ce2f4 Mar 18 12:38:30 crc kubenswrapper[4975]: I0318 12:38:30.470851 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd"] Mar 18 12:38:31 crc kubenswrapper[4975]: I0318 12:38:31.379598 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd" event={"ID":"43e10e84-5a96-4e1f-add7-c6a1c177b5ce","Type":"ContainerStarted","Data":"f6fb8398edf51d44422805b79165e32e7221cdd8c712d2c8af80305d7b6ce2f4"} Mar 18 12:38:31 crc kubenswrapper[4975]: I0318 12:38:31.520137 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 18 12:38:33 crc kubenswrapper[4975]: 
I0318 12:38:33.202044 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h9c7v" Mar 18 12:38:33 crc kubenswrapper[4975]: I0318 12:38:33.202377 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h9c7v" Mar 18 12:38:33 crc kubenswrapper[4975]: I0318 12:38:33.254572 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 12:38:33 crc kubenswrapper[4975]: I0318 12:38:33.314615 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h9c7v" Mar 18 12:38:33 crc kubenswrapper[4975]: I0318 12:38:33.479999 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h9c7v" Mar 18 12:38:33 crc kubenswrapper[4975]: I0318 12:38:33.566244 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h9c7v"] Mar 18 12:38:35 crc kubenswrapper[4975]: I0318 12:38:35.432524 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h9c7v" podUID="386cf621-0ac9-43ef-bacc-c00c1b276e63" containerName="registry-server" containerID="cri-o://062fd6d64945ab2df4f6de499514ccaf610d97252d794d93ff2d200182f49dd5" gracePeriod=2 Mar 18 12:38:35 crc kubenswrapper[4975]: I0318 12:38:35.943881 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h9c7v" Mar 18 12:38:36 crc kubenswrapper[4975]: I0318 12:38:36.117298 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/386cf621-0ac9-43ef-bacc-c00c1b276e63-utilities\") pod \"386cf621-0ac9-43ef-bacc-c00c1b276e63\" (UID: \"386cf621-0ac9-43ef-bacc-c00c1b276e63\") " Mar 18 12:38:36 crc kubenswrapper[4975]: I0318 12:38:36.117671 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsfm4\" (UniqueName: \"kubernetes.io/projected/386cf621-0ac9-43ef-bacc-c00c1b276e63-kube-api-access-dsfm4\") pod \"386cf621-0ac9-43ef-bacc-c00c1b276e63\" (UID: \"386cf621-0ac9-43ef-bacc-c00c1b276e63\") " Mar 18 12:38:36 crc kubenswrapper[4975]: I0318 12:38:36.117712 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/386cf621-0ac9-43ef-bacc-c00c1b276e63-catalog-content\") pod \"386cf621-0ac9-43ef-bacc-c00c1b276e63\" (UID: \"386cf621-0ac9-43ef-bacc-c00c1b276e63\") " Mar 18 12:38:36 crc kubenswrapper[4975]: I0318 12:38:36.118541 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/386cf621-0ac9-43ef-bacc-c00c1b276e63-utilities" (OuterVolumeSpecName: "utilities") pod "386cf621-0ac9-43ef-bacc-c00c1b276e63" (UID: "386cf621-0ac9-43ef-bacc-c00c1b276e63"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:38:36 crc kubenswrapper[4975]: I0318 12:38:36.129098 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/386cf621-0ac9-43ef-bacc-c00c1b276e63-kube-api-access-dsfm4" (OuterVolumeSpecName: "kube-api-access-dsfm4") pod "386cf621-0ac9-43ef-bacc-c00c1b276e63" (UID: "386cf621-0ac9-43ef-bacc-c00c1b276e63"). InnerVolumeSpecName "kube-api-access-dsfm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:38:36 crc kubenswrapper[4975]: I0318 12:38:36.165983 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/386cf621-0ac9-43ef-bacc-c00c1b276e63-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "386cf621-0ac9-43ef-bacc-c00c1b276e63" (UID: "386cf621-0ac9-43ef-bacc-c00c1b276e63"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:38:36 crc kubenswrapper[4975]: I0318 12:38:36.220140 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/386cf621-0ac9-43ef-bacc-c00c1b276e63-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:38:36 crc kubenswrapper[4975]: I0318 12:38:36.220177 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsfm4\" (UniqueName: \"kubernetes.io/projected/386cf621-0ac9-43ef-bacc-c00c1b276e63-kube-api-access-dsfm4\") on node \"crc\" DevicePath \"\"" Mar 18 12:38:36 crc kubenswrapper[4975]: I0318 12:38:36.220188 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/386cf621-0ac9-43ef-bacc-c00c1b276e63-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:38:36 crc kubenswrapper[4975]: I0318 12:38:36.446967 4975 generic.go:334] "Generic (PLEG): container finished" podID="386cf621-0ac9-43ef-bacc-c00c1b276e63" containerID="062fd6d64945ab2df4f6de499514ccaf610d97252d794d93ff2d200182f49dd5" exitCode=0 Mar 18 12:38:36 crc kubenswrapper[4975]: I0318 12:38:36.447018 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h9c7v" event={"ID":"386cf621-0ac9-43ef-bacc-c00c1b276e63","Type":"ContainerDied","Data":"062fd6d64945ab2df4f6de499514ccaf610d97252d794d93ff2d200182f49dd5"} Mar 18 12:38:36 crc kubenswrapper[4975]: I0318 12:38:36.447034 4975 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h9c7v" Mar 18 12:38:36 crc kubenswrapper[4975]: I0318 12:38:36.447051 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h9c7v" event={"ID":"386cf621-0ac9-43ef-bacc-c00c1b276e63","Type":"ContainerDied","Data":"73c3d6ddbf1a7e9d5b676099dccbfdef6e650df48139a56784edef5e14b81722"} Mar 18 12:38:36 crc kubenswrapper[4975]: I0318 12:38:36.447072 4975 scope.go:117] "RemoveContainer" containerID="062fd6d64945ab2df4f6de499514ccaf610d97252d794d93ff2d200182f49dd5" Mar 18 12:38:36 crc kubenswrapper[4975]: I0318 12:38:36.476195 4975 scope.go:117] "RemoveContainer" containerID="37259ac03af6ade97903e8d39053b118ef191958f4cee9e6b9982a5c997b8680" Mar 18 12:38:36 crc kubenswrapper[4975]: I0318 12:38:36.481272 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h9c7v"] Mar 18 12:38:36 crc kubenswrapper[4975]: I0318 12:38:36.489364 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h9c7v"] Mar 18 12:38:36 crc kubenswrapper[4975]: I0318 12:38:36.504425 4975 scope.go:117] "RemoveContainer" containerID="9c1a41f265c9a2517e72e9d570d19c9a98e33696d8dc8b449752652e6435e3b2" Mar 18 12:38:36 crc kubenswrapper[4975]: I0318 12:38:36.552414 4975 scope.go:117] "RemoveContainer" containerID="062fd6d64945ab2df4f6de499514ccaf610d97252d794d93ff2d200182f49dd5" Mar 18 12:38:36 crc kubenswrapper[4975]: E0318 12:38:36.553041 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"062fd6d64945ab2df4f6de499514ccaf610d97252d794d93ff2d200182f49dd5\": container with ID starting with 062fd6d64945ab2df4f6de499514ccaf610d97252d794d93ff2d200182f49dd5 not found: ID does not exist" containerID="062fd6d64945ab2df4f6de499514ccaf610d97252d794d93ff2d200182f49dd5" Mar 18 12:38:36 crc kubenswrapper[4975]: I0318 12:38:36.553086 4975 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"062fd6d64945ab2df4f6de499514ccaf610d97252d794d93ff2d200182f49dd5"} err="failed to get container status \"062fd6d64945ab2df4f6de499514ccaf610d97252d794d93ff2d200182f49dd5\": rpc error: code = NotFound desc = could not find container \"062fd6d64945ab2df4f6de499514ccaf610d97252d794d93ff2d200182f49dd5\": container with ID starting with 062fd6d64945ab2df4f6de499514ccaf610d97252d794d93ff2d200182f49dd5 not found: ID does not exist" Mar 18 12:38:36 crc kubenswrapper[4975]: I0318 12:38:36.553111 4975 scope.go:117] "RemoveContainer" containerID="37259ac03af6ade97903e8d39053b118ef191958f4cee9e6b9982a5c997b8680" Mar 18 12:38:36 crc kubenswrapper[4975]: E0318 12:38:36.553499 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37259ac03af6ade97903e8d39053b118ef191958f4cee9e6b9982a5c997b8680\": container with ID starting with 37259ac03af6ade97903e8d39053b118ef191958f4cee9e6b9982a5c997b8680 not found: ID does not exist" containerID="37259ac03af6ade97903e8d39053b118ef191958f4cee9e6b9982a5c997b8680" Mar 18 12:38:36 crc kubenswrapper[4975]: I0318 12:38:36.553529 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37259ac03af6ade97903e8d39053b118ef191958f4cee9e6b9982a5c997b8680"} err="failed to get container status \"37259ac03af6ade97903e8d39053b118ef191958f4cee9e6b9982a5c997b8680\": rpc error: code = NotFound desc = could not find container \"37259ac03af6ade97903e8d39053b118ef191958f4cee9e6b9982a5c997b8680\": container with ID starting with 37259ac03af6ade97903e8d39053b118ef191958f4cee9e6b9982a5c997b8680 not found: ID does not exist" Mar 18 12:38:36 crc kubenswrapper[4975]: I0318 12:38:36.553546 4975 scope.go:117] "RemoveContainer" containerID="9c1a41f265c9a2517e72e9d570d19c9a98e33696d8dc8b449752652e6435e3b2" Mar 18 12:38:36 crc kubenswrapper[4975]: E0318 
12:38:36.554027 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c1a41f265c9a2517e72e9d570d19c9a98e33696d8dc8b449752652e6435e3b2\": container with ID starting with 9c1a41f265c9a2517e72e9d570d19c9a98e33696d8dc8b449752652e6435e3b2 not found: ID does not exist" containerID="9c1a41f265c9a2517e72e9d570d19c9a98e33696d8dc8b449752652e6435e3b2" Mar 18 12:38:36 crc kubenswrapper[4975]: I0318 12:38:36.554053 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c1a41f265c9a2517e72e9d570d19c9a98e33696d8dc8b449752652e6435e3b2"} err="failed to get container status \"9c1a41f265c9a2517e72e9d570d19c9a98e33696d8dc8b449752652e6435e3b2\": rpc error: code = NotFound desc = could not find container \"9c1a41f265c9a2517e72e9d570d19c9a98e33696d8dc8b449752652e6435e3b2\": container with ID starting with 9c1a41f265c9a2517e72e9d570d19c9a98e33696d8dc8b449752652e6435e3b2 not found: ID does not exist" Mar 18 12:38:36 crc kubenswrapper[4975]: I0318 12:38:36.694676 4975 scope.go:117] "RemoveContainer" containerID="7092497515fb9f1f8f2eacfb88960fe8f9b45454769a5e47d8591f9e8531e8f6" Mar 18 12:38:37 crc kubenswrapper[4975]: I0318 12:38:37.018108 4975 scope.go:117] "RemoveContainer" containerID="27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" Mar 18 12:38:37 crc kubenswrapper[4975]: E0318 12:38:37.018536 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:38:37 crc kubenswrapper[4975]: I0318 12:38:37.032057 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="386cf621-0ac9-43ef-bacc-c00c1b276e63" path="/var/lib/kubelet/pods/386cf621-0ac9-43ef-bacc-c00c1b276e63/volumes" Mar 18 12:38:44 crc kubenswrapper[4975]: I0318 12:38:44.564946 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd" event={"ID":"43e10e84-5a96-4e1f-add7-c6a1c177b5ce","Type":"ContainerStarted","Data":"7aa25df391b29af591fa234a90f63b97427b501584a8b15ec3a02a0dd2ab49ee"} Mar 18 12:38:44 crc kubenswrapper[4975]: I0318 12:38:44.581963 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd" podStartSLOduration=1.9241961440000002 podStartE2EDuration="15.581936998s" podCreationTimestamp="2026-03-18 12:38:29 +0000 UTC" firstStartedPulling="2026-03-18 12:38:30.466893562 +0000 UTC m=+1696.181294151" lastFinishedPulling="2026-03-18 12:38:44.124634406 +0000 UTC m=+1709.839035005" observedRunningTime="2026-03-18 12:38:44.580673394 +0000 UTC m=+1710.295073973" watchObservedRunningTime="2026-03-18 12:38:44.581936998 +0000 UTC m=+1710.296337877" Mar 18 12:38:49 crc kubenswrapper[4975]: I0318 12:38:49.017445 4975 scope.go:117] "RemoveContainer" containerID="27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" Mar 18 12:38:49 crc kubenswrapper[4975]: E0318 12:38:49.018241 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:38:56 crc kubenswrapper[4975]: I0318 12:38:56.709909 4975 generic.go:334] "Generic (PLEG): container finished" podID="43e10e84-5a96-4e1f-add7-c6a1c177b5ce" 
containerID="7aa25df391b29af591fa234a90f63b97427b501584a8b15ec3a02a0dd2ab49ee" exitCode=0 Mar 18 12:38:56 crc kubenswrapper[4975]: I0318 12:38:56.710034 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd" event={"ID":"43e10e84-5a96-4e1f-add7-c6a1c177b5ce","Type":"ContainerDied","Data":"7aa25df391b29af591fa234a90f63b97427b501584a8b15ec3a02a0dd2ab49ee"} Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.159856 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.184431 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43e10e84-5a96-4e1f-add7-c6a1c177b5ce-inventory\") pod \"43e10e84-5a96-4e1f-add7-c6a1c177b5ce\" (UID: \"43e10e84-5a96-4e1f-add7-c6a1c177b5ce\") " Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.184589 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43e10e84-5a96-4e1f-add7-c6a1c177b5ce-ssh-key-openstack-edpm-ipam\") pod \"43e10e84-5a96-4e1f-add7-c6a1c177b5ce\" (UID: \"43e10e84-5a96-4e1f-add7-c6a1c177b5ce\") " Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.184672 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e10e84-5a96-4e1f-add7-c6a1c177b5ce-repo-setup-combined-ca-bundle\") pod \"43e10e84-5a96-4e1f-add7-c6a1c177b5ce\" (UID: \"43e10e84-5a96-4e1f-add7-c6a1c177b5ce\") " Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.184722 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdsqz\" (UniqueName: 
\"kubernetes.io/projected/43e10e84-5a96-4e1f-add7-c6a1c177b5ce-kube-api-access-kdsqz\") pod \"43e10e84-5a96-4e1f-add7-c6a1c177b5ce\" (UID: \"43e10e84-5a96-4e1f-add7-c6a1c177b5ce\") " Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.190798 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43e10e84-5a96-4e1f-add7-c6a1c177b5ce-kube-api-access-kdsqz" (OuterVolumeSpecName: "kube-api-access-kdsqz") pod "43e10e84-5a96-4e1f-add7-c6a1c177b5ce" (UID: "43e10e84-5a96-4e1f-add7-c6a1c177b5ce"). InnerVolumeSpecName "kube-api-access-kdsqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.197049 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43e10e84-5a96-4e1f-add7-c6a1c177b5ce-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "43e10e84-5a96-4e1f-add7-c6a1c177b5ce" (UID: "43e10e84-5a96-4e1f-add7-c6a1c177b5ce"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.217156 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43e10e84-5a96-4e1f-add7-c6a1c177b5ce-inventory" (OuterVolumeSpecName: "inventory") pod "43e10e84-5a96-4e1f-add7-c6a1c177b5ce" (UID: "43e10e84-5a96-4e1f-add7-c6a1c177b5ce"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.219549 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43e10e84-5a96-4e1f-add7-c6a1c177b5ce-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "43e10e84-5a96-4e1f-add7-c6a1c177b5ce" (UID: "43e10e84-5a96-4e1f-add7-c6a1c177b5ce"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.286649 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdsqz\" (UniqueName: \"kubernetes.io/projected/43e10e84-5a96-4e1f-add7-c6a1c177b5ce-kube-api-access-kdsqz\") on node \"crc\" DevicePath \"\"" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.286690 4975 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43e10e84-5a96-4e1f-add7-c6a1c177b5ce-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.286703 4975 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43e10e84-5a96-4e1f-add7-c6a1c177b5ce-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.286713 4975 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e10e84-5a96-4e1f-add7-c6a1c177b5ce-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.734250 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd" event={"ID":"43e10e84-5a96-4e1f-add7-c6a1c177b5ce","Type":"ContainerDied","Data":"f6fb8398edf51d44422805b79165e32e7221cdd8c712d2c8af80305d7b6ce2f4"} Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.734295 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6fb8398edf51d44422805b79165e32e7221cdd8c712d2c8af80305d7b6ce2f4" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.734339 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.850636 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-m5mvk"] Mar 18 12:38:58 crc kubenswrapper[4975]: E0318 12:38:58.851118 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="386cf621-0ac9-43ef-bacc-c00c1b276e63" containerName="extract-utilities" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.851137 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="386cf621-0ac9-43ef-bacc-c00c1b276e63" containerName="extract-utilities" Mar 18 12:38:58 crc kubenswrapper[4975]: E0318 12:38:58.851179 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43e10e84-5a96-4e1f-add7-c6a1c177b5ce" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.851187 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="43e10e84-5a96-4e1f-add7-c6a1c177b5ce" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 12:38:58 crc kubenswrapper[4975]: E0318 12:38:58.851201 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="386cf621-0ac9-43ef-bacc-c00c1b276e63" containerName="registry-server" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.851207 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="386cf621-0ac9-43ef-bacc-c00c1b276e63" containerName="registry-server" Mar 18 12:38:58 crc kubenswrapper[4975]: E0318 12:38:58.851218 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="386cf621-0ac9-43ef-bacc-c00c1b276e63" containerName="extract-content" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.851224 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="386cf621-0ac9-43ef-bacc-c00c1b276e63" containerName="extract-content" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.851389 4975 
memory_manager.go:354] "RemoveStaleState removing state" podUID="43e10e84-5a96-4e1f-add7-c6a1c177b5ce" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.851405 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="386cf621-0ac9-43ef-bacc-c00c1b276e63" containerName="registry-server" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.852100 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m5mvk" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.853701 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-77rz6" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.854656 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.854936 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.855133 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.861947 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-m5mvk"] Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.896544 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c62f110-2c2e-4de8-a425-0b08794eb28f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m5mvk\" (UID: \"2c62f110-2c2e-4de8-a425-0b08794eb28f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m5mvk" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.896624 4975 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk5tp\" (UniqueName: \"kubernetes.io/projected/2c62f110-2c2e-4de8-a425-0b08794eb28f-kube-api-access-zk5tp\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m5mvk\" (UID: \"2c62f110-2c2e-4de8-a425-0b08794eb28f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m5mvk" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.896786 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c62f110-2c2e-4de8-a425-0b08794eb28f-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m5mvk\" (UID: \"2c62f110-2c2e-4de8-a425-0b08794eb28f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m5mvk" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.998513 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk5tp\" (UniqueName: \"kubernetes.io/projected/2c62f110-2c2e-4de8-a425-0b08794eb28f-kube-api-access-zk5tp\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m5mvk\" (UID: \"2c62f110-2c2e-4de8-a425-0b08794eb28f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m5mvk" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.998617 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c62f110-2c2e-4de8-a425-0b08794eb28f-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m5mvk\" (UID: \"2c62f110-2c2e-4de8-a425-0b08794eb28f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m5mvk" Mar 18 12:38:58 crc kubenswrapper[4975]: I0318 12:38:58.998820 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/2c62f110-2c2e-4de8-a425-0b08794eb28f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m5mvk\" (UID: \"2c62f110-2c2e-4de8-a425-0b08794eb28f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m5mvk" Mar 18 12:38:59 crc kubenswrapper[4975]: I0318 12:38:59.003091 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c62f110-2c2e-4de8-a425-0b08794eb28f-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m5mvk\" (UID: \"2c62f110-2c2e-4de8-a425-0b08794eb28f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m5mvk" Mar 18 12:38:59 crc kubenswrapper[4975]: I0318 12:38:59.003275 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c62f110-2c2e-4de8-a425-0b08794eb28f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m5mvk\" (UID: \"2c62f110-2c2e-4de8-a425-0b08794eb28f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m5mvk" Mar 18 12:38:59 crc kubenswrapper[4975]: I0318 12:38:59.015456 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk5tp\" (UniqueName: \"kubernetes.io/projected/2c62f110-2c2e-4de8-a425-0b08794eb28f-kube-api-access-zk5tp\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m5mvk\" (UID: \"2c62f110-2c2e-4de8-a425-0b08794eb28f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m5mvk" Mar 18 12:38:59 crc kubenswrapper[4975]: I0318 12:38:59.175773 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m5mvk" Mar 18 12:38:59 crc kubenswrapper[4975]: I0318 12:38:59.762243 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-m5mvk"] Mar 18 12:39:00 crc kubenswrapper[4975]: I0318 12:39:00.757406 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m5mvk" event={"ID":"2c62f110-2c2e-4de8-a425-0b08794eb28f","Type":"ContainerStarted","Data":"44ed5d174d2a4c23d7f6e46b3da9876fc87607717c975531836ad139f3cecff0"} Mar 18 12:39:00 crc kubenswrapper[4975]: I0318 12:39:00.757777 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m5mvk" event={"ID":"2c62f110-2c2e-4de8-a425-0b08794eb28f","Type":"ContainerStarted","Data":"4843859c0b9392aae0154344ab18c0f61410277ccf3f9d272066214e6092b291"} Mar 18 12:39:00 crc kubenswrapper[4975]: I0318 12:39:00.794364 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m5mvk" podStartSLOduration=2.5873969 podStartE2EDuration="2.794337847s" podCreationTimestamp="2026-03-18 12:38:58 +0000 UTC" firstStartedPulling="2026-03-18 12:38:59.770036568 +0000 UTC m=+1725.484437147" lastFinishedPulling="2026-03-18 12:38:59.976977515 +0000 UTC m=+1725.691378094" observedRunningTime="2026-03-18 12:39:00.77948169 +0000 UTC m=+1726.493882279" watchObservedRunningTime="2026-03-18 12:39:00.794337847 +0000 UTC m=+1726.508738466" Mar 18 12:39:01 crc kubenswrapper[4975]: I0318 12:39:01.017285 4975 scope.go:117] "RemoveContainer" containerID="27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" Mar 18 12:39:01 crc kubenswrapper[4975]: E0318 12:39:01.017553 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:39:02 crc kubenswrapper[4975]: I0318 12:39:02.785657 4975 generic.go:334] "Generic (PLEG): container finished" podID="2c62f110-2c2e-4de8-a425-0b08794eb28f" containerID="44ed5d174d2a4c23d7f6e46b3da9876fc87607717c975531836ad139f3cecff0" exitCode=0 Mar 18 12:39:02 crc kubenswrapper[4975]: I0318 12:39:02.785745 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m5mvk" event={"ID":"2c62f110-2c2e-4de8-a425-0b08794eb28f","Type":"ContainerDied","Data":"44ed5d174d2a4c23d7f6e46b3da9876fc87607717c975531836ad139f3cecff0"} Mar 18 12:39:04 crc kubenswrapper[4975]: I0318 12:39:04.285790 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m5mvk" Mar 18 12:39:04 crc kubenswrapper[4975]: I0318 12:39:04.419946 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c62f110-2c2e-4de8-a425-0b08794eb28f-ssh-key-openstack-edpm-ipam\") pod \"2c62f110-2c2e-4de8-a425-0b08794eb28f\" (UID: \"2c62f110-2c2e-4de8-a425-0b08794eb28f\") " Mar 18 12:39:04 crc kubenswrapper[4975]: I0318 12:39:04.420066 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk5tp\" (UniqueName: \"kubernetes.io/projected/2c62f110-2c2e-4de8-a425-0b08794eb28f-kube-api-access-zk5tp\") pod \"2c62f110-2c2e-4de8-a425-0b08794eb28f\" (UID: \"2c62f110-2c2e-4de8-a425-0b08794eb28f\") " Mar 18 12:39:04 crc kubenswrapper[4975]: I0318 12:39:04.420111 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/2c62f110-2c2e-4de8-a425-0b08794eb28f-inventory\") pod \"2c62f110-2c2e-4de8-a425-0b08794eb28f\" (UID: \"2c62f110-2c2e-4de8-a425-0b08794eb28f\") " Mar 18 12:39:04 crc kubenswrapper[4975]: I0318 12:39:04.425758 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c62f110-2c2e-4de8-a425-0b08794eb28f-kube-api-access-zk5tp" (OuterVolumeSpecName: "kube-api-access-zk5tp") pod "2c62f110-2c2e-4de8-a425-0b08794eb28f" (UID: "2c62f110-2c2e-4de8-a425-0b08794eb28f"). InnerVolumeSpecName "kube-api-access-zk5tp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:39:04 crc kubenswrapper[4975]: I0318 12:39:04.451707 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c62f110-2c2e-4de8-a425-0b08794eb28f-inventory" (OuterVolumeSpecName: "inventory") pod "2c62f110-2c2e-4de8-a425-0b08794eb28f" (UID: "2c62f110-2c2e-4de8-a425-0b08794eb28f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:39:04 crc kubenswrapper[4975]: I0318 12:39:04.455174 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c62f110-2c2e-4de8-a425-0b08794eb28f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2c62f110-2c2e-4de8-a425-0b08794eb28f" (UID: "2c62f110-2c2e-4de8-a425-0b08794eb28f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:39:04 crc kubenswrapper[4975]: I0318 12:39:04.522598 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk5tp\" (UniqueName: \"kubernetes.io/projected/2c62f110-2c2e-4de8-a425-0b08794eb28f-kube-api-access-zk5tp\") on node \"crc\" DevicePath \"\"" Mar 18 12:39:04 crc kubenswrapper[4975]: I0318 12:39:04.522629 4975 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c62f110-2c2e-4de8-a425-0b08794eb28f-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 12:39:04 crc kubenswrapper[4975]: I0318 12:39:04.522639 4975 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c62f110-2c2e-4de8-a425-0b08794eb28f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:39:04 crc kubenswrapper[4975]: I0318 12:39:04.821179 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m5mvk" event={"ID":"2c62f110-2c2e-4de8-a425-0b08794eb28f","Type":"ContainerDied","Data":"4843859c0b9392aae0154344ab18c0f61410277ccf3f9d272066214e6092b291"} Mar 18 12:39:04 crc kubenswrapper[4975]: I0318 12:39:04.821280 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4843859c0b9392aae0154344ab18c0f61410277ccf3f9d272066214e6092b291" Mar 18 12:39:04 crc kubenswrapper[4975]: I0318 12:39:04.821394 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m5mvk" Mar 18 12:39:04 crc kubenswrapper[4975]: I0318 12:39:04.921626 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd"] Mar 18 12:39:04 crc kubenswrapper[4975]: E0318 12:39:04.922302 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c62f110-2c2e-4de8-a425-0b08794eb28f" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 18 12:39:04 crc kubenswrapper[4975]: I0318 12:39:04.922336 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c62f110-2c2e-4de8-a425-0b08794eb28f" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 18 12:39:04 crc kubenswrapper[4975]: I0318 12:39:04.922692 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c62f110-2c2e-4de8-a425-0b08794eb28f" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 18 12:39:04 crc kubenswrapper[4975]: I0318 12:39:04.923743 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd" Mar 18 12:39:04 crc kubenswrapper[4975]: I0318 12:39:04.926242 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:39:04 crc kubenswrapper[4975]: I0318 12:39:04.926525 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-77rz6" Mar 18 12:39:04 crc kubenswrapper[4975]: I0318 12:39:04.936197 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd"] Mar 18 12:39:04 crc kubenswrapper[4975]: I0318 12:39:04.949334 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:39:04 crc kubenswrapper[4975]: I0318 12:39:04.949357 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:39:05 crc kubenswrapper[4975]: I0318 12:39:05.051700 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd\" (UID: \"296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd" Mar 18 12:39:05 crc kubenswrapper[4975]: I0318 12:39:05.052072 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd\" (UID: \"296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd" Mar 18 12:39:05 crc kubenswrapper[4975]: I0318 12:39:05.052104 4975 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc2mv\" (UniqueName: \"kubernetes.io/projected/296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd-kube-api-access-bc2mv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd\" (UID: \"296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd" Mar 18 12:39:05 crc kubenswrapper[4975]: I0318 12:39:05.052264 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd\" (UID: \"296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd" Mar 18 12:39:05 crc kubenswrapper[4975]: I0318 12:39:05.154225 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc2mv\" (UniqueName: \"kubernetes.io/projected/296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd-kube-api-access-bc2mv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd\" (UID: \"296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd" Mar 18 12:39:05 crc kubenswrapper[4975]: I0318 12:39:05.154393 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd\" (UID: \"296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd" Mar 18 12:39:05 crc kubenswrapper[4975]: I0318 12:39:05.154527 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd\" (UID: \"296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd" Mar 18 12:39:05 crc kubenswrapper[4975]: I0318 12:39:05.154634 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd\" (UID: \"296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd" Mar 18 12:39:05 crc kubenswrapper[4975]: I0318 12:39:05.158536 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd\" (UID: \"296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd" Mar 18 12:39:05 crc kubenswrapper[4975]: I0318 12:39:05.160579 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd\" (UID: \"296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd" Mar 18 12:39:05 crc kubenswrapper[4975]: I0318 12:39:05.161996 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd\" (UID: \"296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd" Mar 18 12:39:05 crc kubenswrapper[4975]: I0318 12:39:05.179557 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc2mv\" (UniqueName: \"kubernetes.io/projected/296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd-kube-api-access-bc2mv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd\" (UID: \"296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd" Mar 18 12:39:05 crc kubenswrapper[4975]: I0318 12:39:05.272491 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd" Mar 18 12:39:05 crc kubenswrapper[4975]: I0318 12:39:05.847451 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd"] Mar 18 12:39:05 crc kubenswrapper[4975]: W0318 12:39:05.854461 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod296e864d_5a95_4c7a_b9ea_3d18bb1dfdcd.slice/crio-aced124078e8b32690e90a39901281e481ecebddda26eb5a7e51587025c45ffd WatchSource:0}: Error finding container aced124078e8b32690e90a39901281e481ecebddda26eb5a7e51587025c45ffd: Status 404 returned error can't find the container with id aced124078e8b32690e90a39901281e481ecebddda26eb5a7e51587025c45ffd Mar 18 12:39:06 crc kubenswrapper[4975]: I0318 12:39:06.844943 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd" event={"ID":"296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd","Type":"ContainerStarted","Data":"5a4b072b79bb29a1fbdeecdf177f36476867bafcf980414975bd3a448882f755"} Mar 18 12:39:06 crc kubenswrapper[4975]: I0318 12:39:06.845001 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd" 
event={"ID":"296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd","Type":"ContainerStarted","Data":"aced124078e8b32690e90a39901281e481ecebddda26eb5a7e51587025c45ffd"} Mar 18 12:39:15 crc kubenswrapper[4975]: I0318 12:39:15.030803 4975 scope.go:117] "RemoveContainer" containerID="27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" Mar 18 12:39:15 crc kubenswrapper[4975]: E0318 12:39:15.032097 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:39:26 crc kubenswrapper[4975]: I0318 12:39:26.016526 4975 scope.go:117] "RemoveContainer" containerID="27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" Mar 18 12:39:26 crc kubenswrapper[4975]: E0318 12:39:26.017851 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:39:38 crc kubenswrapper[4975]: I0318 12:39:38.017505 4975 scope.go:117] "RemoveContainer" containerID="27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" Mar 18 12:39:38 crc kubenswrapper[4975]: E0318 12:39:38.018465 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:39:44 crc kubenswrapper[4975]: I0318 12:39:44.135954 4975 scope.go:117] "RemoveContainer" containerID="f9933a6af15b43d91daaeb254728f8dfe22668fead78e658541acd0a5d81bf9b" Mar 18 12:39:44 crc kubenswrapper[4975]: I0318 12:39:44.170458 4975 scope.go:117] "RemoveContainer" containerID="e485a5545c716abfab08afcd79fe927a554e6600a0b95914b16c2582e0c78313" Mar 18 12:39:44 crc kubenswrapper[4975]: I0318 12:39:44.209819 4975 scope.go:117] "RemoveContainer" containerID="ce1114177ebe46aecae8af1d17b80f81d5eaea51273cd2f9feac414d88a4ce2c" Mar 18 12:39:44 crc kubenswrapper[4975]: I0318 12:39:44.265343 4975 scope.go:117] "RemoveContainer" containerID="df65c753ec03e658f829bd79d5281c49156d7666c60a74f064d6e8a1c6e09b15" Mar 18 12:39:44 crc kubenswrapper[4975]: I0318 12:39:44.305534 4975 scope.go:117] "RemoveContainer" containerID="0c0bd55712354453b90292fa5a312a6eafba86a9a67e8a425b79178102fd066e" Mar 18 12:39:51 crc kubenswrapper[4975]: I0318 12:39:51.016685 4975 scope.go:117] "RemoveContainer" containerID="27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" Mar 18 12:39:51 crc kubenswrapper[4975]: E0318 12:39:51.018314 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:40:00 crc kubenswrapper[4975]: I0318 12:40:00.177425 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd" 
podStartSLOduration=55.997010211 podStartE2EDuration="56.17737889s" podCreationTimestamp="2026-03-18 12:39:04 +0000 UTC" firstStartedPulling="2026-03-18 12:39:05.861937229 +0000 UTC m=+1731.576337828" lastFinishedPulling="2026-03-18 12:39:06.042305918 +0000 UTC m=+1731.756706507" observedRunningTime="2026-03-18 12:39:06.870163737 +0000 UTC m=+1732.584564336" watchObservedRunningTime="2026-03-18 12:40:00.17737889 +0000 UTC m=+1785.891779479" Mar 18 12:40:00 crc kubenswrapper[4975]: I0318 12:40:00.184802 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563960-6mgcg"] Mar 18 12:40:00 crc kubenswrapper[4975]: I0318 12:40:00.186607 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563960-6mgcg" Mar 18 12:40:00 crc kubenswrapper[4975]: I0318 12:40:00.193172 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 12:40:00 crc kubenswrapper[4975]: I0318 12:40:00.193266 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:40:00 crc kubenswrapper[4975]: I0318 12:40:00.193523 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:40:00 crc kubenswrapper[4975]: I0318 12:40:00.200361 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563960-6mgcg"] Mar 18 12:40:00 crc kubenswrapper[4975]: I0318 12:40:00.364799 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp2bg\" (UniqueName: \"kubernetes.io/projected/7a9093ac-3b95-4bca-9455-ccf2768d9da2-kube-api-access-kp2bg\") pod \"auto-csr-approver-29563960-6mgcg\" (UID: \"7a9093ac-3b95-4bca-9455-ccf2768d9da2\") " pod="openshift-infra/auto-csr-approver-29563960-6mgcg" Mar 18 12:40:00 crc kubenswrapper[4975]: I0318 
12:40:00.467445 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp2bg\" (UniqueName: \"kubernetes.io/projected/7a9093ac-3b95-4bca-9455-ccf2768d9da2-kube-api-access-kp2bg\") pod \"auto-csr-approver-29563960-6mgcg\" (UID: \"7a9093ac-3b95-4bca-9455-ccf2768d9da2\") " pod="openshift-infra/auto-csr-approver-29563960-6mgcg" Mar 18 12:40:00 crc kubenswrapper[4975]: I0318 12:40:00.496843 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp2bg\" (UniqueName: \"kubernetes.io/projected/7a9093ac-3b95-4bca-9455-ccf2768d9da2-kube-api-access-kp2bg\") pod \"auto-csr-approver-29563960-6mgcg\" (UID: \"7a9093ac-3b95-4bca-9455-ccf2768d9da2\") " pod="openshift-infra/auto-csr-approver-29563960-6mgcg" Mar 18 12:40:00 crc kubenswrapper[4975]: I0318 12:40:00.510560 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563960-6mgcg" Mar 18 12:40:00 crc kubenswrapper[4975]: I0318 12:40:00.985372 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563960-6mgcg"] Mar 18 12:40:01 crc kubenswrapper[4975]: I0318 12:40:01.423538 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563960-6mgcg" event={"ID":"7a9093ac-3b95-4bca-9455-ccf2768d9da2","Type":"ContainerStarted","Data":"0169c9f9972c45eddcbdd9b7ba09ef5fa353eb53aa42966a1e853bf012b48c2c"} Mar 18 12:40:02 crc kubenswrapper[4975]: I0318 12:40:02.016393 4975 scope.go:117] "RemoveContainer" containerID="27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" Mar 18 12:40:02 crc kubenswrapper[4975]: E0318 12:40:02.016741 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:40:03 crc kubenswrapper[4975]: I0318 12:40:03.442942 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563960-6mgcg" event={"ID":"7a9093ac-3b95-4bca-9455-ccf2768d9da2","Type":"ContainerStarted","Data":"fb7f4f09936e5e6dc05002d41147fc0603b7a6cec9299ef6692fa873c5ffc049"} Mar 18 12:40:03 crc kubenswrapper[4975]: I0318 12:40:03.464404 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563960-6mgcg" podStartSLOduration=1.314515767 podStartE2EDuration="3.464384806s" podCreationTimestamp="2026-03-18 12:40:00 +0000 UTC" firstStartedPulling="2026-03-18 12:40:00.995707868 +0000 UTC m=+1786.710108447" lastFinishedPulling="2026-03-18 12:40:03.145576897 +0000 UTC m=+1788.859977486" observedRunningTime="2026-03-18 12:40:03.457863128 +0000 UTC m=+1789.172263707" watchObservedRunningTime="2026-03-18 12:40:03.464384806 +0000 UTC m=+1789.178785385" Mar 18 12:40:04 crc kubenswrapper[4975]: I0318 12:40:04.461100 4975 generic.go:334] "Generic (PLEG): container finished" podID="7a9093ac-3b95-4bca-9455-ccf2768d9da2" containerID="fb7f4f09936e5e6dc05002d41147fc0603b7a6cec9299ef6692fa873c5ffc049" exitCode=0 Mar 18 12:40:04 crc kubenswrapper[4975]: I0318 12:40:04.461262 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563960-6mgcg" event={"ID":"7a9093ac-3b95-4bca-9455-ccf2768d9da2","Type":"ContainerDied","Data":"fb7f4f09936e5e6dc05002d41147fc0603b7a6cec9299ef6692fa873c5ffc049"} Mar 18 12:40:05 crc kubenswrapper[4975]: I0318 12:40:05.819531 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563960-6mgcg" Mar 18 12:40:05 crc kubenswrapper[4975]: I0318 12:40:05.972958 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp2bg\" (UniqueName: \"kubernetes.io/projected/7a9093ac-3b95-4bca-9455-ccf2768d9da2-kube-api-access-kp2bg\") pod \"7a9093ac-3b95-4bca-9455-ccf2768d9da2\" (UID: \"7a9093ac-3b95-4bca-9455-ccf2768d9da2\") " Mar 18 12:40:05 crc kubenswrapper[4975]: I0318 12:40:05.981263 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a9093ac-3b95-4bca-9455-ccf2768d9da2-kube-api-access-kp2bg" (OuterVolumeSpecName: "kube-api-access-kp2bg") pod "7a9093ac-3b95-4bca-9455-ccf2768d9da2" (UID: "7a9093ac-3b95-4bca-9455-ccf2768d9da2"). InnerVolumeSpecName "kube-api-access-kp2bg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:40:06 crc kubenswrapper[4975]: I0318 12:40:06.076211 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp2bg\" (UniqueName: \"kubernetes.io/projected/7a9093ac-3b95-4bca-9455-ccf2768d9da2-kube-api-access-kp2bg\") on node \"crc\" DevicePath \"\"" Mar 18 12:40:06 crc kubenswrapper[4975]: I0318 12:40:06.526667 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563960-6mgcg" event={"ID":"7a9093ac-3b95-4bca-9455-ccf2768d9da2","Type":"ContainerDied","Data":"0169c9f9972c45eddcbdd9b7ba09ef5fa353eb53aa42966a1e853bf012b48c2c"} Mar 18 12:40:06 crc kubenswrapper[4975]: I0318 12:40:06.526738 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0169c9f9972c45eddcbdd9b7ba09ef5fa353eb53aa42966a1e853bf012b48c2c" Mar 18 12:40:06 crc kubenswrapper[4975]: I0318 12:40:06.526814 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563960-6mgcg" Mar 18 12:40:06 crc kubenswrapper[4975]: I0318 12:40:06.557495 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563954-6tpj8"] Mar 18 12:40:06 crc kubenswrapper[4975]: I0318 12:40:06.568343 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563954-6tpj8"] Mar 18 12:40:07 crc kubenswrapper[4975]: I0318 12:40:07.029074 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e746cdd9-ae57-4577-a9c5-eafc0aa28c09" path="/var/lib/kubelet/pods/e746cdd9-ae57-4577-a9c5-eafc0aa28c09/volumes" Mar 18 12:40:17 crc kubenswrapper[4975]: I0318 12:40:17.017693 4975 scope.go:117] "RemoveContainer" containerID="27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" Mar 18 12:40:17 crc kubenswrapper[4975]: E0318 12:40:17.018970 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:40:30 crc kubenswrapper[4975]: I0318 12:40:30.019054 4975 scope.go:117] "RemoveContainer" containerID="27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" Mar 18 12:40:30 crc kubenswrapper[4975]: E0318 12:40:30.019790 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" 
podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:40:44 crc kubenswrapper[4975]: I0318 12:40:44.411758 4975 scope.go:117] "RemoveContainer" containerID="278cdef0370e38cfc3e1a8277cf552974323a7b704059e143c2185c932a7dcfe" Mar 18 12:40:44 crc kubenswrapper[4975]: I0318 12:40:44.468376 4975 scope.go:117] "RemoveContainer" containerID="6b28ba84cab9eb6dd833f8a2c43c58ed4371a9c34e96c677b896512ab2e431d3" Mar 18 12:40:44 crc kubenswrapper[4975]: I0318 12:40:44.519159 4975 scope.go:117] "RemoveContainer" containerID="642e2852d16bd31017d771c002487e405e4cae85dfc9dd86dd5a68b60b072411" Mar 18 12:40:44 crc kubenswrapper[4975]: I0318 12:40:44.547405 4975 scope.go:117] "RemoveContainer" containerID="f8b378d86645ee66d48fd1b006bf435578d9a300ec9db8a1a7f64c30f51792f3" Mar 18 12:40:44 crc kubenswrapper[4975]: I0318 12:40:44.574951 4975 scope.go:117] "RemoveContainer" containerID="085c3bcf8fe1108084d797f0f69756fc886932b4a0540f844652adcd1deb2117" Mar 18 12:40:45 crc kubenswrapper[4975]: I0318 12:40:45.024293 4975 scope.go:117] "RemoveContainer" containerID="27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" Mar 18 12:40:45 crc kubenswrapper[4975]: E0318 12:40:45.025341 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:40:58 crc kubenswrapper[4975]: I0318 12:40:58.017086 4975 scope.go:117] "RemoveContainer" containerID="27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" Mar 18 12:40:58 crc kubenswrapper[4975]: E0318 12:40:58.017823 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:41:12 crc kubenswrapper[4975]: I0318 12:41:12.016550 4975 scope.go:117] "RemoveContainer" containerID="27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" Mar 18 12:41:12 crc kubenswrapper[4975]: E0318 12:41:12.017403 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:41:26 crc kubenswrapper[4975]: I0318 12:41:26.017130 4975 scope.go:117] "RemoveContainer" containerID="27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" Mar 18 12:41:26 crc kubenswrapper[4975]: I0318 12:41:26.398211 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerStarted","Data":"e1445fab85886838c0ceb2338548e8dbe5d869da1c9483fa6052e6d794e0e6d2"} Mar 18 12:41:42 crc kubenswrapper[4975]: I0318 12:41:42.004451 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gtn72"] Mar 18 12:41:42 crc kubenswrapper[4975]: E0318 12:41:42.005716 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a9093ac-3b95-4bca-9455-ccf2768d9da2" containerName="oc" Mar 18 12:41:42 crc kubenswrapper[4975]: I0318 12:41:42.005739 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a9093ac-3b95-4bca-9455-ccf2768d9da2" 
containerName="oc" Mar 18 12:41:42 crc kubenswrapper[4975]: I0318 12:41:42.006153 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a9093ac-3b95-4bca-9455-ccf2768d9da2" containerName="oc" Mar 18 12:41:42 crc kubenswrapper[4975]: I0318 12:41:42.008523 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gtn72" Mar 18 12:41:42 crc kubenswrapper[4975]: I0318 12:41:42.021755 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gtn72"] Mar 18 12:41:42 crc kubenswrapper[4975]: I0318 12:41:42.098526 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68d90357-cf04-4517-8359-a910ea68d7ad-catalog-content\") pod \"redhat-operators-gtn72\" (UID: \"68d90357-cf04-4517-8359-a910ea68d7ad\") " pod="openshift-marketplace/redhat-operators-gtn72" Mar 18 12:41:42 crc kubenswrapper[4975]: I0318 12:41:42.098662 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hxkf\" (UniqueName: \"kubernetes.io/projected/68d90357-cf04-4517-8359-a910ea68d7ad-kube-api-access-8hxkf\") pod \"redhat-operators-gtn72\" (UID: \"68d90357-cf04-4517-8359-a910ea68d7ad\") " pod="openshift-marketplace/redhat-operators-gtn72" Mar 18 12:41:42 crc kubenswrapper[4975]: I0318 12:41:42.098748 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68d90357-cf04-4517-8359-a910ea68d7ad-utilities\") pod \"redhat-operators-gtn72\" (UID: \"68d90357-cf04-4517-8359-a910ea68d7ad\") " pod="openshift-marketplace/redhat-operators-gtn72" Mar 18 12:41:42 crc kubenswrapper[4975]: I0318 12:41:42.200806 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/68d90357-cf04-4517-8359-a910ea68d7ad-catalog-content\") pod \"redhat-operators-gtn72\" (UID: \"68d90357-cf04-4517-8359-a910ea68d7ad\") " pod="openshift-marketplace/redhat-operators-gtn72" Mar 18 12:41:42 crc kubenswrapper[4975]: I0318 12:41:42.201227 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hxkf\" (UniqueName: \"kubernetes.io/projected/68d90357-cf04-4517-8359-a910ea68d7ad-kube-api-access-8hxkf\") pod \"redhat-operators-gtn72\" (UID: \"68d90357-cf04-4517-8359-a910ea68d7ad\") " pod="openshift-marketplace/redhat-operators-gtn72" Mar 18 12:41:42 crc kubenswrapper[4975]: I0318 12:41:42.201403 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68d90357-cf04-4517-8359-a910ea68d7ad-catalog-content\") pod \"redhat-operators-gtn72\" (UID: \"68d90357-cf04-4517-8359-a910ea68d7ad\") " pod="openshift-marketplace/redhat-operators-gtn72" Mar 18 12:41:42 crc kubenswrapper[4975]: I0318 12:41:42.201527 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68d90357-cf04-4517-8359-a910ea68d7ad-utilities\") pod \"redhat-operators-gtn72\" (UID: \"68d90357-cf04-4517-8359-a910ea68d7ad\") " pod="openshift-marketplace/redhat-operators-gtn72" Mar 18 12:41:42 crc kubenswrapper[4975]: I0318 12:41:42.201910 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68d90357-cf04-4517-8359-a910ea68d7ad-utilities\") pod \"redhat-operators-gtn72\" (UID: \"68d90357-cf04-4517-8359-a910ea68d7ad\") " pod="openshift-marketplace/redhat-operators-gtn72" Mar 18 12:41:42 crc kubenswrapper[4975]: I0318 12:41:42.230159 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hxkf\" (UniqueName: 
\"kubernetes.io/projected/68d90357-cf04-4517-8359-a910ea68d7ad-kube-api-access-8hxkf\") pod \"redhat-operators-gtn72\" (UID: \"68d90357-cf04-4517-8359-a910ea68d7ad\") " pod="openshift-marketplace/redhat-operators-gtn72" Mar 18 12:41:42 crc kubenswrapper[4975]: I0318 12:41:42.324846 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gtn72" Mar 18 12:41:42 crc kubenswrapper[4975]: I0318 12:41:42.853943 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gtn72"] Mar 18 12:41:43 crc kubenswrapper[4975]: I0318 12:41:43.565623 4975 generic.go:334] "Generic (PLEG): container finished" podID="68d90357-cf04-4517-8359-a910ea68d7ad" containerID="67a9612fb3ff592bb466f5e6d62cc8b2a6ab38558c7b46d41b38686171de38ca" exitCode=0 Mar 18 12:41:43 crc kubenswrapper[4975]: I0318 12:41:43.565719 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gtn72" event={"ID":"68d90357-cf04-4517-8359-a910ea68d7ad","Type":"ContainerDied","Data":"67a9612fb3ff592bb466f5e6d62cc8b2a6ab38558c7b46d41b38686171de38ca"} Mar 18 12:41:43 crc kubenswrapper[4975]: I0318 12:41:43.565992 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gtn72" event={"ID":"68d90357-cf04-4517-8359-a910ea68d7ad","Type":"ContainerStarted","Data":"78d90c83735e2205948a43de5d97c87aa2e5c007968e7b250a62c1a9376975f2"} Mar 18 12:41:43 crc kubenswrapper[4975]: I0318 12:41:43.567880 4975 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 12:41:45 crc kubenswrapper[4975]: I0318 12:41:45.590834 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gtn72" event={"ID":"68d90357-cf04-4517-8359-a910ea68d7ad","Type":"ContainerStarted","Data":"bb80e2c4005ddc2c569f755f9e2b2446c665c10f133ded085bc3a5d87fcd272c"} Mar 18 12:41:46 crc 
kubenswrapper[4975]: I0318 12:41:46.580639 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dltlf"] Mar 18 12:41:46 crc kubenswrapper[4975]: I0318 12:41:46.584176 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dltlf" Mar 18 12:41:46 crc kubenswrapper[4975]: I0318 12:41:46.598632 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dltlf"] Mar 18 12:41:46 crc kubenswrapper[4975]: I0318 12:41:46.686899 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63fa7cd-428d-4f0b-8314-16626ff2321d-utilities\") pod \"certified-operators-dltlf\" (UID: \"f63fa7cd-428d-4f0b-8314-16626ff2321d\") " pod="openshift-marketplace/certified-operators-dltlf" Mar 18 12:41:46 crc kubenswrapper[4975]: I0318 12:41:46.686959 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63fa7cd-428d-4f0b-8314-16626ff2321d-catalog-content\") pod \"certified-operators-dltlf\" (UID: \"f63fa7cd-428d-4f0b-8314-16626ff2321d\") " pod="openshift-marketplace/certified-operators-dltlf" Mar 18 12:41:46 crc kubenswrapper[4975]: I0318 12:41:46.687197 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5gl8\" (UniqueName: \"kubernetes.io/projected/f63fa7cd-428d-4f0b-8314-16626ff2321d-kube-api-access-s5gl8\") pod \"certified-operators-dltlf\" (UID: \"f63fa7cd-428d-4f0b-8314-16626ff2321d\") " pod="openshift-marketplace/certified-operators-dltlf" Mar 18 12:41:46 crc kubenswrapper[4975]: I0318 12:41:46.789248 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f63fa7cd-428d-4f0b-8314-16626ff2321d-utilities\") pod \"certified-operators-dltlf\" (UID: \"f63fa7cd-428d-4f0b-8314-16626ff2321d\") " pod="openshift-marketplace/certified-operators-dltlf" Mar 18 12:41:46 crc kubenswrapper[4975]: I0318 12:41:46.789316 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63fa7cd-428d-4f0b-8314-16626ff2321d-catalog-content\") pod \"certified-operators-dltlf\" (UID: \"f63fa7cd-428d-4f0b-8314-16626ff2321d\") " pod="openshift-marketplace/certified-operators-dltlf" Mar 18 12:41:46 crc kubenswrapper[4975]: I0318 12:41:46.789375 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5gl8\" (UniqueName: \"kubernetes.io/projected/f63fa7cd-428d-4f0b-8314-16626ff2321d-kube-api-access-s5gl8\") pod \"certified-operators-dltlf\" (UID: \"f63fa7cd-428d-4f0b-8314-16626ff2321d\") " pod="openshift-marketplace/certified-operators-dltlf" Mar 18 12:41:46 crc kubenswrapper[4975]: I0318 12:41:46.790059 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63fa7cd-428d-4f0b-8314-16626ff2321d-utilities\") pod \"certified-operators-dltlf\" (UID: \"f63fa7cd-428d-4f0b-8314-16626ff2321d\") " pod="openshift-marketplace/certified-operators-dltlf" Mar 18 12:41:46 crc kubenswrapper[4975]: I0318 12:41:46.790107 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63fa7cd-428d-4f0b-8314-16626ff2321d-catalog-content\") pod \"certified-operators-dltlf\" (UID: \"f63fa7cd-428d-4f0b-8314-16626ff2321d\") " pod="openshift-marketplace/certified-operators-dltlf" Mar 18 12:41:46 crc kubenswrapper[4975]: I0318 12:41:46.808585 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5gl8\" (UniqueName: 
\"kubernetes.io/projected/f63fa7cd-428d-4f0b-8314-16626ff2321d-kube-api-access-s5gl8\") pod \"certified-operators-dltlf\" (UID: \"f63fa7cd-428d-4f0b-8314-16626ff2321d\") " pod="openshift-marketplace/certified-operators-dltlf" Mar 18 12:41:46 crc kubenswrapper[4975]: I0318 12:41:46.912550 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dltlf" Mar 18 12:41:47 crc kubenswrapper[4975]: I0318 12:41:47.205791 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dltlf"] Mar 18 12:41:47 crc kubenswrapper[4975]: I0318 12:41:47.613969 4975 generic.go:334] "Generic (PLEG): container finished" podID="f63fa7cd-428d-4f0b-8314-16626ff2321d" containerID="fae582b07c7123ef64e6f6633a6ea0ff524d6a5c8742b4f04061079f4cc8186c" exitCode=0 Mar 18 12:41:47 crc kubenswrapper[4975]: I0318 12:41:47.614034 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dltlf" event={"ID":"f63fa7cd-428d-4f0b-8314-16626ff2321d","Type":"ContainerDied","Data":"fae582b07c7123ef64e6f6633a6ea0ff524d6a5c8742b4f04061079f4cc8186c"} Mar 18 12:41:47 crc kubenswrapper[4975]: I0318 12:41:47.614366 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dltlf" event={"ID":"f63fa7cd-428d-4f0b-8314-16626ff2321d","Type":"ContainerStarted","Data":"e99eb36030e78020e08404ac89338f651e0bddfb150817409b208107fcdb0d66"} Mar 18 12:41:47 crc kubenswrapper[4975]: I0318 12:41:47.618019 4975 generic.go:334] "Generic (PLEG): container finished" podID="68d90357-cf04-4517-8359-a910ea68d7ad" containerID="bb80e2c4005ddc2c569f755f9e2b2446c665c10f133ded085bc3a5d87fcd272c" exitCode=0 Mar 18 12:41:47 crc kubenswrapper[4975]: I0318 12:41:47.618068 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gtn72" 
event={"ID":"68d90357-cf04-4517-8359-a910ea68d7ad","Type":"ContainerDied","Data":"bb80e2c4005ddc2c569f755f9e2b2446c665c10f133ded085bc3a5d87fcd272c"} Mar 18 12:41:48 crc kubenswrapper[4975]: I0318 12:41:48.628281 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gtn72" event={"ID":"68d90357-cf04-4517-8359-a910ea68d7ad","Type":"ContainerStarted","Data":"370c83d13520e89d0c971014d5f50b044b07292bc042f078acba3a92d02e503e"} Mar 18 12:41:48 crc kubenswrapper[4975]: I0318 12:41:48.630995 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dltlf" event={"ID":"f63fa7cd-428d-4f0b-8314-16626ff2321d","Type":"ContainerStarted","Data":"e2de1dcbf66df91a3c08317a67f7f014cf70bdbd78a2815fde242d351586ad88"} Mar 18 12:41:48 crc kubenswrapper[4975]: I0318 12:41:48.652142 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gtn72" podStartSLOduration=3.183885355 podStartE2EDuration="7.652116332s" podCreationTimestamp="2026-03-18 12:41:41 +0000 UTC" firstStartedPulling="2026-03-18 12:41:43.567522264 +0000 UTC m=+1889.281922833" lastFinishedPulling="2026-03-18 12:41:48.035753241 +0000 UTC m=+1893.750153810" observedRunningTime="2026-03-18 12:41:48.644985956 +0000 UTC m=+1894.359386545" watchObservedRunningTime="2026-03-18 12:41:48.652116332 +0000 UTC m=+1894.366516911" Mar 18 12:41:49 crc kubenswrapper[4975]: I0318 12:41:49.641694 4975 generic.go:334] "Generic (PLEG): container finished" podID="f63fa7cd-428d-4f0b-8314-16626ff2321d" containerID="e2de1dcbf66df91a3c08317a67f7f014cf70bdbd78a2815fde242d351586ad88" exitCode=0 Mar 18 12:41:49 crc kubenswrapper[4975]: I0318 12:41:49.641798 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dltlf" 
event={"ID":"f63fa7cd-428d-4f0b-8314-16626ff2321d","Type":"ContainerDied","Data":"e2de1dcbf66df91a3c08317a67f7f014cf70bdbd78a2815fde242d351586ad88"} Mar 18 12:41:50 crc kubenswrapper[4975]: I0318 12:41:50.654007 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dltlf" event={"ID":"f63fa7cd-428d-4f0b-8314-16626ff2321d","Type":"ContainerStarted","Data":"ae3a4b970475655786ca538ae20f50692cdf4b1445e687cd8c5774b0218bfc18"} Mar 18 12:41:50 crc kubenswrapper[4975]: I0318 12:41:50.676677 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dltlf" podStartSLOduration=2.257254088 podStartE2EDuration="4.676656985s" podCreationTimestamp="2026-03-18 12:41:46 +0000 UTC" firstStartedPulling="2026-03-18 12:41:47.616573171 +0000 UTC m=+1893.330973750" lastFinishedPulling="2026-03-18 12:41:50.035976048 +0000 UTC m=+1895.750376647" observedRunningTime="2026-03-18 12:41:50.672279375 +0000 UTC m=+1896.386679954" watchObservedRunningTime="2026-03-18 12:41:50.676656985 +0000 UTC m=+1896.391057574" Mar 18 12:41:52 crc kubenswrapper[4975]: I0318 12:41:52.325805 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gtn72" Mar 18 12:41:52 crc kubenswrapper[4975]: I0318 12:41:52.329198 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gtn72" Mar 18 12:41:53 crc kubenswrapper[4975]: I0318 12:41:53.389140 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gtn72" podUID="68d90357-cf04-4517-8359-a910ea68d7ad" containerName="registry-server" probeResult="failure" output=< Mar 18 12:41:53 crc kubenswrapper[4975]: timeout: failed to connect service ":50051" within 1s Mar 18 12:41:53 crc kubenswrapper[4975]: > Mar 18 12:41:56 crc kubenswrapper[4975]: I0318 12:41:56.913522 4975 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dltlf" Mar 18 12:41:56 crc kubenswrapper[4975]: I0318 12:41:56.914778 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dltlf" Mar 18 12:41:56 crc kubenswrapper[4975]: I0318 12:41:56.958140 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dltlf" Mar 18 12:41:57 crc kubenswrapper[4975]: I0318 12:41:57.771173 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dltlf" Mar 18 12:41:59 crc kubenswrapper[4975]: I0318 12:41:59.306058 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dltlf"] Mar 18 12:41:59 crc kubenswrapper[4975]: I0318 12:41:59.727473 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dltlf" podUID="f63fa7cd-428d-4f0b-8314-16626ff2321d" containerName="registry-server" containerID="cri-o://ae3a4b970475655786ca538ae20f50692cdf4b1445e687cd8c5774b0218bfc18" gracePeriod=2 Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.148958 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563962-jg5vd"] Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.152439 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563962-jg5vd" Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.161154 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563962-jg5vd"] Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.166003 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.166252 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.166680 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.246663 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dltlf" Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.259607 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctj5v\" (UniqueName: \"kubernetes.io/projected/e80a4707-2d9f-43da-864c-c970bee90b18-kube-api-access-ctj5v\") pod \"auto-csr-approver-29563962-jg5vd\" (UID: \"e80a4707-2d9f-43da-864c-c970bee90b18\") " pod="openshift-infra/auto-csr-approver-29563962-jg5vd" Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.360704 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5gl8\" (UniqueName: \"kubernetes.io/projected/f63fa7cd-428d-4f0b-8314-16626ff2321d-kube-api-access-s5gl8\") pod \"f63fa7cd-428d-4f0b-8314-16626ff2321d\" (UID: \"f63fa7cd-428d-4f0b-8314-16626ff2321d\") " Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.360794 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f63fa7cd-428d-4f0b-8314-16626ff2321d-utilities\") pod \"f63fa7cd-428d-4f0b-8314-16626ff2321d\" (UID: \"f63fa7cd-428d-4f0b-8314-16626ff2321d\") " Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.360883 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63fa7cd-428d-4f0b-8314-16626ff2321d-catalog-content\") pod \"f63fa7cd-428d-4f0b-8314-16626ff2321d\" (UID: \"f63fa7cd-428d-4f0b-8314-16626ff2321d\") " Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.361302 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctj5v\" (UniqueName: \"kubernetes.io/projected/e80a4707-2d9f-43da-864c-c970bee90b18-kube-api-access-ctj5v\") pod \"auto-csr-approver-29563962-jg5vd\" (UID: \"e80a4707-2d9f-43da-864c-c970bee90b18\") " pod="openshift-infra/auto-csr-approver-29563962-jg5vd" Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.362635 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f63fa7cd-428d-4f0b-8314-16626ff2321d-utilities" (OuterVolumeSpecName: "utilities") pod "f63fa7cd-428d-4f0b-8314-16626ff2321d" (UID: "f63fa7cd-428d-4f0b-8314-16626ff2321d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.367649 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f63fa7cd-428d-4f0b-8314-16626ff2321d-kube-api-access-s5gl8" (OuterVolumeSpecName: "kube-api-access-s5gl8") pod "f63fa7cd-428d-4f0b-8314-16626ff2321d" (UID: "f63fa7cd-428d-4f0b-8314-16626ff2321d"). InnerVolumeSpecName "kube-api-access-s5gl8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.383336 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctj5v\" (UniqueName: \"kubernetes.io/projected/e80a4707-2d9f-43da-864c-c970bee90b18-kube-api-access-ctj5v\") pod \"auto-csr-approver-29563962-jg5vd\" (UID: \"e80a4707-2d9f-43da-864c-c970bee90b18\") " pod="openshift-infra/auto-csr-approver-29563962-jg5vd" Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.410588 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f63fa7cd-428d-4f0b-8314-16626ff2321d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f63fa7cd-428d-4f0b-8314-16626ff2321d" (UID: "f63fa7cd-428d-4f0b-8314-16626ff2321d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.462925 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5gl8\" (UniqueName: \"kubernetes.io/projected/f63fa7cd-428d-4f0b-8314-16626ff2321d-kube-api-access-s5gl8\") on node \"crc\" DevicePath \"\"" Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.462964 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63fa7cd-428d-4f0b-8314-16626ff2321d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.462978 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63fa7cd-428d-4f0b-8314-16626ff2321d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.559839 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563962-jg5vd" Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.746397 4975 generic.go:334] "Generic (PLEG): container finished" podID="f63fa7cd-428d-4f0b-8314-16626ff2321d" containerID="ae3a4b970475655786ca538ae20f50692cdf4b1445e687cd8c5774b0218bfc18" exitCode=0 Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.746472 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dltlf" Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.746478 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dltlf" event={"ID":"f63fa7cd-428d-4f0b-8314-16626ff2321d","Type":"ContainerDied","Data":"ae3a4b970475655786ca538ae20f50692cdf4b1445e687cd8c5774b0218bfc18"} Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.747354 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dltlf" event={"ID":"f63fa7cd-428d-4f0b-8314-16626ff2321d","Type":"ContainerDied","Data":"e99eb36030e78020e08404ac89338f651e0bddfb150817409b208107fcdb0d66"} Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.747375 4975 scope.go:117] "RemoveContainer" containerID="ae3a4b970475655786ca538ae20f50692cdf4b1445e687cd8c5774b0218bfc18" Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.795387 4975 scope.go:117] "RemoveContainer" containerID="e2de1dcbf66df91a3c08317a67f7f014cf70bdbd78a2815fde242d351586ad88" Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.809214 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dltlf"] Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.827540 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dltlf"] Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.829337 4975 scope.go:117] "RemoveContainer" 
containerID="fae582b07c7123ef64e6f6633a6ea0ff524d6a5c8742b4f04061079f4cc8186c" Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.905662 4975 scope.go:117] "RemoveContainer" containerID="ae3a4b970475655786ca538ae20f50692cdf4b1445e687cd8c5774b0218bfc18" Mar 18 12:42:00 crc kubenswrapper[4975]: E0318 12:42:00.906168 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae3a4b970475655786ca538ae20f50692cdf4b1445e687cd8c5774b0218bfc18\": container with ID starting with ae3a4b970475655786ca538ae20f50692cdf4b1445e687cd8c5774b0218bfc18 not found: ID does not exist" containerID="ae3a4b970475655786ca538ae20f50692cdf4b1445e687cd8c5774b0218bfc18" Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.906218 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae3a4b970475655786ca538ae20f50692cdf4b1445e687cd8c5774b0218bfc18"} err="failed to get container status \"ae3a4b970475655786ca538ae20f50692cdf4b1445e687cd8c5774b0218bfc18\": rpc error: code = NotFound desc = could not find container \"ae3a4b970475655786ca538ae20f50692cdf4b1445e687cd8c5774b0218bfc18\": container with ID starting with ae3a4b970475655786ca538ae20f50692cdf4b1445e687cd8c5774b0218bfc18 not found: ID does not exist" Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.906248 4975 scope.go:117] "RemoveContainer" containerID="e2de1dcbf66df91a3c08317a67f7f014cf70bdbd78a2815fde242d351586ad88" Mar 18 12:42:00 crc kubenswrapper[4975]: E0318 12:42:00.906614 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2de1dcbf66df91a3c08317a67f7f014cf70bdbd78a2815fde242d351586ad88\": container with ID starting with e2de1dcbf66df91a3c08317a67f7f014cf70bdbd78a2815fde242d351586ad88 not found: ID does not exist" containerID="e2de1dcbf66df91a3c08317a67f7f014cf70bdbd78a2815fde242d351586ad88" Mar 18 12:42:00 crc 
kubenswrapper[4975]: I0318 12:42:00.906640 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2de1dcbf66df91a3c08317a67f7f014cf70bdbd78a2815fde242d351586ad88"} err="failed to get container status \"e2de1dcbf66df91a3c08317a67f7f014cf70bdbd78a2815fde242d351586ad88\": rpc error: code = NotFound desc = could not find container \"e2de1dcbf66df91a3c08317a67f7f014cf70bdbd78a2815fde242d351586ad88\": container with ID starting with e2de1dcbf66df91a3c08317a67f7f014cf70bdbd78a2815fde242d351586ad88 not found: ID does not exist" Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.906655 4975 scope.go:117] "RemoveContainer" containerID="fae582b07c7123ef64e6f6633a6ea0ff524d6a5c8742b4f04061079f4cc8186c" Mar 18 12:42:00 crc kubenswrapper[4975]: E0318 12:42:00.908126 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fae582b07c7123ef64e6f6633a6ea0ff524d6a5c8742b4f04061079f4cc8186c\": container with ID starting with fae582b07c7123ef64e6f6633a6ea0ff524d6a5c8742b4f04061079f4cc8186c not found: ID does not exist" containerID="fae582b07c7123ef64e6f6633a6ea0ff524d6a5c8742b4f04061079f4cc8186c" Mar 18 12:42:00 crc kubenswrapper[4975]: I0318 12:42:00.908170 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fae582b07c7123ef64e6f6633a6ea0ff524d6a5c8742b4f04061079f4cc8186c"} err="failed to get container status \"fae582b07c7123ef64e6f6633a6ea0ff524d6a5c8742b4f04061079f4cc8186c\": rpc error: code = NotFound desc = could not find container \"fae582b07c7123ef64e6f6633a6ea0ff524d6a5c8742b4f04061079f4cc8186c\": container with ID starting with fae582b07c7123ef64e6f6633a6ea0ff524d6a5c8742b4f04061079f4cc8186c not found: ID does not exist" Mar 18 12:42:01 crc kubenswrapper[4975]: I0318 12:42:01.027197 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f63fa7cd-428d-4f0b-8314-16626ff2321d" 
path="/var/lib/kubelet/pods/f63fa7cd-428d-4f0b-8314-16626ff2321d/volumes" Mar 18 12:42:01 crc kubenswrapper[4975]: I0318 12:42:01.120993 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563962-jg5vd"] Mar 18 12:42:01 crc kubenswrapper[4975]: W0318 12:42:01.129629 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode80a4707_2d9f_43da_864c_c970bee90b18.slice/crio-acf9f17d919bf2d5696a1380c6b6993126e7452cf32eb2e9dbae52c4815a9b9a WatchSource:0}: Error finding container acf9f17d919bf2d5696a1380c6b6993126e7452cf32eb2e9dbae52c4815a9b9a: Status 404 returned error can't find the container with id acf9f17d919bf2d5696a1380c6b6993126e7452cf32eb2e9dbae52c4815a9b9a Mar 18 12:42:01 crc kubenswrapper[4975]: I0318 12:42:01.760345 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563962-jg5vd" event={"ID":"e80a4707-2d9f-43da-864c-c970bee90b18","Type":"ContainerStarted","Data":"acf9f17d919bf2d5696a1380c6b6993126e7452cf32eb2e9dbae52c4815a9b9a"} Mar 18 12:42:02 crc kubenswrapper[4975]: I0318 12:42:02.411954 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gtn72" Mar 18 12:42:02 crc kubenswrapper[4975]: I0318 12:42:02.460975 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gtn72" Mar 18 12:42:02 crc kubenswrapper[4975]: I0318 12:42:02.771979 4975 generic.go:334] "Generic (PLEG): container finished" podID="e80a4707-2d9f-43da-864c-c970bee90b18" containerID="14c2361ca3b33c8d60d613e9c1f4341986a62e181b9a187575e93598feda6da2" exitCode=0 Mar 18 12:42:02 crc kubenswrapper[4975]: I0318 12:42:02.772091 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563962-jg5vd" 
event={"ID":"e80a4707-2d9f-43da-864c-c970bee90b18","Type":"ContainerDied","Data":"14c2361ca3b33c8d60d613e9c1f4341986a62e181b9a187575e93598feda6da2"} Mar 18 12:42:03 crc kubenswrapper[4975]: I0318 12:42:03.511355 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gtn72"] Mar 18 12:42:03 crc kubenswrapper[4975]: I0318 12:42:03.784650 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gtn72" podUID="68d90357-cf04-4517-8359-a910ea68d7ad" containerName="registry-server" containerID="cri-o://370c83d13520e89d0c971014d5f50b044b07292bc042f078acba3a92d02e503e" gracePeriod=2 Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.115642 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563962-jg5vd" Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.239074 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gtn72" Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.248334 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctj5v\" (UniqueName: \"kubernetes.io/projected/e80a4707-2d9f-43da-864c-c970bee90b18-kube-api-access-ctj5v\") pod \"e80a4707-2d9f-43da-864c-c970bee90b18\" (UID: \"e80a4707-2d9f-43da-864c-c970bee90b18\") " Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.254594 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e80a4707-2d9f-43da-864c-c970bee90b18-kube-api-access-ctj5v" (OuterVolumeSpecName: "kube-api-access-ctj5v") pod "e80a4707-2d9f-43da-864c-c970bee90b18" (UID: "e80a4707-2d9f-43da-864c-c970bee90b18"). InnerVolumeSpecName "kube-api-access-ctj5v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.349630 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hxkf\" (UniqueName: \"kubernetes.io/projected/68d90357-cf04-4517-8359-a910ea68d7ad-kube-api-access-8hxkf\") pod \"68d90357-cf04-4517-8359-a910ea68d7ad\" (UID: \"68d90357-cf04-4517-8359-a910ea68d7ad\") " Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.349863 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68d90357-cf04-4517-8359-a910ea68d7ad-utilities\") pod \"68d90357-cf04-4517-8359-a910ea68d7ad\" (UID: \"68d90357-cf04-4517-8359-a910ea68d7ad\") " Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.350003 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68d90357-cf04-4517-8359-a910ea68d7ad-catalog-content\") pod \"68d90357-cf04-4517-8359-a910ea68d7ad\" (UID: \"68d90357-cf04-4517-8359-a910ea68d7ad\") " Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.350503 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68d90357-cf04-4517-8359-a910ea68d7ad-utilities" (OuterVolumeSpecName: "utilities") pod "68d90357-cf04-4517-8359-a910ea68d7ad" (UID: "68d90357-cf04-4517-8359-a910ea68d7ad"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.350614 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctj5v\" (UniqueName: \"kubernetes.io/projected/e80a4707-2d9f-43da-864c-c970bee90b18-kube-api-access-ctj5v\") on node \"crc\" DevicePath \"\"" Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.354217 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68d90357-cf04-4517-8359-a910ea68d7ad-kube-api-access-8hxkf" (OuterVolumeSpecName: "kube-api-access-8hxkf") pod "68d90357-cf04-4517-8359-a910ea68d7ad" (UID: "68d90357-cf04-4517-8359-a910ea68d7ad"). InnerVolumeSpecName "kube-api-access-8hxkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.471728 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hxkf\" (UniqueName: \"kubernetes.io/projected/68d90357-cf04-4517-8359-a910ea68d7ad-kube-api-access-8hxkf\") on node \"crc\" DevicePath \"\"" Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.472080 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68d90357-cf04-4517-8359-a910ea68d7ad-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.505696 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68d90357-cf04-4517-8359-a910ea68d7ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "68d90357-cf04-4517-8359-a910ea68d7ad" (UID: "68d90357-cf04-4517-8359-a910ea68d7ad"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.573253 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68d90357-cf04-4517-8359-a910ea68d7ad-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.797268 4975 generic.go:334] "Generic (PLEG): container finished" podID="68d90357-cf04-4517-8359-a910ea68d7ad" containerID="370c83d13520e89d0c971014d5f50b044b07292bc042f078acba3a92d02e503e" exitCode=0 Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.797304 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gtn72" Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.797328 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gtn72" event={"ID":"68d90357-cf04-4517-8359-a910ea68d7ad","Type":"ContainerDied","Data":"370c83d13520e89d0c971014d5f50b044b07292bc042f078acba3a92d02e503e"} Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.797369 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gtn72" event={"ID":"68d90357-cf04-4517-8359-a910ea68d7ad","Type":"ContainerDied","Data":"78d90c83735e2205948a43de5d97c87aa2e5c007968e7b250a62c1a9376975f2"} Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.797392 4975 scope.go:117] "RemoveContainer" containerID="370c83d13520e89d0c971014d5f50b044b07292bc042f078acba3a92d02e503e" Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.800165 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563962-jg5vd" event={"ID":"e80a4707-2d9f-43da-864c-c970bee90b18","Type":"ContainerDied","Data":"acf9f17d919bf2d5696a1380c6b6993126e7452cf32eb2e9dbae52c4815a9b9a"} Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.800192 4975 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acf9f17d919bf2d5696a1380c6b6993126e7452cf32eb2e9dbae52c4815a9b9a" Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.800226 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563962-jg5vd" Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.824844 4975 scope.go:117] "RemoveContainer" containerID="bb80e2c4005ddc2c569f755f9e2b2446c665c10f133ded085bc3a5d87fcd272c" Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.836749 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gtn72"] Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.849018 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gtn72"] Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.854720 4975 scope.go:117] "RemoveContainer" containerID="67a9612fb3ff592bb466f5e6d62cc8b2a6ab38558c7b46d41b38686171de38ca" Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.870930 4975 scope.go:117] "RemoveContainer" containerID="370c83d13520e89d0c971014d5f50b044b07292bc042f078acba3a92d02e503e" Mar 18 12:42:04 crc kubenswrapper[4975]: E0318 12:42:04.871390 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"370c83d13520e89d0c971014d5f50b044b07292bc042f078acba3a92d02e503e\": container with ID starting with 370c83d13520e89d0c971014d5f50b044b07292bc042f078acba3a92d02e503e not found: ID does not exist" containerID="370c83d13520e89d0c971014d5f50b044b07292bc042f078acba3a92d02e503e" Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.871422 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"370c83d13520e89d0c971014d5f50b044b07292bc042f078acba3a92d02e503e"} err="failed to get container status 
\"370c83d13520e89d0c971014d5f50b044b07292bc042f078acba3a92d02e503e\": rpc error: code = NotFound desc = could not find container \"370c83d13520e89d0c971014d5f50b044b07292bc042f078acba3a92d02e503e\": container with ID starting with 370c83d13520e89d0c971014d5f50b044b07292bc042f078acba3a92d02e503e not found: ID does not exist" Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.871446 4975 scope.go:117] "RemoveContainer" containerID="bb80e2c4005ddc2c569f755f9e2b2446c665c10f133ded085bc3a5d87fcd272c" Mar 18 12:42:04 crc kubenswrapper[4975]: E0318 12:42:04.872227 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb80e2c4005ddc2c569f755f9e2b2446c665c10f133ded085bc3a5d87fcd272c\": container with ID starting with bb80e2c4005ddc2c569f755f9e2b2446c665c10f133ded085bc3a5d87fcd272c not found: ID does not exist" containerID="bb80e2c4005ddc2c569f755f9e2b2446c665c10f133ded085bc3a5d87fcd272c" Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.872615 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb80e2c4005ddc2c569f755f9e2b2446c665c10f133ded085bc3a5d87fcd272c"} err="failed to get container status \"bb80e2c4005ddc2c569f755f9e2b2446c665c10f133ded085bc3a5d87fcd272c\": rpc error: code = NotFound desc = could not find container \"bb80e2c4005ddc2c569f755f9e2b2446c665c10f133ded085bc3a5d87fcd272c\": container with ID starting with bb80e2c4005ddc2c569f755f9e2b2446c665c10f133ded085bc3a5d87fcd272c not found: ID does not exist" Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.872662 4975 scope.go:117] "RemoveContainer" containerID="67a9612fb3ff592bb466f5e6d62cc8b2a6ab38558c7b46d41b38686171de38ca" Mar 18 12:42:04 crc kubenswrapper[4975]: E0318 12:42:04.872993 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"67a9612fb3ff592bb466f5e6d62cc8b2a6ab38558c7b46d41b38686171de38ca\": container with ID starting with 67a9612fb3ff592bb466f5e6d62cc8b2a6ab38558c7b46d41b38686171de38ca not found: ID does not exist" containerID="67a9612fb3ff592bb466f5e6d62cc8b2a6ab38558c7b46d41b38686171de38ca" Mar 18 12:42:04 crc kubenswrapper[4975]: I0318 12:42:04.873018 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67a9612fb3ff592bb466f5e6d62cc8b2a6ab38558c7b46d41b38686171de38ca"} err="failed to get container status \"67a9612fb3ff592bb466f5e6d62cc8b2a6ab38558c7b46d41b38686171de38ca\": rpc error: code = NotFound desc = could not find container \"67a9612fb3ff592bb466f5e6d62cc8b2a6ab38558c7b46d41b38686171de38ca\": container with ID starting with 67a9612fb3ff592bb466f5e6d62cc8b2a6ab38558c7b46d41b38686171de38ca not found: ID does not exist" Mar 18 12:42:05 crc kubenswrapper[4975]: I0318 12:42:05.030413 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68d90357-cf04-4517-8359-a910ea68d7ad" path="/var/lib/kubelet/pods/68d90357-cf04-4517-8359-a910ea68d7ad/volumes" Mar 18 12:42:05 crc kubenswrapper[4975]: I0318 12:42:05.200380 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563956-lqdr4"] Mar 18 12:42:05 crc kubenswrapper[4975]: I0318 12:42:05.210593 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563956-lqdr4"] Mar 18 12:42:07 crc kubenswrapper[4975]: I0318 12:42:07.026223 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65f54532-e4a2-4f73-b254-aa7e0bd6a04e" path="/var/lib/kubelet/pods/65f54532-e4a2-4f73-b254-aa7e0bd6a04e/volumes" Mar 18 12:42:37 crc kubenswrapper[4975]: I0318 12:42:37.117376 4975 generic.go:334] "Generic (PLEG): container finished" podID="296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd" containerID="5a4b072b79bb29a1fbdeecdf177f36476867bafcf980414975bd3a448882f755" exitCode=0 Mar 18 12:42:37 crc 
kubenswrapper[4975]: I0318 12:42:37.117484 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd" event={"ID":"296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd","Type":"ContainerDied","Data":"5a4b072b79bb29a1fbdeecdf177f36476867bafcf980414975bd3a448882f755"} Mar 18 12:42:38 crc kubenswrapper[4975]: I0318 12:42:38.603914 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd" Mar 18 12:42:38 crc kubenswrapper[4975]: I0318 12:42:38.763732 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc2mv\" (UniqueName: \"kubernetes.io/projected/296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd-kube-api-access-bc2mv\") pod \"296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd\" (UID: \"296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd\") " Mar 18 12:42:38 crc kubenswrapper[4975]: I0318 12:42:38.763808 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd-bootstrap-combined-ca-bundle\") pod \"296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd\" (UID: \"296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd\") " Mar 18 12:42:38 crc kubenswrapper[4975]: I0318 12:42:38.763967 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd-inventory\") pod \"296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd\" (UID: \"296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd\") " Mar 18 12:42:38 crc kubenswrapper[4975]: I0318 12:42:38.764042 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd-ssh-key-openstack-edpm-ipam\") pod \"296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd\" (UID: \"296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd\") " Mar 18 
12:42:38 crc kubenswrapper[4975]: I0318 12:42:38.769993 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd-kube-api-access-bc2mv" (OuterVolumeSpecName: "kube-api-access-bc2mv") pod "296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd" (UID: "296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd"). InnerVolumeSpecName "kube-api-access-bc2mv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:42:38 crc kubenswrapper[4975]: I0318 12:42:38.770002 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd" (UID: "296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:42:38 crc kubenswrapper[4975]: I0318 12:42:38.816757 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd" (UID: "296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:42:38 crc kubenswrapper[4975]: I0318 12:42:38.822295 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd-inventory" (OuterVolumeSpecName: "inventory") pod "296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd" (UID: "296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:42:38 crc kubenswrapper[4975]: I0318 12:42:38.866811 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc2mv\" (UniqueName: \"kubernetes.io/projected/296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd-kube-api-access-bc2mv\") on node \"crc\" DevicePath \"\"" Mar 18 12:42:38 crc kubenswrapper[4975]: I0318 12:42:38.866847 4975 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:42:38 crc kubenswrapper[4975]: I0318 12:42:38.866857 4975 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 12:42:38 crc kubenswrapper[4975]: I0318 12:42:38.866955 4975 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.136211 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd" event={"ID":"296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd","Type":"ContainerDied","Data":"aced124078e8b32690e90a39901281e481ecebddda26eb5a7e51587025c45ffd"} Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.136664 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aced124078e8b32690e90a39901281e481ecebddda26eb5a7e51587025c45ffd" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.136309 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.251757 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kckms"] Mar 18 12:42:39 crc kubenswrapper[4975]: E0318 12:42:39.252583 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d90357-cf04-4517-8359-a910ea68d7ad" containerName="registry-server" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.252687 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d90357-cf04-4517-8359-a910ea68d7ad" containerName="registry-server" Mar 18 12:42:39 crc kubenswrapper[4975]: E0318 12:42:39.252809 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63fa7cd-428d-4f0b-8314-16626ff2321d" containerName="registry-server" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.252921 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63fa7cd-428d-4f0b-8314-16626ff2321d" containerName="registry-server" Mar 18 12:42:39 crc kubenswrapper[4975]: E0318 12:42:39.253010 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d90357-cf04-4517-8359-a910ea68d7ad" containerName="extract-content" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.253080 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d90357-cf04-4517-8359-a910ea68d7ad" containerName="extract-content" Mar 18 12:42:39 crc kubenswrapper[4975]: E0318 12:42:39.253203 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d90357-cf04-4517-8359-a910ea68d7ad" containerName="extract-utilities" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.253275 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d90357-cf04-4517-8359-a910ea68d7ad" containerName="extract-utilities" Mar 18 12:42:39 crc kubenswrapper[4975]: E0318 12:42:39.253349 4975 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f63fa7cd-428d-4f0b-8314-16626ff2321d" containerName="extract-utilities" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.253434 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63fa7cd-428d-4f0b-8314-16626ff2321d" containerName="extract-utilities" Mar 18 12:42:39 crc kubenswrapper[4975]: E0318 12:42:39.253530 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.253619 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 12:42:39 crc kubenswrapper[4975]: E0318 12:42:39.253720 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e80a4707-2d9f-43da-864c-c970bee90b18" containerName="oc" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.253791 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="e80a4707-2d9f-43da-864c-c970bee90b18" containerName="oc" Mar 18 12:42:39 crc kubenswrapper[4975]: E0318 12:42:39.253881 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63fa7cd-428d-4f0b-8314-16626ff2321d" containerName="extract-content" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.253969 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63fa7cd-428d-4f0b-8314-16626ff2321d" containerName="extract-content" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.254284 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="f63fa7cd-428d-4f0b-8314-16626ff2321d" containerName="registry-server" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.254373 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="e80a4707-2d9f-43da-864c-c970bee90b18" containerName="oc" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.254472 4975 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="68d90357-cf04-4517-8359-a910ea68d7ad" containerName="registry-server" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.254579 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.255445 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kckms" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.258501 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.258760 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.259088 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.259297 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-77rz6" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.260352 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kckms"] Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.377768 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87jbc\" (UniqueName: \"kubernetes.io/projected/a56e99ba-eb18-4a6a-8347-564e7af719f7-kube-api-access-87jbc\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-kckms\" (UID: \"a56e99ba-eb18-4a6a-8347-564e7af719f7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kckms" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 
12:42:39.378067 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a56e99ba-eb18-4a6a-8347-564e7af719f7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-kckms\" (UID: \"a56e99ba-eb18-4a6a-8347-564e7af719f7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kckms" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.378176 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a56e99ba-eb18-4a6a-8347-564e7af719f7-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-kckms\" (UID: \"a56e99ba-eb18-4a6a-8347-564e7af719f7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kckms" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.479844 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87jbc\" (UniqueName: \"kubernetes.io/projected/a56e99ba-eb18-4a6a-8347-564e7af719f7-kube-api-access-87jbc\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-kckms\" (UID: \"a56e99ba-eb18-4a6a-8347-564e7af719f7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kckms" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.479995 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a56e99ba-eb18-4a6a-8347-564e7af719f7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-kckms\" (UID: \"a56e99ba-eb18-4a6a-8347-564e7af719f7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kckms" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.480036 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/a56e99ba-eb18-4a6a-8347-564e7af719f7-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-kckms\" (UID: \"a56e99ba-eb18-4a6a-8347-564e7af719f7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kckms" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.484889 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a56e99ba-eb18-4a6a-8347-564e7af719f7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-kckms\" (UID: \"a56e99ba-eb18-4a6a-8347-564e7af719f7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kckms" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.485812 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a56e99ba-eb18-4a6a-8347-564e7af719f7-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-kckms\" (UID: \"a56e99ba-eb18-4a6a-8347-564e7af719f7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kckms" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.501245 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87jbc\" (UniqueName: \"kubernetes.io/projected/a56e99ba-eb18-4a6a-8347-564e7af719f7-kube-api-access-87jbc\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-kckms\" (UID: \"a56e99ba-eb18-4a6a-8347-564e7af719f7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kckms" Mar 18 12:42:39 crc kubenswrapper[4975]: I0318 12:42:39.620381 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kckms" Mar 18 12:42:40 crc kubenswrapper[4975]: I0318 12:42:40.160346 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kckms"] Mar 18 12:42:41 crc kubenswrapper[4975]: I0318 12:42:41.156913 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kckms" event={"ID":"a56e99ba-eb18-4a6a-8347-564e7af719f7","Type":"ContainerStarted","Data":"aaf423f18ae17025d9cac4bcdd57dbd257b4848c92533e911f994303a4ce1ff8"} Mar 18 12:42:41 crc kubenswrapper[4975]: I0318 12:42:41.157439 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kckms" event={"ID":"a56e99ba-eb18-4a6a-8347-564e7af719f7","Type":"ContainerStarted","Data":"99c55378332d5150824d82d124c7b8f802e397b41f1f7f64542e211c031a6417"} Mar 18 12:42:41 crc kubenswrapper[4975]: I0318 12:42:41.173184 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kckms" podStartSLOduration=1.9786055519999999 podStartE2EDuration="2.173156458s" podCreationTimestamp="2026-03-18 12:42:39 +0000 UTC" firstStartedPulling="2026-03-18 12:42:40.165469793 +0000 UTC m=+1945.879870372" lastFinishedPulling="2026-03-18 12:42:40.360020699 +0000 UTC m=+1946.074421278" observedRunningTime="2026-03-18 12:42:41.169386235 +0000 UTC m=+1946.883786814" watchObservedRunningTime="2026-03-18 12:42:41.173156458 +0000 UTC m=+1946.887557037" Mar 18 12:42:44 crc kubenswrapper[4975]: I0318 12:42:44.922348 4975 scope.go:117] "RemoveContainer" containerID="ca516b3155c9da471d057489d115c2e6bef88ebe8795fcf12023d49f4b4ba354" Mar 18 12:42:44 crc kubenswrapper[4975]: I0318 12:42:44.970211 4975 scope.go:117] "RemoveContainer" containerID="e8ba2729244676f1858374ecbed1b560718997ed1c67174df2906da1583c0d87" Mar 
18 12:42:45 crc kubenswrapper[4975]: I0318 12:42:45.004993 4975 scope.go:117] "RemoveContainer" containerID="9d93390abf911311725171f9f519ce9a1a6fbb812a98a7ed5963498942bfa107" Mar 18 12:42:45 crc kubenswrapper[4975]: I0318 12:42:45.045309 4975 scope.go:117] "RemoveContainer" containerID="124d2e2517104363e8f9ab34b0b1d98cd22494286dbe7daa1c67c6b3ec0ba82f" Mar 18 12:42:45 crc kubenswrapper[4975]: I0318 12:42:45.081363 4975 scope.go:117] "RemoveContainer" containerID="fa27b19dfc9be706f4c8776b4350a5923206f6870cdb7ddd91f8af3798c0f0e0" Mar 18 12:42:45 crc kubenswrapper[4975]: I0318 12:42:45.119686 4975 scope.go:117] "RemoveContainer" containerID="a2ec6b75355cff0b12414d356a935a6d88d3be020d844ed1a3a688b4d579039e" Mar 18 12:42:45 crc kubenswrapper[4975]: I0318 12:42:45.145073 4975 scope.go:117] "RemoveContainer" containerID="9f4abdac8a295aa8a7aafa09479fde44299b6b5d719a5e7717f3048eee20c81c" Mar 18 12:42:49 crc kubenswrapper[4975]: I0318 12:42:49.039103 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-s8rhn"] Mar 18 12:42:49 crc kubenswrapper[4975]: I0318 12:42:49.048535 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-s8rhn"] Mar 18 12:42:50 crc kubenswrapper[4975]: I0318 12:42:50.029175 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-259gn"] Mar 18 12:42:50 crc kubenswrapper[4975]: I0318 12:42:50.039203 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-259gn"] Mar 18 12:42:50 crc kubenswrapper[4975]: I0318 12:42:50.050954 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-9e84-account-create-update-sn9bw"] Mar 18 12:42:50 crc kubenswrapper[4975]: I0318 12:42:50.063602 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-s4mlz"] Mar 18 12:42:50 crc kubenswrapper[4975]: I0318 12:42:50.072545 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-3b17-account-create-update-rkbqb"] Mar 18 12:42:50 crc kubenswrapper[4975]: I0318 12:42:50.080578 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-2797-account-create-update-l44sp"] Mar 18 12:42:50 crc kubenswrapper[4975]: I0318 12:42:50.087336 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-3b17-account-create-update-rkbqb"] Mar 18 12:42:50 crc kubenswrapper[4975]: I0318 12:42:50.094126 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-s4mlz"] Mar 18 12:42:50 crc kubenswrapper[4975]: I0318 12:42:50.112039 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-9e84-account-create-update-sn9bw"] Mar 18 12:42:50 crc kubenswrapper[4975]: I0318 12:42:50.127610 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-2797-account-create-update-l44sp"] Mar 18 12:42:51 crc kubenswrapper[4975]: I0318 12:42:51.034508 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="226bdbb1-a2c1-4bdf-a509-2aaed024a33e" path="/var/lib/kubelet/pods/226bdbb1-a2c1-4bdf-a509-2aaed024a33e/volumes" Mar 18 12:42:51 crc kubenswrapper[4975]: I0318 12:42:51.036363 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a114ef7-ae0b-4502-86f6-5cbacd642fff" path="/var/lib/kubelet/pods/3a114ef7-ae0b-4502-86f6-5cbacd642fff/volumes" Mar 18 12:42:51 crc kubenswrapper[4975]: I0318 12:42:51.037452 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5797d8e6-add0-482e-ab94-24df08d4da60" path="/var/lib/kubelet/pods/5797d8e6-add0-482e-ab94-24df08d4da60/volumes" Mar 18 12:42:51 crc kubenswrapper[4975]: I0318 12:42:51.038633 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="973c781e-29cc-4d02-a22e-1b13a1a18a94" path="/var/lib/kubelet/pods/973c781e-29cc-4d02-a22e-1b13a1a18a94/volumes" Mar 18 12:42:51 crc kubenswrapper[4975]: I0318 12:42:51.040856 4975 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3f9f8bf-5111-465b-b099-5ea2b374ddc9" path="/var/lib/kubelet/pods/b3f9f8bf-5111-465b-b099-5ea2b374ddc9/volumes" Mar 18 12:42:51 crc kubenswrapper[4975]: I0318 12:42:51.041968 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fed45251-cc94-42f5-962c-d82dbd50b421" path="/var/lib/kubelet/pods/fed45251-cc94-42f5-962c-d82dbd50b421/volumes" Mar 18 12:43:11 crc kubenswrapper[4975]: I0318 12:43:11.044528 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-sxldc"] Mar 18 12:43:11 crc kubenswrapper[4975]: I0318 12:43:11.056507 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-sxldc"] Mar 18 12:43:13 crc kubenswrapper[4975]: I0318 12:43:13.032070 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cb80719-3311-49dd-9dbf-2d0c40b3d17b" path="/var/lib/kubelet/pods/8cb80719-3311-49dd-9dbf-2d0c40b3d17b/volumes" Mar 18 12:43:18 crc kubenswrapper[4975]: I0318 12:43:18.032438 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-2zdjk"] Mar 18 12:43:18 crc kubenswrapper[4975]: I0318 12:43:18.044834 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-2zdjk"] Mar 18 12:43:19 crc kubenswrapper[4975]: I0318 12:43:19.029281 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e796a56-c2ec-40c2-8604-b14e71255013" path="/var/lib/kubelet/pods/6e796a56-c2ec-40c2-8604-b14e71255013/volumes" Mar 18 12:43:26 crc kubenswrapper[4975]: I0318 12:43:26.045029 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-gkcrs"] Mar 18 12:43:26 crc kubenswrapper[4975]: I0318 12:43:26.057637 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-647b-account-create-update-d6jph"] Mar 18 12:43:26 crc kubenswrapper[4975]: I0318 12:43:26.070037 4975 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/cinder-647b-account-create-update-d6jph"] Mar 18 12:43:26 crc kubenswrapper[4975]: I0318 12:43:26.083709 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-gkcrs"] Mar 18 12:43:27 crc kubenswrapper[4975]: I0318 12:43:27.052834 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c17426b-2e12-4e49-aaee-cb9d0f438552" path="/var/lib/kubelet/pods/0c17426b-2e12-4e49-aaee-cb9d0f438552/volumes" Mar 18 12:43:27 crc kubenswrapper[4975]: I0318 12:43:27.062983 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72df1add-3fec-4e3b-baad-90ba808298f7" path="/var/lib/kubelet/pods/72df1add-3fec-4e3b-baad-90ba808298f7/volumes" Mar 18 12:43:27 crc kubenswrapper[4975]: I0318 12:43:27.106357 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-gjb7f"] Mar 18 12:43:27 crc kubenswrapper[4975]: I0318 12:43:27.114503 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-2d70-account-create-update-mt7tw"] Mar 18 12:43:27 crc kubenswrapper[4975]: I0318 12:43:27.121692 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-gjb7f"] Mar 18 12:43:27 crc kubenswrapper[4975]: I0318 12:43:27.129981 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-2d70-account-create-update-mt7tw"] Mar 18 12:43:27 crc kubenswrapper[4975]: I0318 12:43:27.137751 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-d456-account-create-update-htdrw"] Mar 18 12:43:27 crc kubenswrapper[4975]: I0318 12:43:27.147327 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-d456-account-create-update-htdrw"] Mar 18 12:43:27 crc kubenswrapper[4975]: I0318 12:43:27.152620 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-vp6r7"] Mar 18 12:43:27 crc kubenswrapper[4975]: I0318 12:43:27.159591 4975 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-vp6r7"] Mar 18 12:43:29 crc kubenswrapper[4975]: I0318 12:43:29.031729 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bc880e4-546b-4dce-a6eb-39aef7dd9954" path="/var/lib/kubelet/pods/6bc880e4-546b-4dce-a6eb-39aef7dd9954/volumes" Mar 18 12:43:29 crc kubenswrapper[4975]: I0318 12:43:29.033273 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91116f39-0e09-4921-8ee3-aaff9a89b610" path="/var/lib/kubelet/pods/91116f39-0e09-4921-8ee3-aaff9a89b610/volumes" Mar 18 12:43:29 crc kubenswrapper[4975]: I0318 12:43:29.034568 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a00825d4-1eac-45e1-9eec-bdc12eb450a2" path="/var/lib/kubelet/pods/a00825d4-1eac-45e1-9eec-bdc12eb450a2/volumes" Mar 18 12:43:29 crc kubenswrapper[4975]: I0318 12:43:29.035804 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac2a33aa-f2b3-4da7-9265-cc73a33649d3" path="/var/lib/kubelet/pods/ac2a33aa-f2b3-4da7-9265-cc73a33649d3/volumes" Mar 18 12:43:37 crc kubenswrapper[4975]: I0318 12:43:37.045656 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-8mhpn"] Mar 18 12:43:37 crc kubenswrapper[4975]: I0318 12:43:37.062967 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-8mhpn"] Mar 18 12:43:39 crc kubenswrapper[4975]: I0318 12:43:39.037042 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b50774ab-6273-44c8-ae0c-d6c77c3996fc" path="/var/lib/kubelet/pods/b50774ab-6273-44c8-ae0c-d6c77c3996fc/volumes" Mar 18 12:43:45 crc kubenswrapper[4975]: I0318 12:43:45.327518 4975 scope.go:117] "RemoveContainer" containerID="63d25e271b98d1b2df4b559ffb40d90f136c66fc33c380f3b9ab175917efa285" Mar 18 12:43:45 crc kubenswrapper[4975]: I0318 12:43:45.359842 4975 scope.go:117] "RemoveContainer" 
containerID="eb47ff94ddb709711a5e754dd2b67ab8628a9790f93a6ba0c102c759803cfc5d" Mar 18 12:43:45 crc kubenswrapper[4975]: I0318 12:43:45.420758 4975 scope.go:117] "RemoveContainer" containerID="72d88262648a062e2023179524c3ef465e7bbbc64312d1ad67b60987e18b1e52" Mar 18 12:43:45 crc kubenswrapper[4975]: I0318 12:43:45.473461 4975 scope.go:117] "RemoveContainer" containerID="92b732168dd7402750871af12b4e19521bfdb46733c09d8f9d4d650652d6cf20" Mar 18 12:43:45 crc kubenswrapper[4975]: I0318 12:43:45.520032 4975 scope.go:117] "RemoveContainer" containerID="134404831f5c1ede754e310da20064f28e8964e6c417b020d3e2cdd14a668ceb" Mar 18 12:43:45 crc kubenswrapper[4975]: I0318 12:43:45.576753 4975 scope.go:117] "RemoveContainer" containerID="2a3c5218797b4769d79c2900fd0fd46380c935d0edaaa8f891be1220154b41d7" Mar 18 12:43:45 crc kubenswrapper[4975]: I0318 12:43:45.615736 4975 scope.go:117] "RemoveContainer" containerID="fc0f34fee1f1e0cfe92586a01c2acab55226856fa34a477aff135d906daabf98" Mar 18 12:43:45 crc kubenswrapper[4975]: I0318 12:43:45.649974 4975 scope.go:117] "RemoveContainer" containerID="2b1e77e0028326720b11a64c53e4be5cf5ffa3a3c2f77b0e75c3f2d06507cfd7" Mar 18 12:43:45 crc kubenswrapper[4975]: I0318 12:43:45.679538 4975 scope.go:117] "RemoveContainer" containerID="26b815d260ed6844b03fe269f50a147f3255454615c68a7e86e847c3f89df0e1" Mar 18 12:43:45 crc kubenswrapper[4975]: I0318 12:43:45.699038 4975 scope.go:117] "RemoveContainer" containerID="a1db9db8ee85e571845d4f26ebe905bbf3c10d184fb10a02baee0eec0365c69a" Mar 18 12:43:45 crc kubenswrapper[4975]: I0318 12:43:45.720627 4975 scope.go:117] "RemoveContainer" containerID="afaf64061f40eb306f05a4ce6035eb6909d7d8ac9907525794e29641facdff88" Mar 18 12:43:45 crc kubenswrapper[4975]: I0318 12:43:45.749536 4975 scope.go:117] "RemoveContainer" containerID="b04f1fba8268cb39373ff8520437c05fbbb63aedf49c5e72dd48f279c95e99a3" Mar 18 12:43:45 crc kubenswrapper[4975]: I0318 12:43:45.788505 4975 scope.go:117] "RemoveContainer" 
containerID="1b6219446d9d27ddd47965a350cbca26227136a905d64632202c3bd6d80d3a88" Mar 18 12:43:45 crc kubenswrapper[4975]: I0318 12:43:45.807993 4975 scope.go:117] "RemoveContainer" containerID="559409cdd578ac7ebca41857ea3d8fe3faadf749b271bcaffe3bcc2c0c935d2d" Mar 18 12:43:45 crc kubenswrapper[4975]: I0318 12:43:45.829106 4975 scope.go:117] "RemoveContainer" containerID="826b860e46717b42db2c81c919f5720f2d1e3c5faa1dc267d11352c74a9fb51c" Mar 18 12:43:55 crc kubenswrapper[4975]: I0318 12:43:55.539306 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:43:55 crc kubenswrapper[4975]: I0318 12:43:55.540029 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:44:00 crc kubenswrapper[4975]: I0318 12:44:00.177980 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563964-sp8fb"] Mar 18 12:44:00 crc kubenswrapper[4975]: I0318 12:44:00.180643 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563964-sp8fb" Mar 18 12:44:00 crc kubenswrapper[4975]: I0318 12:44:00.184792 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:44:00 crc kubenswrapper[4975]: I0318 12:44:00.185200 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:44:00 crc kubenswrapper[4975]: I0318 12:44:00.185285 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 12:44:00 crc kubenswrapper[4975]: I0318 12:44:00.187929 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563964-sp8fb"] Mar 18 12:44:00 crc kubenswrapper[4975]: I0318 12:44:00.337175 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlcx7\" (UniqueName: \"kubernetes.io/projected/84d3a57d-c7cc-4690-8e5a-f68bc8eb4d1e-kube-api-access-mlcx7\") pod \"auto-csr-approver-29563964-sp8fb\" (UID: \"84d3a57d-c7cc-4690-8e5a-f68bc8eb4d1e\") " pod="openshift-infra/auto-csr-approver-29563964-sp8fb" Mar 18 12:44:00 crc kubenswrapper[4975]: I0318 12:44:00.440918 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlcx7\" (UniqueName: \"kubernetes.io/projected/84d3a57d-c7cc-4690-8e5a-f68bc8eb4d1e-kube-api-access-mlcx7\") pod \"auto-csr-approver-29563964-sp8fb\" (UID: \"84d3a57d-c7cc-4690-8e5a-f68bc8eb4d1e\") " pod="openshift-infra/auto-csr-approver-29563964-sp8fb" Mar 18 12:44:00 crc kubenswrapper[4975]: I0318 12:44:00.470224 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlcx7\" (UniqueName: \"kubernetes.io/projected/84d3a57d-c7cc-4690-8e5a-f68bc8eb4d1e-kube-api-access-mlcx7\") pod \"auto-csr-approver-29563964-sp8fb\" (UID: \"84d3a57d-c7cc-4690-8e5a-f68bc8eb4d1e\") " 
pod="openshift-infra/auto-csr-approver-29563964-sp8fb" Mar 18 12:44:00 crc kubenswrapper[4975]: I0318 12:44:00.505733 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563964-sp8fb" Mar 18 12:44:00 crc kubenswrapper[4975]: W0318 12:44:00.815128 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84d3a57d_c7cc_4690_8e5a_f68bc8eb4d1e.slice/crio-09243f44db31bd9061d2d9e503158cb3d9be67ace8caa1af7f0ed75a47df0071 WatchSource:0}: Error finding container 09243f44db31bd9061d2d9e503158cb3d9be67ace8caa1af7f0ed75a47df0071: Status 404 returned error can't find the container with id 09243f44db31bd9061d2d9e503158cb3d9be67ace8caa1af7f0ed75a47df0071 Mar 18 12:44:00 crc kubenswrapper[4975]: I0318 12:44:00.816119 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563964-sp8fb"] Mar 18 12:44:01 crc kubenswrapper[4975]: I0318 12:44:01.073308 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563964-sp8fb" event={"ID":"84d3a57d-c7cc-4690-8e5a-f68bc8eb4d1e","Type":"ContainerStarted","Data":"09243f44db31bd9061d2d9e503158cb3d9be67ace8caa1af7f0ed75a47df0071"} Mar 18 12:44:03 crc kubenswrapper[4975]: I0318 12:44:03.096931 4975 generic.go:334] "Generic (PLEG): container finished" podID="84d3a57d-c7cc-4690-8e5a-f68bc8eb4d1e" containerID="15d8a047563c19f878e2071f8ba3ec0280a5f58e1dc9b03018dc7f7de18d246d" exitCode=0 Mar 18 12:44:03 crc kubenswrapper[4975]: I0318 12:44:03.097002 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563964-sp8fb" event={"ID":"84d3a57d-c7cc-4690-8e5a-f68bc8eb4d1e","Type":"ContainerDied","Data":"15d8a047563c19f878e2071f8ba3ec0280a5f58e1dc9b03018dc7f7de18d246d"} Mar 18 12:44:04 crc kubenswrapper[4975]: I0318 12:44:04.465707 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563964-sp8fb" Mar 18 12:44:04 crc kubenswrapper[4975]: I0318 12:44:04.632838 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlcx7\" (UniqueName: \"kubernetes.io/projected/84d3a57d-c7cc-4690-8e5a-f68bc8eb4d1e-kube-api-access-mlcx7\") pod \"84d3a57d-c7cc-4690-8e5a-f68bc8eb4d1e\" (UID: \"84d3a57d-c7cc-4690-8e5a-f68bc8eb4d1e\") " Mar 18 12:44:04 crc kubenswrapper[4975]: I0318 12:44:04.637768 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84d3a57d-c7cc-4690-8e5a-f68bc8eb4d1e-kube-api-access-mlcx7" (OuterVolumeSpecName: "kube-api-access-mlcx7") pod "84d3a57d-c7cc-4690-8e5a-f68bc8eb4d1e" (UID: "84d3a57d-c7cc-4690-8e5a-f68bc8eb4d1e"). InnerVolumeSpecName "kube-api-access-mlcx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:44:04 crc kubenswrapper[4975]: I0318 12:44:04.735501 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlcx7\" (UniqueName: \"kubernetes.io/projected/84d3a57d-c7cc-4690-8e5a-f68bc8eb4d1e-kube-api-access-mlcx7\") on node \"crc\" DevicePath \"\"" Mar 18 12:44:05 crc kubenswrapper[4975]: I0318 12:44:05.120705 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563964-sp8fb" event={"ID":"84d3a57d-c7cc-4690-8e5a-f68bc8eb4d1e","Type":"ContainerDied","Data":"09243f44db31bd9061d2d9e503158cb3d9be67ace8caa1af7f0ed75a47df0071"} Mar 18 12:44:05 crc kubenswrapper[4975]: I0318 12:44:05.120762 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09243f44db31bd9061d2d9e503158cb3d9be67ace8caa1af7f0ed75a47df0071" Mar 18 12:44:05 crc kubenswrapper[4975]: I0318 12:44:05.120783 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563964-sp8fb" Mar 18 12:44:05 crc kubenswrapper[4975]: I0318 12:44:05.564733 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563958-jnnzc"] Mar 18 12:44:05 crc kubenswrapper[4975]: I0318 12:44:05.574683 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563958-jnnzc"] Mar 18 12:44:07 crc kubenswrapper[4975]: I0318 12:44:07.031522 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3711f736-ca1c-46c8-950a-4472f3dbc6b9" path="/var/lib/kubelet/pods/3711f736-ca1c-46c8-950a-4472f3dbc6b9/volumes" Mar 18 12:44:08 crc kubenswrapper[4975]: I0318 12:44:08.053813 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-9bbxb"] Mar 18 12:44:08 crc kubenswrapper[4975]: I0318 12:44:08.073762 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-9bbxb"] Mar 18 12:44:09 crc kubenswrapper[4975]: I0318 12:44:09.028681 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85a1328-170d-4be7-8115-36d400fd7645" path="/var/lib/kubelet/pods/f85a1328-170d-4be7-8115-36d400fd7645/volumes" Mar 18 12:44:17 crc kubenswrapper[4975]: I0318 12:44:17.032436 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-hwtrk"] Mar 18 12:44:17 crc kubenswrapper[4975]: I0318 12:44:17.037810 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-hwtrk"] Mar 18 12:44:19 crc kubenswrapper[4975]: I0318 12:44:19.038040 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2614112-7379-4588-a6dd-1cec6e3d96b4" path="/var/lib/kubelet/pods/c2614112-7379-4588-a6dd-1cec6e3d96b4/volumes" Mar 18 12:44:20 crc kubenswrapper[4975]: I0318 12:44:20.030768 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-wfp2q"] Mar 18 12:44:20 crc 
kubenswrapper[4975]: I0318 12:44:20.038209 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-wfp2q"] Mar 18 12:44:21 crc kubenswrapper[4975]: I0318 12:44:21.041741 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3247b82c-5928-4a38-85aa-d37c7d8f6c21" path="/var/lib/kubelet/pods/3247b82c-5928-4a38-85aa-d37c7d8f6c21/volumes" Mar 18 12:44:25 crc kubenswrapper[4975]: I0318 12:44:25.538477 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:44:25 crc kubenswrapper[4975]: I0318 12:44:25.539202 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:44:30 crc kubenswrapper[4975]: I0318 12:44:30.060068 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-psw5k"] Mar 18 12:44:30 crc kubenswrapper[4975]: I0318 12:44:30.071728 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-psw5k"] Mar 18 12:44:31 crc kubenswrapper[4975]: I0318 12:44:31.041172 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f054ebc-151c-4e89-8242-3837c9bee6b2" path="/var/lib/kubelet/pods/4f054ebc-151c-4e89-8242-3837c9bee6b2/volumes" Mar 18 12:44:31 crc kubenswrapper[4975]: I0318 12:44:31.042455 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-m6lmd"] Mar 18 12:44:31 crc kubenswrapper[4975]: I0318 12:44:31.054431 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-db-sync-m6lmd"] Mar 18 12:44:33 crc kubenswrapper[4975]: I0318 12:44:33.031601 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94f94b61-6738-4ab8-a65f-0d6cf4d86be1" path="/var/lib/kubelet/pods/94f94b61-6738-4ab8-a65f-0d6cf4d86be1/volumes" Mar 18 12:44:33 crc kubenswrapper[4975]: I0318 12:44:33.428670 4975 generic.go:334] "Generic (PLEG): container finished" podID="a56e99ba-eb18-4a6a-8347-564e7af719f7" containerID="aaf423f18ae17025d9cac4bcdd57dbd257b4848c92533e911f994303a4ce1ff8" exitCode=0 Mar 18 12:44:33 crc kubenswrapper[4975]: I0318 12:44:33.428733 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kckms" event={"ID":"a56e99ba-eb18-4a6a-8347-564e7af719f7","Type":"ContainerDied","Data":"aaf423f18ae17025d9cac4bcdd57dbd257b4848c92533e911f994303a4ce1ff8"} Mar 18 12:44:34 crc kubenswrapper[4975]: I0318 12:44:34.960444 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kckms" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.047761 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87jbc\" (UniqueName: \"kubernetes.io/projected/a56e99ba-eb18-4a6a-8347-564e7af719f7-kube-api-access-87jbc\") pod \"a56e99ba-eb18-4a6a-8347-564e7af719f7\" (UID: \"a56e99ba-eb18-4a6a-8347-564e7af719f7\") " Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.048135 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a56e99ba-eb18-4a6a-8347-564e7af719f7-inventory\") pod \"a56e99ba-eb18-4a6a-8347-564e7af719f7\" (UID: \"a56e99ba-eb18-4a6a-8347-564e7af719f7\") " Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.048215 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a56e99ba-eb18-4a6a-8347-564e7af719f7-ssh-key-openstack-edpm-ipam\") pod \"a56e99ba-eb18-4a6a-8347-564e7af719f7\" (UID: \"a56e99ba-eb18-4a6a-8347-564e7af719f7\") " Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.054223 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a56e99ba-eb18-4a6a-8347-564e7af719f7-kube-api-access-87jbc" (OuterVolumeSpecName: "kube-api-access-87jbc") pod "a56e99ba-eb18-4a6a-8347-564e7af719f7" (UID: "a56e99ba-eb18-4a6a-8347-564e7af719f7"). InnerVolumeSpecName "kube-api-access-87jbc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.080683 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a56e99ba-eb18-4a6a-8347-564e7af719f7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a56e99ba-eb18-4a6a-8347-564e7af719f7" (UID: "a56e99ba-eb18-4a6a-8347-564e7af719f7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.086214 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a56e99ba-eb18-4a6a-8347-564e7af719f7-inventory" (OuterVolumeSpecName: "inventory") pod "a56e99ba-eb18-4a6a-8347-564e7af719f7" (UID: "a56e99ba-eb18-4a6a-8347-564e7af719f7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.151238 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87jbc\" (UniqueName: \"kubernetes.io/projected/a56e99ba-eb18-4a6a-8347-564e7af719f7-kube-api-access-87jbc\") on node \"crc\" DevicePath \"\"" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.151285 4975 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a56e99ba-eb18-4a6a-8347-564e7af719f7-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.151300 4975 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a56e99ba-eb18-4a6a-8347-564e7af719f7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.458895 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kckms" 
event={"ID":"a56e99ba-eb18-4a6a-8347-564e7af719f7","Type":"ContainerDied","Data":"99c55378332d5150824d82d124c7b8f802e397b41f1f7f64542e211c031a6417"} Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.458967 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99c55378332d5150824d82d124c7b8f802e397b41f1f7f64542e211c031a6417" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.459042 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kckms" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.611182 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49rnp"] Mar 18 12:44:35 crc kubenswrapper[4975]: E0318 12:44:35.611546 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d3a57d-c7cc-4690-8e5a-f68bc8eb4d1e" containerName="oc" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.611563 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d3a57d-c7cc-4690-8e5a-f68bc8eb4d1e" containerName="oc" Mar 18 12:44:35 crc kubenswrapper[4975]: E0318 12:44:35.611599 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a56e99ba-eb18-4a6a-8347-564e7af719f7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.611607 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="a56e99ba-eb18-4a6a-8347-564e7af719f7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.611762 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="a56e99ba-eb18-4a6a-8347-564e7af719f7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.611788 4975 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="84d3a57d-c7cc-4690-8e5a-f68bc8eb4d1e" containerName="oc" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.612335 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49rnp" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.615174 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-77rz6" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.616775 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.618935 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.621267 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.638993 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49rnp"] Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.658737 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee1b246b-e80c-4d00-ab5e-013460f7e886-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-49rnp\" (UID: \"ee1b246b-e80c-4d00-ab5e-013460f7e886\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49rnp" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.658947 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee1b246b-e80c-4d00-ab5e-013460f7e886-ssh-key-openstack-edpm-ipam\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-49rnp\" (UID: \"ee1b246b-e80c-4d00-ab5e-013460f7e886\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49rnp" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.658985 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skj6z\" (UniqueName: \"kubernetes.io/projected/ee1b246b-e80c-4d00-ab5e-013460f7e886-kube-api-access-skj6z\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-49rnp\" (UID: \"ee1b246b-e80c-4d00-ab5e-013460f7e886\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49rnp" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.760126 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee1b246b-e80c-4d00-ab5e-013460f7e886-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-49rnp\" (UID: \"ee1b246b-e80c-4d00-ab5e-013460f7e886\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49rnp" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.760242 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skj6z\" (UniqueName: \"kubernetes.io/projected/ee1b246b-e80c-4d00-ab5e-013460f7e886-kube-api-access-skj6z\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-49rnp\" (UID: \"ee1b246b-e80c-4d00-ab5e-013460f7e886\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49rnp" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.760307 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee1b246b-e80c-4d00-ab5e-013460f7e886-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-49rnp\" (UID: \"ee1b246b-e80c-4d00-ab5e-013460f7e886\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49rnp" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.766391 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee1b246b-e80c-4d00-ab5e-013460f7e886-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-49rnp\" (UID: \"ee1b246b-e80c-4d00-ab5e-013460f7e886\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49rnp" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.768196 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee1b246b-e80c-4d00-ab5e-013460f7e886-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-49rnp\" (UID: \"ee1b246b-e80c-4d00-ab5e-013460f7e886\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49rnp" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.790892 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skj6z\" (UniqueName: \"kubernetes.io/projected/ee1b246b-e80c-4d00-ab5e-013460f7e886-kube-api-access-skj6z\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-49rnp\" (UID: \"ee1b246b-e80c-4d00-ab5e-013460f7e886\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49rnp" Mar 18 12:44:35 crc kubenswrapper[4975]: I0318 12:44:35.941281 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49rnp" Mar 18 12:44:36 crc kubenswrapper[4975]: I0318 12:44:36.556580 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49rnp"] Mar 18 12:44:37 crc kubenswrapper[4975]: I0318 12:44:37.479149 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49rnp" event={"ID":"ee1b246b-e80c-4d00-ab5e-013460f7e886","Type":"ContainerStarted","Data":"ffb3ce43ad99f6225a3c1d5526d6b05f8fc71b4c512952336011023b3cab50c8"} Mar 18 12:44:38 crc kubenswrapper[4975]: I0318 12:44:38.510220 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49rnp" event={"ID":"ee1b246b-e80c-4d00-ab5e-013460f7e886","Type":"ContainerStarted","Data":"dfe883cdcfe6015b922ab7f8abff55f1852894d7206843e35176759d1e814f8d"} Mar 18 12:44:38 crc kubenswrapper[4975]: I0318 12:44:38.542731 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49rnp" podStartSLOduration=2.315259841 podStartE2EDuration="3.542701167s" podCreationTimestamp="2026-03-18 12:44:35 +0000 UTC" firstStartedPulling="2026-03-18 12:44:36.556404273 +0000 UTC m=+2062.270804852" lastFinishedPulling="2026-03-18 12:44:37.783845579 +0000 UTC m=+2063.498246178" observedRunningTime="2026-03-18 12:44:38.533229047 +0000 UTC m=+2064.247629646" watchObservedRunningTime="2026-03-18 12:44:38.542701167 +0000 UTC m=+2064.257101746" Mar 18 12:44:46 crc kubenswrapper[4975]: I0318 12:44:46.184921 4975 scope.go:117] "RemoveContainer" containerID="ad817110e62a67f7d915322c53393627cec21e6daa015f6acf407587979c543e" Mar 18 12:44:46 crc kubenswrapper[4975]: I0318 12:44:46.223340 4975 scope.go:117] "RemoveContainer" 
containerID="be07b1fb2431f75c390014aa13497026e975f154b46b324a8a5e1a634690363f" Mar 18 12:44:46 crc kubenswrapper[4975]: I0318 12:44:46.273822 4975 scope.go:117] "RemoveContainer" containerID="d7fd08ea9e8af2fce4a2d69ad6eefacff16e67cd19e08cf851e60b6cc4dbdb8f" Mar 18 12:44:46 crc kubenswrapper[4975]: I0318 12:44:46.315593 4975 scope.go:117] "RemoveContainer" containerID="fcf9119d0dd51cd281c1f3e4788c2924109567cae2489841d9b989aeb57f0389" Mar 18 12:44:46 crc kubenswrapper[4975]: I0318 12:44:46.376598 4975 scope.go:117] "RemoveContainer" containerID="3188344a6de3445fb78743c98a82e24f09976897da649a4069a44cbccedaec81" Mar 18 12:44:46 crc kubenswrapper[4975]: I0318 12:44:46.419170 4975 scope.go:117] "RemoveContainer" containerID="77eec9a0c105490c29fb8bed80b81fecf8fa74501cf1bb4d42fcb5f966879a84" Mar 18 12:44:55 crc kubenswrapper[4975]: I0318 12:44:55.539180 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:44:55 crc kubenswrapper[4975]: I0318 12:44:55.540162 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:44:55 crc kubenswrapper[4975]: I0318 12:44:55.540362 4975 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 12:44:55 crc kubenswrapper[4975]: I0318 12:44:55.542428 4975 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"e1445fab85886838c0ceb2338548e8dbe5d869da1c9483fa6052e6d794e0e6d2"} pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:44:55 crc kubenswrapper[4975]: I0318 12:44:55.542610 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" containerID="cri-o://e1445fab85886838c0ceb2338548e8dbe5d869da1c9483fa6052e6d794e0e6d2" gracePeriod=600 Mar 18 12:44:55 crc kubenswrapper[4975]: I0318 12:44:55.739110 4975 generic.go:334] "Generic (PLEG): container finished" podID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerID="e1445fab85886838c0ceb2338548e8dbe5d869da1c9483fa6052e6d794e0e6d2" exitCode=0 Mar 18 12:44:55 crc kubenswrapper[4975]: I0318 12:44:55.739207 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerDied","Data":"e1445fab85886838c0ceb2338548e8dbe5d869da1c9483fa6052e6d794e0e6d2"} Mar 18 12:44:55 crc kubenswrapper[4975]: I0318 12:44:55.739288 4975 scope.go:117] "RemoveContainer" containerID="27d16d4ae7ec17a264fc611c848e5f27ed95812ab30c204dd0e7ea4e1c183aff" Mar 18 12:44:56 crc kubenswrapper[4975]: I0318 12:44:56.755112 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerStarted","Data":"282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3"} Mar 18 12:45:00 crc kubenswrapper[4975]: I0318 12:45:00.187406 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563965-qlfwz"] Mar 18 12:45:00 crc kubenswrapper[4975]: 
I0318 12:45:00.191086 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-qlfwz" Mar 18 12:45:00 crc kubenswrapper[4975]: I0318 12:45:00.194827 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 12:45:00 crc kubenswrapper[4975]: I0318 12:45:00.195251 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 12:45:00 crc kubenswrapper[4975]: I0318 12:45:00.215750 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563965-qlfwz"] Mar 18 12:45:00 crc kubenswrapper[4975]: I0318 12:45:00.385531 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3c91b58-13e4-4d0a-93e8-0ec185187b3a-config-volume\") pod \"collect-profiles-29563965-qlfwz\" (UID: \"f3c91b58-13e4-4d0a-93e8-0ec185187b3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-qlfwz" Mar 18 12:45:00 crc kubenswrapper[4975]: I0318 12:45:00.385812 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxwzr\" (UniqueName: \"kubernetes.io/projected/f3c91b58-13e4-4d0a-93e8-0ec185187b3a-kube-api-access-zxwzr\") pod \"collect-profiles-29563965-qlfwz\" (UID: \"f3c91b58-13e4-4d0a-93e8-0ec185187b3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-qlfwz" Mar 18 12:45:00 crc kubenswrapper[4975]: I0318 12:45:00.385916 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f3c91b58-13e4-4d0a-93e8-0ec185187b3a-secret-volume\") pod \"collect-profiles-29563965-qlfwz\" (UID: 
\"f3c91b58-13e4-4d0a-93e8-0ec185187b3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-qlfwz" Mar 18 12:45:00 crc kubenswrapper[4975]: I0318 12:45:00.488513 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxwzr\" (UniqueName: \"kubernetes.io/projected/f3c91b58-13e4-4d0a-93e8-0ec185187b3a-kube-api-access-zxwzr\") pod \"collect-profiles-29563965-qlfwz\" (UID: \"f3c91b58-13e4-4d0a-93e8-0ec185187b3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-qlfwz" Mar 18 12:45:00 crc kubenswrapper[4975]: I0318 12:45:00.488711 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f3c91b58-13e4-4d0a-93e8-0ec185187b3a-secret-volume\") pod \"collect-profiles-29563965-qlfwz\" (UID: \"f3c91b58-13e4-4d0a-93e8-0ec185187b3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-qlfwz" Mar 18 12:45:00 crc kubenswrapper[4975]: I0318 12:45:00.488946 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3c91b58-13e4-4d0a-93e8-0ec185187b3a-config-volume\") pod \"collect-profiles-29563965-qlfwz\" (UID: \"f3c91b58-13e4-4d0a-93e8-0ec185187b3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-qlfwz" Mar 18 12:45:00 crc kubenswrapper[4975]: I0318 12:45:00.490492 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3c91b58-13e4-4d0a-93e8-0ec185187b3a-config-volume\") pod \"collect-profiles-29563965-qlfwz\" (UID: \"f3c91b58-13e4-4d0a-93e8-0ec185187b3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-qlfwz" Mar 18 12:45:00 crc kubenswrapper[4975]: I0318 12:45:00.496590 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/f3c91b58-13e4-4d0a-93e8-0ec185187b3a-secret-volume\") pod \"collect-profiles-29563965-qlfwz\" (UID: \"f3c91b58-13e4-4d0a-93e8-0ec185187b3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-qlfwz" Mar 18 12:45:00 crc kubenswrapper[4975]: I0318 12:45:00.525428 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxwzr\" (UniqueName: \"kubernetes.io/projected/f3c91b58-13e4-4d0a-93e8-0ec185187b3a-kube-api-access-zxwzr\") pod \"collect-profiles-29563965-qlfwz\" (UID: \"f3c91b58-13e4-4d0a-93e8-0ec185187b3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-qlfwz" Mar 18 12:45:00 crc kubenswrapper[4975]: I0318 12:45:00.816427 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-qlfwz" Mar 18 12:45:01 crc kubenswrapper[4975]: I0318 12:45:01.348170 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563965-qlfwz"] Mar 18 12:45:01 crc kubenswrapper[4975]: I0318 12:45:01.817393 4975 generic.go:334] "Generic (PLEG): container finished" podID="f3c91b58-13e4-4d0a-93e8-0ec185187b3a" containerID="39714d1784f3a47619d89f33efc5052d817d40dc0e83dfc0694859180108accd" exitCode=0 Mar 18 12:45:01 crc kubenswrapper[4975]: I0318 12:45:01.817450 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-qlfwz" event={"ID":"f3c91b58-13e4-4d0a-93e8-0ec185187b3a","Type":"ContainerDied","Data":"39714d1784f3a47619d89f33efc5052d817d40dc0e83dfc0694859180108accd"} Mar 18 12:45:01 crc kubenswrapper[4975]: I0318 12:45:01.817477 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-qlfwz" 
event={"ID":"f3c91b58-13e4-4d0a-93e8-0ec185187b3a","Type":"ContainerStarted","Data":"88145f4543fd2a9dea94849a56f7245677157ad0a6276b293fd23c2d190686bf"} Mar 18 12:45:03 crc kubenswrapper[4975]: I0318 12:45:03.209277 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-qlfwz" Mar 18 12:45:03 crc kubenswrapper[4975]: I0318 12:45:03.355981 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxwzr\" (UniqueName: \"kubernetes.io/projected/f3c91b58-13e4-4d0a-93e8-0ec185187b3a-kube-api-access-zxwzr\") pod \"f3c91b58-13e4-4d0a-93e8-0ec185187b3a\" (UID: \"f3c91b58-13e4-4d0a-93e8-0ec185187b3a\") " Mar 18 12:45:03 crc kubenswrapper[4975]: I0318 12:45:03.356063 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3c91b58-13e4-4d0a-93e8-0ec185187b3a-config-volume\") pod \"f3c91b58-13e4-4d0a-93e8-0ec185187b3a\" (UID: \"f3c91b58-13e4-4d0a-93e8-0ec185187b3a\") " Mar 18 12:45:03 crc kubenswrapper[4975]: I0318 12:45:03.356125 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f3c91b58-13e4-4d0a-93e8-0ec185187b3a-secret-volume\") pod \"f3c91b58-13e4-4d0a-93e8-0ec185187b3a\" (UID: \"f3c91b58-13e4-4d0a-93e8-0ec185187b3a\") " Mar 18 12:45:03 crc kubenswrapper[4975]: I0318 12:45:03.357026 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3c91b58-13e4-4d0a-93e8-0ec185187b3a-config-volume" (OuterVolumeSpecName: "config-volume") pod "f3c91b58-13e4-4d0a-93e8-0ec185187b3a" (UID: "f3c91b58-13e4-4d0a-93e8-0ec185187b3a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:45:03 crc kubenswrapper[4975]: I0318 12:45:03.364349 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3c91b58-13e4-4d0a-93e8-0ec185187b3a-kube-api-access-zxwzr" (OuterVolumeSpecName: "kube-api-access-zxwzr") pod "f3c91b58-13e4-4d0a-93e8-0ec185187b3a" (UID: "f3c91b58-13e4-4d0a-93e8-0ec185187b3a"). InnerVolumeSpecName "kube-api-access-zxwzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:45:03 crc kubenswrapper[4975]: I0318 12:45:03.365910 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3c91b58-13e4-4d0a-93e8-0ec185187b3a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f3c91b58-13e4-4d0a-93e8-0ec185187b3a" (UID: "f3c91b58-13e4-4d0a-93e8-0ec185187b3a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:45:03 crc kubenswrapper[4975]: I0318 12:45:03.459231 4975 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3c91b58-13e4-4d0a-93e8-0ec185187b3a-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 12:45:03 crc kubenswrapper[4975]: I0318 12:45:03.459273 4975 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f3c91b58-13e4-4d0a-93e8-0ec185187b3a-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 12:45:03 crc kubenswrapper[4975]: I0318 12:45:03.459295 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxwzr\" (UniqueName: \"kubernetes.io/projected/f3c91b58-13e4-4d0a-93e8-0ec185187b3a-kube-api-access-zxwzr\") on node \"crc\" DevicePath \"\"" Mar 18 12:45:03 crc kubenswrapper[4975]: I0318 12:45:03.843158 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-qlfwz" 
event={"ID":"f3c91b58-13e4-4d0a-93e8-0ec185187b3a","Type":"ContainerDied","Data":"88145f4543fd2a9dea94849a56f7245677157ad0a6276b293fd23c2d190686bf"} Mar 18 12:45:03 crc kubenswrapper[4975]: I0318 12:45:03.843516 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88145f4543fd2a9dea94849a56f7245677157ad0a6276b293fd23c2d190686bf" Mar 18 12:45:03 crc kubenswrapper[4975]: I0318 12:45:03.843257 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563965-qlfwz" Mar 18 12:45:04 crc kubenswrapper[4975]: I0318 12:45:04.302381 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563920-p22bn"] Mar 18 12:45:04 crc kubenswrapper[4975]: I0318 12:45:04.311583 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563920-p22bn"] Mar 18 12:45:05 crc kubenswrapper[4975]: I0318 12:45:05.039366 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79bb26ca-fc97-4b59-ab56-12637c684208" path="/var/lib/kubelet/pods/79bb26ca-fc97-4b59-ab56-12637c684208/volumes" Mar 18 12:45:13 crc kubenswrapper[4975]: I0318 12:45:13.042056 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-76hpr"] Mar 18 12:45:13 crc kubenswrapper[4975]: I0318 12:45:13.049833 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-hs8wx"] Mar 18 12:45:13 crc kubenswrapper[4975]: I0318 12:45:13.057224 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-w9tlx"] Mar 18 12:45:13 crc kubenswrapper[4975]: I0318 12:45:13.066812 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-hs8wx"] Mar 18 12:45:13 crc kubenswrapper[4975]: I0318 12:45:13.074477 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-db-create-w9tlx"] Mar 18 12:45:13 crc kubenswrapper[4975]: I0318 12:45:13.081440 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-76hpr"] Mar 18 12:45:14 crc kubenswrapper[4975]: I0318 12:45:14.060207 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-d4d4-account-create-update-bzbn7"] Mar 18 12:45:14 crc kubenswrapper[4975]: I0318 12:45:14.075133 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db13-account-create-update-cmgxb"] Mar 18 12:45:14 crc kubenswrapper[4975]: I0318 12:45:14.082637 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-c3a7-account-create-update-djb44"] Mar 18 12:45:14 crc kubenswrapper[4975]: I0318 12:45:14.089336 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db13-account-create-update-cmgxb"] Mar 18 12:45:14 crc kubenswrapper[4975]: I0318 12:45:14.095585 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-d4d4-account-create-update-bzbn7"] Mar 18 12:45:14 crc kubenswrapper[4975]: I0318 12:45:14.101652 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-c3a7-account-create-update-djb44"] Mar 18 12:45:15 crc kubenswrapper[4975]: I0318 12:45:15.041757 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7" path="/var/lib/kubelet/pods/110c1cfd-1a91-40bf-9fa3-d8cc5bb79cb7/volumes" Mar 18 12:45:15 crc kubenswrapper[4975]: I0318 12:45:15.043966 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22cd6088-9800-4d88-b130-fdc6a3dd4e90" path="/var/lib/kubelet/pods/22cd6088-9800-4d88-b130-fdc6a3dd4e90/volumes" Mar 18 12:45:15 crc kubenswrapper[4975]: I0318 12:45:15.046448 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f7b0dc3-44d1-4932-bca8-f4ade944ecbe" 
path="/var/lib/kubelet/pods/3f7b0dc3-44d1-4932-bca8-f4ade944ecbe/volumes" Mar 18 12:45:15 crc kubenswrapper[4975]: I0318 12:45:15.047977 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="481acd0e-80e8-416a-aec5-361567fd5bc6" path="/var/lib/kubelet/pods/481acd0e-80e8-416a-aec5-361567fd5bc6/volumes" Mar 18 12:45:15 crc kubenswrapper[4975]: I0318 12:45:15.050096 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb70bc5f-531b-4c58-b84b-e8eb61d81340" path="/var/lib/kubelet/pods/eb70bc5f-531b-4c58-b84b-e8eb61d81340/volumes" Mar 18 12:45:15 crc kubenswrapper[4975]: I0318 12:45:15.050844 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3524634-0ee4-4add-a35b-a6ffdee6f1f6" path="/var/lib/kubelet/pods/f3524634-0ee4-4add-a35b-a6ffdee6f1f6/volumes" Mar 18 12:45:43 crc kubenswrapper[4975]: I0318 12:45:43.067385 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x74s8"] Mar 18 12:45:43 crc kubenswrapper[4975]: I0318 12:45:43.074483 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x74s8"] Mar 18 12:45:45 crc kubenswrapper[4975]: I0318 12:45:45.029169 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67cbe265-b297-4ad8-af53-703b8549d8e6" path="/var/lib/kubelet/pods/67cbe265-b297-4ad8-af53-703b8549d8e6/volumes" Mar 18 12:45:46 crc kubenswrapper[4975]: I0318 12:45:46.548675 4975 scope.go:117] "RemoveContainer" containerID="2027ac0f879d6bc658160a364d83c9822493eb86828b9ea3300924d8c2837379" Mar 18 12:45:46 crc kubenswrapper[4975]: I0318 12:45:46.616839 4975 scope.go:117] "RemoveContainer" containerID="d862336df0d2add7d6b7168aa9378d05c115f93f4389fe2f6e696f0b7b82996c" Mar 18 12:45:46 crc kubenswrapper[4975]: I0318 12:45:46.663395 4975 scope.go:117] "RemoveContainer" containerID="359ca5ac16a74f6c2cba2b24070a22d712744b29ede7976f9ebfac0912a6a95b" Mar 18 12:45:46 crc kubenswrapper[4975]: I0318 
12:45:46.719961 4975 scope.go:117] "RemoveContainer" containerID="09198912bf9136533c9e0e45a2bf948ca2e18c60bff7f983503edb4275f2f94e" Mar 18 12:45:46 crc kubenswrapper[4975]: I0318 12:45:46.752156 4975 scope.go:117] "RemoveContainer" containerID="c71aa7bef889bc62ef542514ceaf515d7181e991e89e92e6d7a87de6dde539d6" Mar 18 12:45:46 crc kubenswrapper[4975]: I0318 12:45:46.804024 4975 scope.go:117] "RemoveContainer" containerID="abcded235c136b47c567608e1cfb19a7e24f9654c55e9aae98d0df6fe1079853" Mar 18 12:45:46 crc kubenswrapper[4975]: I0318 12:45:46.841170 4975 scope.go:117] "RemoveContainer" containerID="c24bc886fe76fee861d9e3784ac38bf79719db5373039449e0204aa012969f6a" Mar 18 12:45:46 crc kubenswrapper[4975]: I0318 12:45:46.858253 4975 scope.go:117] "RemoveContainer" containerID="34ebf7676cccf02b56a0c82eb244d1f089a7da2bb1ba8030bd159c48fc00ebde" Mar 18 12:45:53 crc kubenswrapper[4975]: E0318 12:45:53.500745 4975 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee1b246b_e80c_4d00_ab5e_013460f7e886.slice/crio-dfe883cdcfe6015b922ab7f8abff55f1852894d7206843e35176759d1e814f8d.scope\": RecentStats: unable to find data in memory cache]" Mar 18 12:45:57 crc kubenswrapper[4975]: I0318 12:45:57.394042 4975 generic.go:334] "Generic (PLEG): container finished" podID="ee1b246b-e80c-4d00-ab5e-013460f7e886" containerID="dfe883cdcfe6015b922ab7f8abff55f1852894d7206843e35176759d1e814f8d" exitCode=0 Mar 18 12:45:57 crc kubenswrapper[4975]: I0318 12:45:57.394140 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49rnp" event={"ID":"ee1b246b-e80c-4d00-ab5e-013460f7e886","Type":"ContainerDied","Data":"dfe883cdcfe6015b922ab7f8abff55f1852894d7206843e35176759d1e814f8d"} Mar 18 12:45:58 crc kubenswrapper[4975]: I0318 12:45:58.807180 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49rnp" Mar 18 12:45:58 crc kubenswrapper[4975]: I0318 12:45:58.852609 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee1b246b-e80c-4d00-ab5e-013460f7e886-inventory\") pod \"ee1b246b-e80c-4d00-ab5e-013460f7e886\" (UID: \"ee1b246b-e80c-4d00-ab5e-013460f7e886\") " Mar 18 12:45:58 crc kubenswrapper[4975]: I0318 12:45:58.852730 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skj6z\" (UniqueName: \"kubernetes.io/projected/ee1b246b-e80c-4d00-ab5e-013460f7e886-kube-api-access-skj6z\") pod \"ee1b246b-e80c-4d00-ab5e-013460f7e886\" (UID: \"ee1b246b-e80c-4d00-ab5e-013460f7e886\") " Mar 18 12:45:58 crc kubenswrapper[4975]: I0318 12:45:58.852876 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee1b246b-e80c-4d00-ab5e-013460f7e886-ssh-key-openstack-edpm-ipam\") pod \"ee1b246b-e80c-4d00-ab5e-013460f7e886\" (UID: \"ee1b246b-e80c-4d00-ab5e-013460f7e886\") " Mar 18 12:45:58 crc kubenswrapper[4975]: I0318 12:45:58.872102 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee1b246b-e80c-4d00-ab5e-013460f7e886-kube-api-access-skj6z" (OuterVolumeSpecName: "kube-api-access-skj6z") pod "ee1b246b-e80c-4d00-ab5e-013460f7e886" (UID: "ee1b246b-e80c-4d00-ab5e-013460f7e886"). InnerVolumeSpecName "kube-api-access-skj6z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:45:58 crc kubenswrapper[4975]: I0318 12:45:58.964812 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skj6z\" (UniqueName: \"kubernetes.io/projected/ee1b246b-e80c-4d00-ab5e-013460f7e886-kube-api-access-skj6z\") on node \"crc\" DevicePath \"\"" Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.002090 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee1b246b-e80c-4d00-ab5e-013460f7e886-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ee1b246b-e80c-4d00-ab5e-013460f7e886" (UID: "ee1b246b-e80c-4d00-ab5e-013460f7e886"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.002446 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee1b246b-e80c-4d00-ab5e-013460f7e886-inventory" (OuterVolumeSpecName: "inventory") pod "ee1b246b-e80c-4d00-ab5e-013460f7e886" (UID: "ee1b246b-e80c-4d00-ab5e-013460f7e886"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.067959 4975 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee1b246b-e80c-4d00-ab5e-013460f7e886-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.068140 4975 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee1b246b-e80c-4d00-ab5e-013460f7e886-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.412705 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49rnp" event={"ID":"ee1b246b-e80c-4d00-ab5e-013460f7e886","Type":"ContainerDied","Data":"ffb3ce43ad99f6225a3c1d5526d6b05f8fc71b4c512952336011023b3cab50c8"} Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.412745 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffb3ce43ad99f6225a3c1d5526d6b05f8fc71b4c512952336011023b3cab50c8" Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.412770 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-49rnp" Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.526336 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw"] Mar 18 12:45:59 crc kubenswrapper[4975]: E0318 12:45:59.526727 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3c91b58-13e4-4d0a-93e8-0ec185187b3a" containerName="collect-profiles" Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.526744 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c91b58-13e4-4d0a-93e8-0ec185187b3a" containerName="collect-profiles" Mar 18 12:45:59 crc kubenswrapper[4975]: E0318 12:45:59.526788 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee1b246b-e80c-4d00-ab5e-013460f7e886" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.526797 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee1b246b-e80c-4d00-ab5e-013460f7e886" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.527006 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee1b246b-e80c-4d00-ab5e-013460f7e886" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.527036 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3c91b58-13e4-4d0a-93e8-0ec185187b3a" containerName="collect-profiles" Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.527631 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw" Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.529818 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.530067 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.533598 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-77rz6" Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.534891 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.546505 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw"] Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.577117 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtxkv\" (UniqueName: \"kubernetes.io/projected/48982599-8240-4f82-8932-5ded98c54dfa-kube-api-access-jtxkv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw\" (UID: \"48982599-8240-4f82-8932-5ded98c54dfa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw" Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.577275 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/48982599-8240-4f82-8932-5ded98c54dfa-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw\" (UID: \"48982599-8240-4f82-8932-5ded98c54dfa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw" Mar 18 
12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.577310 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/48982599-8240-4f82-8932-5ded98c54dfa-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw\" (UID: \"48982599-8240-4f82-8932-5ded98c54dfa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw" Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.678879 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/48982599-8240-4f82-8932-5ded98c54dfa-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw\" (UID: \"48982599-8240-4f82-8932-5ded98c54dfa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw" Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.678927 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/48982599-8240-4f82-8932-5ded98c54dfa-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw\" (UID: \"48982599-8240-4f82-8932-5ded98c54dfa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw" Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.679028 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtxkv\" (UniqueName: \"kubernetes.io/projected/48982599-8240-4f82-8932-5ded98c54dfa-kube-api-access-jtxkv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw\" (UID: \"48982599-8240-4f82-8932-5ded98c54dfa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw" Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.684153 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/48982599-8240-4f82-8932-5ded98c54dfa-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw\" (UID: \"48982599-8240-4f82-8932-5ded98c54dfa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw" Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.688696 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/48982599-8240-4f82-8932-5ded98c54dfa-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw\" (UID: \"48982599-8240-4f82-8932-5ded98c54dfa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw" Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.696314 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtxkv\" (UniqueName: \"kubernetes.io/projected/48982599-8240-4f82-8932-5ded98c54dfa-kube-api-access-jtxkv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw\" (UID: \"48982599-8240-4f82-8932-5ded98c54dfa\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw" Mar 18 12:45:59 crc kubenswrapper[4975]: I0318 12:45:59.861537 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw" Mar 18 12:46:00 crc kubenswrapper[4975]: I0318 12:46:00.143342 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563966-g2gkq"] Mar 18 12:46:00 crc kubenswrapper[4975]: I0318 12:46:00.145348 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563966-g2gkq" Mar 18 12:46:00 crc kubenswrapper[4975]: I0318 12:46:00.154285 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:46:00 crc kubenswrapper[4975]: I0318 12:46:00.154405 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 12:46:00 crc kubenswrapper[4975]: I0318 12:46:00.154284 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:46:00 crc kubenswrapper[4975]: I0318 12:46:00.156081 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563966-g2gkq"] Mar 18 12:46:00 crc kubenswrapper[4975]: I0318 12:46:00.283582 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw"] Mar 18 12:46:00 crc kubenswrapper[4975]: I0318 12:46:00.293569 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl7qp\" (UniqueName: \"kubernetes.io/projected/fcb54f59-ccc7-48db-a359-7eb65c270001-kube-api-access-tl7qp\") pod \"auto-csr-approver-29563966-g2gkq\" (UID: \"fcb54f59-ccc7-48db-a359-7eb65c270001\") " pod="openshift-infra/auto-csr-approver-29563966-g2gkq" Mar 18 12:46:00 crc kubenswrapper[4975]: I0318 12:46:00.395537 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl7qp\" (UniqueName: \"kubernetes.io/projected/fcb54f59-ccc7-48db-a359-7eb65c270001-kube-api-access-tl7qp\") pod \"auto-csr-approver-29563966-g2gkq\" (UID: \"fcb54f59-ccc7-48db-a359-7eb65c270001\") " pod="openshift-infra/auto-csr-approver-29563966-g2gkq" Mar 18 12:46:00 crc kubenswrapper[4975]: I0318 12:46:00.415734 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl7qp\" 
(UniqueName: \"kubernetes.io/projected/fcb54f59-ccc7-48db-a359-7eb65c270001-kube-api-access-tl7qp\") pod \"auto-csr-approver-29563966-g2gkq\" (UID: \"fcb54f59-ccc7-48db-a359-7eb65c270001\") " pod="openshift-infra/auto-csr-approver-29563966-g2gkq" Mar 18 12:46:00 crc kubenswrapper[4975]: I0318 12:46:00.421784 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw" event={"ID":"48982599-8240-4f82-8932-5ded98c54dfa","Type":"ContainerStarted","Data":"89e62302542bd74cc1b94600ba190e06def47f30a8c55f5b7bf25d4d1a4e45bd"} Mar 18 12:46:00 crc kubenswrapper[4975]: I0318 12:46:00.471096 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563966-g2gkq" Mar 18 12:46:00 crc kubenswrapper[4975]: I0318 12:46:00.922409 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563966-g2gkq"] Mar 18 12:46:01 crc kubenswrapper[4975]: I0318 12:46:01.439663 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563966-g2gkq" event={"ID":"fcb54f59-ccc7-48db-a359-7eb65c270001","Type":"ContainerStarted","Data":"eb363a544225b7417fb2c3cae7e0baae256051671210fa85fad87f2394005ad0"} Mar 18 12:46:01 crc kubenswrapper[4975]: I0318 12:46:01.442778 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw" event={"ID":"48982599-8240-4f82-8932-5ded98c54dfa","Type":"ContainerStarted","Data":"a2fd9f67cfb70e255834a72488823e525cbc399d46265a7ebe40590bfb300462"} Mar 18 12:46:01 crc kubenswrapper[4975]: I0318 12:46:01.473124 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw" podStartSLOduration=2.301311378 podStartE2EDuration="2.473050397s" podCreationTimestamp="2026-03-18 12:45:59 +0000 UTC" firstStartedPulling="2026-03-18 
12:46:00.295698152 +0000 UTC m=+2146.010098751" lastFinishedPulling="2026-03-18 12:46:00.467437191 +0000 UTC m=+2146.181837770" observedRunningTime="2026-03-18 12:46:01.470727913 +0000 UTC m=+2147.185128492" watchObservedRunningTime="2026-03-18 12:46:01.473050397 +0000 UTC m=+2147.187451006" Mar 18 12:46:02 crc kubenswrapper[4975]: I0318 12:46:02.455396 4975 generic.go:334] "Generic (PLEG): container finished" podID="fcb54f59-ccc7-48db-a359-7eb65c270001" containerID="6a5f5008b343d134c1e07a951ef72843112e2d6a6f79b89508a2639e6d63f409" exitCode=0 Mar 18 12:46:02 crc kubenswrapper[4975]: I0318 12:46:02.455497 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563966-g2gkq" event={"ID":"fcb54f59-ccc7-48db-a359-7eb65c270001","Type":"ContainerDied","Data":"6a5f5008b343d134c1e07a951ef72843112e2d6a6f79b89508a2639e6d63f409"} Mar 18 12:46:03 crc kubenswrapper[4975]: I0318 12:46:03.843085 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563966-g2gkq" Mar 18 12:46:03 crc kubenswrapper[4975]: I0318 12:46:03.962755 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl7qp\" (UniqueName: \"kubernetes.io/projected/fcb54f59-ccc7-48db-a359-7eb65c270001-kube-api-access-tl7qp\") pod \"fcb54f59-ccc7-48db-a359-7eb65c270001\" (UID: \"fcb54f59-ccc7-48db-a359-7eb65c270001\") " Mar 18 12:46:03 crc kubenswrapper[4975]: I0318 12:46:03.969177 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcb54f59-ccc7-48db-a359-7eb65c270001-kube-api-access-tl7qp" (OuterVolumeSpecName: "kube-api-access-tl7qp") pod "fcb54f59-ccc7-48db-a359-7eb65c270001" (UID: "fcb54f59-ccc7-48db-a359-7eb65c270001"). InnerVolumeSpecName "kube-api-access-tl7qp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:46:04 crc kubenswrapper[4975]: I0318 12:46:04.064726 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl7qp\" (UniqueName: \"kubernetes.io/projected/fcb54f59-ccc7-48db-a359-7eb65c270001-kube-api-access-tl7qp\") on node \"crc\" DevicePath \"\"" Mar 18 12:46:04 crc kubenswrapper[4975]: I0318 12:46:04.477365 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563966-g2gkq" event={"ID":"fcb54f59-ccc7-48db-a359-7eb65c270001","Type":"ContainerDied","Data":"eb363a544225b7417fb2c3cae7e0baae256051671210fa85fad87f2394005ad0"} Mar 18 12:46:04 crc kubenswrapper[4975]: I0318 12:46:04.477937 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb363a544225b7417fb2c3cae7e0baae256051671210fa85fad87f2394005ad0" Mar 18 12:46:04 crc kubenswrapper[4975]: I0318 12:46:04.477426 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563966-g2gkq" Mar 18 12:46:04 crc kubenswrapper[4975]: I0318 12:46:04.928113 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563960-6mgcg"] Mar 18 12:46:04 crc kubenswrapper[4975]: I0318 12:46:04.937463 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563960-6mgcg"] Mar 18 12:46:05 crc kubenswrapper[4975]: I0318 12:46:05.029564 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a9093ac-3b95-4bca-9455-ccf2768d9da2" path="/var/lib/kubelet/pods/7a9093ac-3b95-4bca-9455-ccf2768d9da2/volumes" Mar 18 12:46:05 crc kubenswrapper[4975]: I0318 12:46:05.490114 4975 generic.go:334] "Generic (PLEG): container finished" podID="48982599-8240-4f82-8932-5ded98c54dfa" containerID="a2fd9f67cfb70e255834a72488823e525cbc399d46265a7ebe40590bfb300462" exitCode=0 Mar 18 12:46:05 crc kubenswrapper[4975]: I0318 12:46:05.490167 4975 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw" event={"ID":"48982599-8240-4f82-8932-5ded98c54dfa","Type":"ContainerDied","Data":"a2fd9f67cfb70e255834a72488823e525cbc399d46265a7ebe40590bfb300462"} Mar 18 12:46:06 crc kubenswrapper[4975]: I0318 12:46:06.962045 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.023961 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/48982599-8240-4f82-8932-5ded98c54dfa-inventory\") pod \"48982599-8240-4f82-8932-5ded98c54dfa\" (UID: \"48982599-8240-4f82-8932-5ded98c54dfa\") " Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.024090 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtxkv\" (UniqueName: \"kubernetes.io/projected/48982599-8240-4f82-8932-5ded98c54dfa-kube-api-access-jtxkv\") pod \"48982599-8240-4f82-8932-5ded98c54dfa\" (UID: \"48982599-8240-4f82-8932-5ded98c54dfa\") " Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.024290 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/48982599-8240-4f82-8932-5ded98c54dfa-ssh-key-openstack-edpm-ipam\") pod \"48982599-8240-4f82-8932-5ded98c54dfa\" (UID: \"48982599-8240-4f82-8932-5ded98c54dfa\") " Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.034225 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48982599-8240-4f82-8932-5ded98c54dfa-kube-api-access-jtxkv" (OuterVolumeSpecName: "kube-api-access-jtxkv") pod "48982599-8240-4f82-8932-5ded98c54dfa" (UID: "48982599-8240-4f82-8932-5ded98c54dfa"). InnerVolumeSpecName "kube-api-access-jtxkv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.064913 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48982599-8240-4f82-8932-5ded98c54dfa-inventory" (OuterVolumeSpecName: "inventory") pod "48982599-8240-4f82-8932-5ded98c54dfa" (UID: "48982599-8240-4f82-8932-5ded98c54dfa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.069970 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48982599-8240-4f82-8932-5ded98c54dfa-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "48982599-8240-4f82-8932-5ded98c54dfa" (UID: "48982599-8240-4f82-8932-5ded98c54dfa"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.127080 4975 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/48982599-8240-4f82-8932-5ded98c54dfa-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.127116 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtxkv\" (UniqueName: \"kubernetes.io/projected/48982599-8240-4f82-8932-5ded98c54dfa-kube-api-access-jtxkv\") on node \"crc\" DevicePath \"\"" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.127131 4975 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/48982599-8240-4f82-8932-5ded98c54dfa-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.515692 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw" 
event={"ID":"48982599-8240-4f82-8932-5ded98c54dfa","Type":"ContainerDied","Data":"89e62302542bd74cc1b94600ba190e06def47f30a8c55f5b7bf25d4d1a4e45bd"} Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.515781 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89e62302542bd74cc1b94600ba190e06def47f30a8c55f5b7bf25d4d1a4e45bd" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.515933 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.623039 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-t9nfn"] Mar 18 12:46:07 crc kubenswrapper[4975]: E0318 12:46:07.623536 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb54f59-ccc7-48db-a359-7eb65c270001" containerName="oc" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.623558 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb54f59-ccc7-48db-a359-7eb65c270001" containerName="oc" Mar 18 12:46:07 crc kubenswrapper[4975]: E0318 12:46:07.623596 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48982599-8240-4f82-8932-5ded98c54dfa" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.623608 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="48982599-8240-4f82-8932-5ded98c54dfa" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.623828 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb54f59-ccc7-48db-a359-7eb65c270001" containerName="oc" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.623855 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="48982599-8240-4f82-8932-5ded98c54dfa" 
containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.624500 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t9nfn" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.626329 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-77rz6" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.626630 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.626772 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.627634 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.634564 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-t9nfn"] Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.736805 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp2dc\" (UniqueName: \"kubernetes.io/projected/7b64debe-3d0c-4e5f-b45a-69c21b528483-kube-api-access-hp2dc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-t9nfn\" (UID: \"7b64debe-3d0c-4e5f-b45a-69c21b528483\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t9nfn" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.736909 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b64debe-3d0c-4e5f-b45a-69c21b528483-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-t9nfn\" (UID: 
\"7b64debe-3d0c-4e5f-b45a-69c21b528483\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t9nfn" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.736972 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b64debe-3d0c-4e5f-b45a-69c21b528483-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-t9nfn\" (UID: \"7b64debe-3d0c-4e5f-b45a-69c21b528483\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t9nfn" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.838178 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp2dc\" (UniqueName: \"kubernetes.io/projected/7b64debe-3d0c-4e5f-b45a-69c21b528483-kube-api-access-hp2dc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-t9nfn\" (UID: \"7b64debe-3d0c-4e5f-b45a-69c21b528483\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t9nfn" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.838298 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b64debe-3d0c-4e5f-b45a-69c21b528483-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-t9nfn\" (UID: \"7b64debe-3d0c-4e5f-b45a-69c21b528483\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t9nfn" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.838369 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b64debe-3d0c-4e5f-b45a-69c21b528483-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-t9nfn\" (UID: \"7b64debe-3d0c-4e5f-b45a-69c21b528483\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t9nfn" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.844461 4975 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b64debe-3d0c-4e5f-b45a-69c21b528483-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-t9nfn\" (UID: \"7b64debe-3d0c-4e5f-b45a-69c21b528483\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t9nfn" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.844718 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b64debe-3d0c-4e5f-b45a-69c21b528483-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-t9nfn\" (UID: \"7b64debe-3d0c-4e5f-b45a-69c21b528483\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t9nfn" Mar 18 12:46:07 crc kubenswrapper[4975]: I0318 12:46:07.857926 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp2dc\" (UniqueName: \"kubernetes.io/projected/7b64debe-3d0c-4e5f-b45a-69c21b528483-kube-api-access-hp2dc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-t9nfn\" (UID: \"7b64debe-3d0c-4e5f-b45a-69c21b528483\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t9nfn" Mar 18 12:46:08 crc kubenswrapper[4975]: I0318 12:46:08.001761 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t9nfn" Mar 18 12:46:08 crc kubenswrapper[4975]: I0318 12:46:08.051992 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-x6ztn"] Mar 18 12:46:08 crc kubenswrapper[4975]: I0318 12:46:08.061293 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-x6ztn"] Mar 18 12:46:08 crc kubenswrapper[4975]: I0318 12:46:08.551661 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-t9nfn"] Mar 18 12:46:09 crc kubenswrapper[4975]: I0318 12:46:09.060587 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b1267e7-5d6a-43f3-aa6b-e864972b60f6" path="/var/lib/kubelet/pods/1b1267e7-5d6a-43f3-aa6b-e864972b60f6/volumes" Mar 18 12:46:09 crc kubenswrapper[4975]: I0318 12:46:09.061502 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c2rjz"] Mar 18 12:46:09 crc kubenswrapper[4975]: I0318 12:46:09.072122 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c2rjz"] Mar 18 12:46:09 crc kubenswrapper[4975]: I0318 12:46:09.539266 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t9nfn" event={"ID":"7b64debe-3d0c-4e5f-b45a-69c21b528483","Type":"ContainerStarted","Data":"20c45fa77c6aacc83b6b218bd9f5513927d61c4652e7747faeb99e3aa6eb23fd"} Mar 18 12:46:09 crc kubenswrapper[4975]: I0318 12:46:09.539312 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t9nfn" event={"ID":"7b64debe-3d0c-4e5f-b45a-69c21b528483","Type":"ContainerStarted","Data":"42c55d12cffda7bedfb581b5b583c51d0f091996554ab4796f888206b41fc538"} Mar 18 12:46:09 crc kubenswrapper[4975]: I0318 12:46:09.566981 4975 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t9nfn" podStartSLOduration=2.409887214 podStartE2EDuration="2.566951351s" podCreationTimestamp="2026-03-18 12:46:07 +0000 UTC" firstStartedPulling="2026-03-18 12:46:08.554302851 +0000 UTC m=+2154.268703430" lastFinishedPulling="2026-03-18 12:46:08.711366988 +0000 UTC m=+2154.425767567" observedRunningTime="2026-03-18 12:46:09.553746418 +0000 UTC m=+2155.268146997" watchObservedRunningTime="2026-03-18 12:46:09.566951351 +0000 UTC m=+2155.281351960" Mar 18 12:46:11 crc kubenswrapper[4975]: I0318 12:46:11.029695 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bba9a490-9803-4bc8-a775-1d8711fd3b41" path="/var/lib/kubelet/pods/bba9a490-9803-4bc8-a775-1d8711fd3b41/volumes" Mar 18 12:46:46 crc kubenswrapper[4975]: I0318 12:46:46.994253 4975 scope.go:117] "RemoveContainer" containerID="a142026bbb15be57597ac55d56a54c57db2e165d7de332634d1a83cf4f8cfbc4" Mar 18 12:46:47 crc kubenswrapper[4975]: I0318 12:46:47.040241 4975 scope.go:117] "RemoveContainer" containerID="fb7f4f09936e5e6dc05002d41147fc0603b7a6cec9299ef6692fa873c5ffc049" Mar 18 12:46:47 crc kubenswrapper[4975]: I0318 12:46:47.089099 4975 scope.go:117] "RemoveContainer" containerID="17303a59b3af9e610f1de5900d220c97ff01518d52ba803677b9a012c93e84fb" Mar 18 12:46:51 crc kubenswrapper[4975]: I0318 12:46:51.925281 4975 generic.go:334] "Generic (PLEG): container finished" podID="7b64debe-3d0c-4e5f-b45a-69c21b528483" containerID="20c45fa77c6aacc83b6b218bd9f5513927d61c4652e7747faeb99e3aa6eb23fd" exitCode=0 Mar 18 12:46:51 crc kubenswrapper[4975]: I0318 12:46:51.925415 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t9nfn" event={"ID":"7b64debe-3d0c-4e5f-b45a-69c21b528483","Type":"ContainerDied","Data":"20c45fa77c6aacc83b6b218bd9f5513927d61c4652e7747faeb99e3aa6eb23fd"} Mar 18 12:46:52 crc kubenswrapper[4975]: I0318 12:46:52.059716 4975 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-6frd5"] Mar 18 12:46:52 crc kubenswrapper[4975]: I0318 12:46:52.069812 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-6frd5"] Mar 18 12:46:53 crc kubenswrapper[4975]: I0318 12:46:53.028213 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44e07ebf-56a4-49eb-a2fe-61185a762c7b" path="/var/lib/kubelet/pods/44e07ebf-56a4-49eb-a2fe-61185a762c7b/volumes" Mar 18 12:46:53 crc kubenswrapper[4975]: I0318 12:46:53.368703 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t9nfn" Mar 18 12:46:53 crc kubenswrapper[4975]: I0318 12:46:53.514116 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp2dc\" (UniqueName: \"kubernetes.io/projected/7b64debe-3d0c-4e5f-b45a-69c21b528483-kube-api-access-hp2dc\") pod \"7b64debe-3d0c-4e5f-b45a-69c21b528483\" (UID: \"7b64debe-3d0c-4e5f-b45a-69c21b528483\") " Mar 18 12:46:53 crc kubenswrapper[4975]: I0318 12:46:53.514279 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b64debe-3d0c-4e5f-b45a-69c21b528483-ssh-key-openstack-edpm-ipam\") pod \"7b64debe-3d0c-4e5f-b45a-69c21b528483\" (UID: \"7b64debe-3d0c-4e5f-b45a-69c21b528483\") " Mar 18 12:46:53 crc kubenswrapper[4975]: I0318 12:46:53.514296 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b64debe-3d0c-4e5f-b45a-69c21b528483-inventory\") pod \"7b64debe-3d0c-4e5f-b45a-69c21b528483\" (UID: \"7b64debe-3d0c-4e5f-b45a-69c21b528483\") " Mar 18 12:46:53 crc kubenswrapper[4975]: I0318 12:46:53.520273 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7b64debe-3d0c-4e5f-b45a-69c21b528483-kube-api-access-hp2dc" (OuterVolumeSpecName: "kube-api-access-hp2dc") pod "7b64debe-3d0c-4e5f-b45a-69c21b528483" (UID: "7b64debe-3d0c-4e5f-b45a-69c21b528483"). InnerVolumeSpecName "kube-api-access-hp2dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:46:53 crc kubenswrapper[4975]: I0318 12:46:53.549642 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b64debe-3d0c-4e5f-b45a-69c21b528483-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7b64debe-3d0c-4e5f-b45a-69c21b528483" (UID: "7b64debe-3d0c-4e5f-b45a-69c21b528483"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:46:53 crc kubenswrapper[4975]: I0318 12:46:53.553059 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b64debe-3d0c-4e5f-b45a-69c21b528483-inventory" (OuterVolumeSpecName: "inventory") pod "7b64debe-3d0c-4e5f-b45a-69c21b528483" (UID: "7b64debe-3d0c-4e5f-b45a-69c21b528483"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:46:53 crc kubenswrapper[4975]: I0318 12:46:53.616908 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp2dc\" (UniqueName: \"kubernetes.io/projected/7b64debe-3d0c-4e5f-b45a-69c21b528483-kube-api-access-hp2dc\") on node \"crc\" DevicePath \"\"" Mar 18 12:46:53 crc kubenswrapper[4975]: I0318 12:46:53.617089 4975 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b64debe-3d0c-4e5f-b45a-69c21b528483-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:46:53 crc kubenswrapper[4975]: I0318 12:46:53.617174 4975 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b64debe-3d0c-4e5f-b45a-69c21b528483-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 12:46:53 crc kubenswrapper[4975]: I0318 12:46:53.949021 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t9nfn" event={"ID":"7b64debe-3d0c-4e5f-b45a-69c21b528483","Type":"ContainerDied","Data":"42c55d12cffda7bedfb581b5b583c51d0f091996554ab4796f888206b41fc538"} Mar 18 12:46:53 crc kubenswrapper[4975]: I0318 12:46:53.949464 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42c55d12cffda7bedfb581b5b583c51d0f091996554ab4796f888206b41fc538" Mar 18 12:46:53 crc kubenswrapper[4975]: I0318 12:46:53.949081 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t9nfn" Mar 18 12:46:54 crc kubenswrapper[4975]: I0318 12:46:54.117851 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f44vg"] Mar 18 12:46:54 crc kubenswrapper[4975]: E0318 12:46:54.118334 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b64debe-3d0c-4e5f-b45a-69c21b528483" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 12:46:54 crc kubenswrapper[4975]: I0318 12:46:54.118350 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b64debe-3d0c-4e5f-b45a-69c21b528483" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 12:46:54 crc kubenswrapper[4975]: I0318 12:46:54.118618 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b64debe-3d0c-4e5f-b45a-69c21b528483" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 18 12:46:54 crc kubenswrapper[4975]: I0318 12:46:54.119436 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f44vg" Mar 18 12:46:54 crc kubenswrapper[4975]: I0318 12:46:54.123097 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:46:54 crc kubenswrapper[4975]: I0318 12:46:54.123267 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-77rz6" Mar 18 12:46:54 crc kubenswrapper[4975]: I0318 12:46:54.123575 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:46:54 crc kubenswrapper[4975]: I0318 12:46:54.123586 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:46:54 crc kubenswrapper[4975]: I0318 12:46:54.127932 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f44vg"] Mar 18 12:46:54 crc kubenswrapper[4975]: I0318 12:46:54.227494 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d5cb9baa-6a68-426f-88b4-4cb896f260e7-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f44vg\" (UID: \"d5cb9baa-6a68-426f-88b4-4cb896f260e7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f44vg" Mar 18 12:46:54 crc kubenswrapper[4975]: I0318 12:46:54.227542 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5cb9baa-6a68-426f-88b4-4cb896f260e7-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f44vg\" (UID: \"d5cb9baa-6a68-426f-88b4-4cb896f260e7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f44vg" Mar 18 12:46:54 crc kubenswrapper[4975]: I0318 12:46:54.227953 
4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzqcg\" (UniqueName: \"kubernetes.io/projected/d5cb9baa-6a68-426f-88b4-4cb896f260e7-kube-api-access-jzqcg\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f44vg\" (UID: \"d5cb9baa-6a68-426f-88b4-4cb896f260e7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f44vg" Mar 18 12:46:54 crc kubenswrapper[4975]: I0318 12:46:54.330038 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzqcg\" (UniqueName: \"kubernetes.io/projected/d5cb9baa-6a68-426f-88b4-4cb896f260e7-kube-api-access-jzqcg\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f44vg\" (UID: \"d5cb9baa-6a68-426f-88b4-4cb896f260e7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f44vg" Mar 18 12:46:54 crc kubenswrapper[4975]: I0318 12:46:54.330108 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d5cb9baa-6a68-426f-88b4-4cb896f260e7-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f44vg\" (UID: \"d5cb9baa-6a68-426f-88b4-4cb896f260e7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f44vg" Mar 18 12:46:54 crc kubenswrapper[4975]: I0318 12:46:54.330134 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5cb9baa-6a68-426f-88b4-4cb896f260e7-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f44vg\" (UID: \"d5cb9baa-6a68-426f-88b4-4cb896f260e7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f44vg" Mar 18 12:46:54 crc kubenswrapper[4975]: I0318 12:46:54.344776 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/d5cb9baa-6a68-426f-88b4-4cb896f260e7-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f44vg\" (UID: \"d5cb9baa-6a68-426f-88b4-4cb896f260e7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f44vg" Mar 18 12:46:54 crc kubenswrapper[4975]: I0318 12:46:54.346595 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5cb9baa-6a68-426f-88b4-4cb896f260e7-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f44vg\" (UID: \"d5cb9baa-6a68-426f-88b4-4cb896f260e7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f44vg" Mar 18 12:46:54 crc kubenswrapper[4975]: I0318 12:46:54.347505 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzqcg\" (UniqueName: \"kubernetes.io/projected/d5cb9baa-6a68-426f-88b4-4cb896f260e7-kube-api-access-jzqcg\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f44vg\" (UID: \"d5cb9baa-6a68-426f-88b4-4cb896f260e7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f44vg" Mar 18 12:46:54 crc kubenswrapper[4975]: I0318 12:46:54.487521 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f44vg" Mar 18 12:46:55 crc kubenswrapper[4975]: I0318 12:46:55.034437 4975 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 12:46:55 crc kubenswrapper[4975]: I0318 12:46:55.047156 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f44vg"] Mar 18 12:46:55 crc kubenswrapper[4975]: I0318 12:46:55.538991 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:46:55 crc kubenswrapper[4975]: I0318 12:46:55.539421 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:46:55 crc kubenswrapper[4975]: I0318 12:46:55.966730 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f44vg" event={"ID":"d5cb9baa-6a68-426f-88b4-4cb896f260e7","Type":"ContainerStarted","Data":"6a59b51766e429e2577f00a28287d9f6437f848cfab82adf95a9e63197f16435"} Mar 18 12:46:55 crc kubenswrapper[4975]: I0318 12:46:55.966773 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f44vg" event={"ID":"d5cb9baa-6a68-426f-88b4-4cb896f260e7","Type":"ContainerStarted","Data":"e25e76569725769faffda61c90d89df7c9c02b8304d6f26fb95aff914603a1b2"} Mar 18 12:46:55 crc kubenswrapper[4975]: I0318 12:46:55.988550 4975 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f44vg" podStartSLOduration=1.794113528 podStartE2EDuration="1.988509369s" podCreationTimestamp="2026-03-18 12:46:54 +0000 UTC" firstStartedPulling="2026-03-18 12:46:55.034160638 +0000 UTC m=+2200.748561207" lastFinishedPulling="2026-03-18 12:46:55.228556469 +0000 UTC m=+2200.942957048" observedRunningTime="2026-03-18 12:46:55.979486572 +0000 UTC m=+2201.693887161" watchObservedRunningTime="2026-03-18 12:46:55.988509369 +0000 UTC m=+2201.702909948" Mar 18 12:47:25 crc kubenswrapper[4975]: I0318 12:47:25.539055 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:47:25 crc kubenswrapper[4975]: I0318 12:47:25.539974 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:47:46 crc kubenswrapper[4975]: I0318 12:47:46.455974 4975 generic.go:334] "Generic (PLEG): container finished" podID="d5cb9baa-6a68-426f-88b4-4cb896f260e7" containerID="6a59b51766e429e2577f00a28287d9f6437f848cfab82adf95a9e63197f16435" exitCode=0 Mar 18 12:47:46 crc kubenswrapper[4975]: I0318 12:47:46.456085 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f44vg" event={"ID":"d5cb9baa-6a68-426f-88b4-4cb896f260e7","Type":"ContainerDied","Data":"6a59b51766e429e2577f00a28287d9f6437f848cfab82adf95a9e63197f16435"} Mar 18 12:47:47 crc kubenswrapper[4975]: I0318 12:47:47.182552 4975 scope.go:117] "RemoveContainer" 
containerID="166ab5797c69ef8c0c3384dc550fff2a51a45219226508f65e6dc01ac953ab3a" Mar 18 12:47:47 crc kubenswrapper[4975]: I0318 12:47:47.971169 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f44vg" Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.117146 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d5cb9baa-6a68-426f-88b4-4cb896f260e7-ssh-key-openstack-edpm-ipam\") pod \"d5cb9baa-6a68-426f-88b4-4cb896f260e7\" (UID: \"d5cb9baa-6a68-426f-88b4-4cb896f260e7\") " Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.117528 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5cb9baa-6a68-426f-88b4-4cb896f260e7-inventory\") pod \"d5cb9baa-6a68-426f-88b4-4cb896f260e7\" (UID: \"d5cb9baa-6a68-426f-88b4-4cb896f260e7\") " Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.117734 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzqcg\" (UniqueName: \"kubernetes.io/projected/d5cb9baa-6a68-426f-88b4-4cb896f260e7-kube-api-access-jzqcg\") pod \"d5cb9baa-6a68-426f-88b4-4cb896f260e7\" (UID: \"d5cb9baa-6a68-426f-88b4-4cb896f260e7\") " Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.125663 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5cb9baa-6a68-426f-88b4-4cb896f260e7-kube-api-access-jzqcg" (OuterVolumeSpecName: "kube-api-access-jzqcg") pod "d5cb9baa-6a68-426f-88b4-4cb896f260e7" (UID: "d5cb9baa-6a68-426f-88b4-4cb896f260e7"). InnerVolumeSpecName "kube-api-access-jzqcg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.145065 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5cb9baa-6a68-426f-88b4-4cb896f260e7-inventory" (OuterVolumeSpecName: "inventory") pod "d5cb9baa-6a68-426f-88b4-4cb896f260e7" (UID: "d5cb9baa-6a68-426f-88b4-4cb896f260e7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.149821 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5cb9baa-6a68-426f-88b4-4cb896f260e7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d5cb9baa-6a68-426f-88b4-4cb896f260e7" (UID: "d5cb9baa-6a68-426f-88b4-4cb896f260e7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.219900 4975 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d5cb9baa-6a68-426f-88b4-4cb896f260e7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.219931 4975 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5cb9baa-6a68-426f-88b4-4cb896f260e7-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.219941 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzqcg\" (UniqueName: \"kubernetes.io/projected/d5cb9baa-6a68-426f-88b4-4cb896f260e7-kube-api-access-jzqcg\") on node \"crc\" DevicePath \"\"" Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.481217 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f44vg" 
event={"ID":"d5cb9baa-6a68-426f-88b4-4cb896f260e7","Type":"ContainerDied","Data":"e25e76569725769faffda61c90d89df7c9c02b8304d6f26fb95aff914603a1b2"} Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.481524 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e25e76569725769faffda61c90d89df7c9c02b8304d6f26fb95aff914603a1b2" Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.481607 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f44vg" Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.589600 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-fl5xt"] Mar 18 12:47:48 crc kubenswrapper[4975]: E0318 12:47:48.590177 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5cb9baa-6a68-426f-88b4-4cb896f260e7" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.590215 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5cb9baa-6a68-426f-88b4-4cb896f260e7" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.590480 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5cb9baa-6a68-426f-88b4-4cb896f260e7" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.591375 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-fl5xt" Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.594259 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.595405 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-77rz6" Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.595834 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.595890 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.620726 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-fl5xt"] Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.628724 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-fl5xt\" (UID: \"bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2\") " pod="openstack/ssh-known-hosts-edpm-deployment-fl5xt" Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.628805 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtdkv\" (UniqueName: \"kubernetes.io/projected/bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2-kube-api-access-gtdkv\") pod \"ssh-known-hosts-edpm-deployment-fl5xt\" (UID: \"bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2\") " pod="openstack/ssh-known-hosts-edpm-deployment-fl5xt" Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.629015 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-fl5xt\" (UID: \"bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2\") " pod="openstack/ssh-known-hosts-edpm-deployment-fl5xt" Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.730596 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-fl5xt\" (UID: \"bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2\") " pod="openstack/ssh-known-hosts-edpm-deployment-fl5xt" Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.730652 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtdkv\" (UniqueName: \"kubernetes.io/projected/bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2-kube-api-access-gtdkv\") pod \"ssh-known-hosts-edpm-deployment-fl5xt\" (UID: \"bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2\") " pod="openstack/ssh-known-hosts-edpm-deployment-fl5xt" Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.730749 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-fl5xt\" (UID: \"bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2\") " pod="openstack/ssh-known-hosts-edpm-deployment-fl5xt" Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.735746 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-fl5xt\" (UID: \"bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2\") " pod="openstack/ssh-known-hosts-edpm-deployment-fl5xt" Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.737258 4975 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-fl5xt\" (UID: \"bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2\") " pod="openstack/ssh-known-hosts-edpm-deployment-fl5xt" Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.746210 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtdkv\" (UniqueName: \"kubernetes.io/projected/bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2-kube-api-access-gtdkv\") pod \"ssh-known-hosts-edpm-deployment-fl5xt\" (UID: \"bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2\") " pod="openstack/ssh-known-hosts-edpm-deployment-fl5xt" Mar 18 12:47:48 crc kubenswrapper[4975]: I0318 12:47:48.914571 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-fl5xt" Mar 18 12:47:49 crc kubenswrapper[4975]: I0318 12:47:49.497723 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-fl5xt"] Mar 18 12:47:50 crc kubenswrapper[4975]: I0318 12:47:50.498582 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-fl5xt" event={"ID":"bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2","Type":"ContainerStarted","Data":"12a7d4ab70aaf401412a75888969bb9562bdd1dd19737a697a788ba7309b8fd0"} Mar 18 12:47:50 crc kubenswrapper[4975]: I0318 12:47:50.499182 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-fl5xt" event={"ID":"bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2","Type":"ContainerStarted","Data":"ed916cd6868ee4d93e3ea9235b0a0d96c378704b778fc15b130c1194f86e7f96"} Mar 18 12:47:50 crc kubenswrapper[4975]: I0318 12:47:50.519309 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-fl5xt" 
podStartSLOduration=2.344743975 podStartE2EDuration="2.519278701s" podCreationTimestamp="2026-03-18 12:47:48 +0000 UTC" firstStartedPulling="2026-03-18 12:47:49.503843125 +0000 UTC m=+2255.218243704" lastFinishedPulling="2026-03-18 12:47:49.678377851 +0000 UTC m=+2255.392778430" observedRunningTime="2026-03-18 12:47:50.517309377 +0000 UTC m=+2256.231709966" watchObservedRunningTime="2026-03-18 12:47:50.519278701 +0000 UTC m=+2256.233679270" Mar 18 12:47:55 crc kubenswrapper[4975]: I0318 12:47:55.539338 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:47:55 crc kubenswrapper[4975]: I0318 12:47:55.539809 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:47:55 crc kubenswrapper[4975]: I0318 12:47:55.539856 4975 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 12:47:55 crc kubenswrapper[4975]: I0318 12:47:55.540646 4975 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3"} pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:47:55 crc kubenswrapper[4975]: I0318 12:47:55.540710 4975 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" containerID="cri-o://282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3" gracePeriod=600 Mar 18 12:47:55 crc kubenswrapper[4975]: E0318 12:47:55.684547 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:47:56 crc kubenswrapper[4975]: I0318 12:47:56.552634 4975 generic.go:334] "Generic (PLEG): container finished" podID="bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2" containerID="12a7d4ab70aaf401412a75888969bb9562bdd1dd19737a697a788ba7309b8fd0" exitCode=0 Mar 18 12:47:56 crc kubenswrapper[4975]: I0318 12:47:56.552682 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-fl5xt" event={"ID":"bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2","Type":"ContainerDied","Data":"12a7d4ab70aaf401412a75888969bb9562bdd1dd19737a697a788ba7309b8fd0"} Mar 18 12:47:56 crc kubenswrapper[4975]: I0318 12:47:56.559900 4975 generic.go:334] "Generic (PLEG): container finished" podID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3" exitCode=0 Mar 18 12:47:56 crc kubenswrapper[4975]: I0318 12:47:56.560003 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerDied","Data":"282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3"} Mar 18 12:47:56 crc kubenswrapper[4975]: I0318 12:47:56.560094 4975 scope.go:117] 
"RemoveContainer" containerID="e1445fab85886838c0ceb2338548e8dbe5d869da1c9483fa6052e6d794e0e6d2" Mar 18 12:47:56 crc kubenswrapper[4975]: I0318 12:47:56.560752 4975 scope.go:117] "RemoveContainer" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3" Mar 18 12:47:56 crc kubenswrapper[4975]: E0318 12:47:56.561089 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.060175 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-fl5xt" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.220360 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2-inventory-0\") pod \"bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2\" (UID: \"bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2\") " Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.220476 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtdkv\" (UniqueName: \"kubernetes.io/projected/bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2-kube-api-access-gtdkv\") pod \"bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2\" (UID: \"bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2\") " Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.220526 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2-ssh-key-openstack-edpm-ipam\") pod 
\"bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2\" (UID: \"bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2\") " Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.226456 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2-kube-api-access-gtdkv" (OuterVolumeSpecName: "kube-api-access-gtdkv") pod "bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2" (UID: "bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2"). InnerVolumeSpecName "kube-api-access-gtdkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.250030 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2" (UID: "bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.254707 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2" (UID: "bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.322720 4975 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.322751 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtdkv\" (UniqueName: \"kubernetes.io/projected/bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2-kube-api-access-gtdkv\") on node \"crc\" DevicePath \"\"" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.322762 4975 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.584232 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-fl5xt" event={"ID":"bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2","Type":"ContainerDied","Data":"ed916cd6868ee4d93e3ea9235b0a0d96c378704b778fc15b130c1194f86e7f96"} Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.584274 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed916cd6868ee4d93e3ea9235b0a0d96c378704b778fc15b130c1194f86e7f96" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.584335 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-fl5xt" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.648239 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6qcs"] Mar 18 12:47:58 crc kubenswrapper[4975]: E0318 12:47:58.648727 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2" containerName="ssh-known-hosts-edpm-deployment" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.648753 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2" containerName="ssh-known-hosts-edpm-deployment" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.649039 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2" containerName="ssh-known-hosts-edpm-deployment" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.649837 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6qcs" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.652165 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-77rz6" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.652794 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.653766 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.659470 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.670115 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6qcs"] Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.832592 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7508606e-78ed-475c-9386-b1cc11127fb1-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6qcs\" (UID: \"7508606e-78ed-475c-9386-b1cc11127fb1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6qcs" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.832763 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dskv\" (UniqueName: \"kubernetes.io/projected/7508606e-78ed-475c-9386-b1cc11127fb1-kube-api-access-8dskv\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6qcs\" (UID: \"7508606e-78ed-475c-9386-b1cc11127fb1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6qcs" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.832811 4975 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7508606e-78ed-475c-9386-b1cc11127fb1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6qcs\" (UID: \"7508606e-78ed-475c-9386-b1cc11127fb1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6qcs" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.935027 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dskv\" (UniqueName: \"kubernetes.io/projected/7508606e-78ed-475c-9386-b1cc11127fb1-kube-api-access-8dskv\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6qcs\" (UID: \"7508606e-78ed-475c-9386-b1cc11127fb1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6qcs" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.935115 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7508606e-78ed-475c-9386-b1cc11127fb1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6qcs\" (UID: \"7508606e-78ed-475c-9386-b1cc11127fb1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6qcs" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.935341 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7508606e-78ed-475c-9386-b1cc11127fb1-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6qcs\" (UID: \"7508606e-78ed-475c-9386-b1cc11127fb1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6qcs" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.938833 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7508606e-78ed-475c-9386-b1cc11127fb1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6qcs\" (UID: 
\"7508606e-78ed-475c-9386-b1cc11127fb1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6qcs" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.939145 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7508606e-78ed-475c-9386-b1cc11127fb1-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6qcs\" (UID: \"7508606e-78ed-475c-9386-b1cc11127fb1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6qcs" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.956796 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dskv\" (UniqueName: \"kubernetes.io/projected/7508606e-78ed-475c-9386-b1cc11127fb1-kube-api-access-8dskv\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6qcs\" (UID: \"7508606e-78ed-475c-9386-b1cc11127fb1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6qcs" Mar 18 12:47:58 crc kubenswrapper[4975]: I0318 12:47:58.973937 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6qcs" Mar 18 12:47:59 crc kubenswrapper[4975]: I0318 12:47:59.512570 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6qcs"] Mar 18 12:47:59 crc kubenswrapper[4975]: I0318 12:47:59.592435 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6qcs" event={"ID":"7508606e-78ed-475c-9386-b1cc11127fb1","Type":"ContainerStarted","Data":"d0375d052847c8d209c420251f997d41b47b18b23f42dd1780895e3ff91c4a88"} Mar 18 12:48:00 crc kubenswrapper[4975]: I0318 12:48:00.177909 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563968-jzvj4"] Mar 18 12:48:00 crc kubenswrapper[4975]: I0318 12:48:00.180310 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563968-jzvj4" Mar 18 12:48:00 crc kubenswrapper[4975]: I0318 12:48:00.183023 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:48:00 crc kubenswrapper[4975]: I0318 12:48:00.183258 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:48:00 crc kubenswrapper[4975]: I0318 12:48:00.184919 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 12:48:00 crc kubenswrapper[4975]: I0318 12:48:00.192101 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563968-jzvj4"] Mar 18 12:48:00 crc kubenswrapper[4975]: I0318 12:48:00.262543 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7d29\" (UniqueName: \"kubernetes.io/projected/d0f0cc91-cbf4-40dd-aef8-e75eac7fe944-kube-api-access-c7d29\") pod \"auto-csr-approver-29563968-jzvj4\" (UID: \"d0f0cc91-cbf4-40dd-aef8-e75eac7fe944\") " pod="openshift-infra/auto-csr-approver-29563968-jzvj4" Mar 18 12:48:00 crc kubenswrapper[4975]: I0318 12:48:00.364045 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7d29\" (UniqueName: \"kubernetes.io/projected/d0f0cc91-cbf4-40dd-aef8-e75eac7fe944-kube-api-access-c7d29\") pod \"auto-csr-approver-29563968-jzvj4\" (UID: \"d0f0cc91-cbf4-40dd-aef8-e75eac7fe944\") " pod="openshift-infra/auto-csr-approver-29563968-jzvj4" Mar 18 12:48:00 crc kubenswrapper[4975]: I0318 12:48:00.380057 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7d29\" (UniqueName: \"kubernetes.io/projected/d0f0cc91-cbf4-40dd-aef8-e75eac7fe944-kube-api-access-c7d29\") pod \"auto-csr-approver-29563968-jzvj4\" (UID: \"d0f0cc91-cbf4-40dd-aef8-e75eac7fe944\") " 
pod="openshift-infra/auto-csr-approver-29563968-jzvj4" Mar 18 12:48:00 crc kubenswrapper[4975]: I0318 12:48:00.502598 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563968-jzvj4" Mar 18 12:48:00 crc kubenswrapper[4975]: I0318 12:48:00.615882 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6qcs" event={"ID":"7508606e-78ed-475c-9386-b1cc11127fb1","Type":"ContainerStarted","Data":"f3c81e9c7eff4d01985f28a00b90f1ca91ecd645d22c91433b00e56f41cf46b2"} Mar 18 12:48:00 crc kubenswrapper[4975]: I0318 12:48:00.639200 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6qcs" podStartSLOduration=2.471558515 podStartE2EDuration="2.639168662s" podCreationTimestamp="2026-03-18 12:47:58 +0000 UTC" firstStartedPulling="2026-03-18 12:47:59.510670725 +0000 UTC m=+2265.225071304" lastFinishedPulling="2026-03-18 12:47:59.678280872 +0000 UTC m=+2265.392681451" observedRunningTime="2026-03-18 12:48:00.635416209 +0000 UTC m=+2266.349816788" watchObservedRunningTime="2026-03-18 12:48:00.639168662 +0000 UTC m=+2266.353569241" Mar 18 12:48:00 crc kubenswrapper[4975]: I0318 12:48:00.971997 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563968-jzvj4"] Mar 18 12:48:00 crc kubenswrapper[4975]: W0318 12:48:00.992090 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0f0cc91_cbf4_40dd_aef8_e75eac7fe944.slice/crio-04abb9378fc30c6b88495f1f7523e4298d5b202f988553e8ef992085ec275b87 WatchSource:0}: Error finding container 04abb9378fc30c6b88495f1f7523e4298d5b202f988553e8ef992085ec275b87: Status 404 returned error can't find the container with id 04abb9378fc30c6b88495f1f7523e4298d5b202f988553e8ef992085ec275b87 Mar 18 12:48:01 crc kubenswrapper[4975]: I0318 
12:48:01.627253 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563968-jzvj4" event={"ID":"d0f0cc91-cbf4-40dd-aef8-e75eac7fe944","Type":"ContainerStarted","Data":"04abb9378fc30c6b88495f1f7523e4298d5b202f988553e8ef992085ec275b87"} Mar 18 12:48:02 crc kubenswrapper[4975]: I0318 12:48:02.638484 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563968-jzvj4" event={"ID":"d0f0cc91-cbf4-40dd-aef8-e75eac7fe944","Type":"ContainerStarted","Data":"f51893106ca186c5877fb52ab49d03caa19bc36ffb0ac3f34440efab822c6e40"} Mar 18 12:48:02 crc kubenswrapper[4975]: I0318 12:48:02.667614 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563968-jzvj4" podStartSLOduration=1.4969324419999999 podStartE2EDuration="2.667583285s" podCreationTimestamp="2026-03-18 12:48:00 +0000 UTC" firstStartedPulling="2026-03-18 12:48:00.994738222 +0000 UTC m=+2266.709138801" lastFinishedPulling="2026-03-18 12:48:02.165389065 +0000 UTC m=+2267.879789644" observedRunningTime="2026-03-18 12:48:02.654498436 +0000 UTC m=+2268.368899025" watchObservedRunningTime="2026-03-18 12:48:02.667583285 +0000 UTC m=+2268.381983884" Mar 18 12:48:03 crc kubenswrapper[4975]: I0318 12:48:03.649562 4975 generic.go:334] "Generic (PLEG): container finished" podID="d0f0cc91-cbf4-40dd-aef8-e75eac7fe944" containerID="f51893106ca186c5877fb52ab49d03caa19bc36ffb0ac3f34440efab822c6e40" exitCode=0 Mar 18 12:48:03 crc kubenswrapper[4975]: I0318 12:48:03.649655 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563968-jzvj4" event={"ID":"d0f0cc91-cbf4-40dd-aef8-e75eac7fe944","Type":"ContainerDied","Data":"f51893106ca186c5877fb52ab49d03caa19bc36ffb0ac3f34440efab822c6e40"} Mar 18 12:48:05 crc kubenswrapper[4975]: I0318 12:48:05.035704 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563968-jzvj4" Mar 18 12:48:05 crc kubenswrapper[4975]: I0318 12:48:05.075444 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7d29\" (UniqueName: \"kubernetes.io/projected/d0f0cc91-cbf4-40dd-aef8-e75eac7fe944-kube-api-access-c7d29\") pod \"d0f0cc91-cbf4-40dd-aef8-e75eac7fe944\" (UID: \"d0f0cc91-cbf4-40dd-aef8-e75eac7fe944\") " Mar 18 12:48:05 crc kubenswrapper[4975]: I0318 12:48:05.082722 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0f0cc91-cbf4-40dd-aef8-e75eac7fe944-kube-api-access-c7d29" (OuterVolumeSpecName: "kube-api-access-c7d29") pod "d0f0cc91-cbf4-40dd-aef8-e75eac7fe944" (UID: "d0f0cc91-cbf4-40dd-aef8-e75eac7fe944"). InnerVolumeSpecName "kube-api-access-c7d29". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:48:05 crc kubenswrapper[4975]: I0318 12:48:05.178164 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7d29\" (UniqueName: \"kubernetes.io/projected/d0f0cc91-cbf4-40dd-aef8-e75eac7fe944-kube-api-access-c7d29\") on node \"crc\" DevicePath \"\"" Mar 18 12:48:05 crc kubenswrapper[4975]: I0318 12:48:05.670722 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563968-jzvj4" event={"ID":"d0f0cc91-cbf4-40dd-aef8-e75eac7fe944","Type":"ContainerDied","Data":"04abb9378fc30c6b88495f1f7523e4298d5b202f988553e8ef992085ec275b87"} Mar 18 12:48:05 crc kubenswrapper[4975]: I0318 12:48:05.671127 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04abb9378fc30c6b88495f1f7523e4298d5b202f988553e8ef992085ec275b87" Mar 18 12:48:05 crc kubenswrapper[4975]: I0318 12:48:05.670791 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563968-jzvj4" Mar 18 12:48:05 crc kubenswrapper[4975]: I0318 12:48:05.744160 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563962-jg5vd"] Mar 18 12:48:05 crc kubenswrapper[4975]: I0318 12:48:05.751290 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563962-jg5vd"] Mar 18 12:48:07 crc kubenswrapper[4975]: I0318 12:48:07.030207 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e80a4707-2d9f-43da-864c-c970bee90b18" path="/var/lib/kubelet/pods/e80a4707-2d9f-43da-864c-c970bee90b18/volumes" Mar 18 12:48:08 crc kubenswrapper[4975]: I0318 12:48:08.697939 4975 generic.go:334] "Generic (PLEG): container finished" podID="7508606e-78ed-475c-9386-b1cc11127fb1" containerID="f3c81e9c7eff4d01985f28a00b90f1ca91ecd645d22c91433b00e56f41cf46b2" exitCode=0 Mar 18 12:48:08 crc kubenswrapper[4975]: I0318 12:48:08.698069 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6qcs" event={"ID":"7508606e-78ed-475c-9386-b1cc11127fb1","Type":"ContainerDied","Data":"f3c81e9c7eff4d01985f28a00b90f1ca91ecd645d22c91433b00e56f41cf46b2"} Mar 18 12:48:09 crc kubenswrapper[4975]: I0318 12:48:09.016472 4975 scope.go:117] "RemoveContainer" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3" Mar 18 12:48:09 crc kubenswrapper[4975]: E0318 12:48:09.016938 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:48:10 crc kubenswrapper[4975]: I0318 
12:48:10.236177 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6qcs" Mar 18 12:48:10 crc kubenswrapper[4975]: I0318 12:48:10.297379 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7508606e-78ed-475c-9386-b1cc11127fb1-inventory\") pod \"7508606e-78ed-475c-9386-b1cc11127fb1\" (UID: \"7508606e-78ed-475c-9386-b1cc11127fb1\") " Mar 18 12:48:10 crc kubenswrapper[4975]: I0318 12:48:10.297468 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7508606e-78ed-475c-9386-b1cc11127fb1-ssh-key-openstack-edpm-ipam\") pod \"7508606e-78ed-475c-9386-b1cc11127fb1\" (UID: \"7508606e-78ed-475c-9386-b1cc11127fb1\") " Mar 18 12:48:10 crc kubenswrapper[4975]: I0318 12:48:10.297670 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dskv\" (UniqueName: \"kubernetes.io/projected/7508606e-78ed-475c-9386-b1cc11127fb1-kube-api-access-8dskv\") pod \"7508606e-78ed-475c-9386-b1cc11127fb1\" (UID: \"7508606e-78ed-475c-9386-b1cc11127fb1\") " Mar 18 12:48:10 crc kubenswrapper[4975]: I0318 12:48:10.310952 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7508606e-78ed-475c-9386-b1cc11127fb1-kube-api-access-8dskv" (OuterVolumeSpecName: "kube-api-access-8dskv") pod "7508606e-78ed-475c-9386-b1cc11127fb1" (UID: "7508606e-78ed-475c-9386-b1cc11127fb1"). InnerVolumeSpecName "kube-api-access-8dskv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:48:10 crc kubenswrapper[4975]: I0318 12:48:10.349578 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7508606e-78ed-475c-9386-b1cc11127fb1-inventory" (OuterVolumeSpecName: "inventory") pod "7508606e-78ed-475c-9386-b1cc11127fb1" (UID: "7508606e-78ed-475c-9386-b1cc11127fb1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:48:10 crc kubenswrapper[4975]: I0318 12:48:10.370158 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7508606e-78ed-475c-9386-b1cc11127fb1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7508606e-78ed-475c-9386-b1cc11127fb1" (UID: "7508606e-78ed-475c-9386-b1cc11127fb1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:48:10 crc kubenswrapper[4975]: I0318 12:48:10.402194 4975 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7508606e-78ed-475c-9386-b1cc11127fb1-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 12:48:10 crc kubenswrapper[4975]: I0318 12:48:10.402248 4975 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7508606e-78ed-475c-9386-b1cc11127fb1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:48:10 crc kubenswrapper[4975]: I0318 12:48:10.402263 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dskv\" (UniqueName: \"kubernetes.io/projected/7508606e-78ed-475c-9386-b1cc11127fb1-kube-api-access-8dskv\") on node \"crc\" DevicePath \"\"" Mar 18 12:48:10 crc kubenswrapper[4975]: I0318 12:48:10.732697 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6qcs" 
event={"ID":"7508606e-78ed-475c-9386-b1cc11127fb1","Type":"ContainerDied","Data":"d0375d052847c8d209c420251f997d41b47b18b23f42dd1780895e3ff91c4a88"}
Mar 18 12:48:10 crc kubenswrapper[4975]: I0318 12:48:10.732750 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0375d052847c8d209c420251f997d41b47b18b23f42dd1780895e3ff91c4a88"
Mar 18 12:48:10 crc kubenswrapper[4975]: I0318 12:48:10.733067 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6qcs"
Mar 18 12:48:11 crc kubenswrapper[4975]: I0318 12:48:11.035192 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb"]
Mar 18 12:48:11 crc kubenswrapper[4975]: E0318 12:48:11.035578 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f0cc91-cbf4-40dd-aef8-e75eac7fe944" containerName="oc"
Mar 18 12:48:11 crc kubenswrapper[4975]: I0318 12:48:11.035601 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f0cc91-cbf4-40dd-aef8-e75eac7fe944" containerName="oc"
Mar 18 12:48:11 crc kubenswrapper[4975]: E0318 12:48:11.035630 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7508606e-78ed-475c-9386-b1cc11127fb1" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 18 12:48:11 crc kubenswrapper[4975]: I0318 12:48:11.035640 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="7508606e-78ed-475c-9386-b1cc11127fb1" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 18 12:48:11 crc kubenswrapper[4975]: I0318 12:48:11.035922 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0f0cc91-cbf4-40dd-aef8-e75eac7fe944" containerName="oc"
Mar 18 12:48:11 crc kubenswrapper[4975]: I0318 12:48:11.035961 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="7508606e-78ed-475c-9386-b1cc11127fb1" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 18 12:48:11 crc kubenswrapper[4975]: I0318 12:48:11.036638 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb"]
Mar 18 12:48:11 crc kubenswrapper[4975]: I0318 12:48:11.036729 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb"
Mar 18 12:48:11 crc kubenswrapper[4975]: I0318 12:48:11.041405 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 18 12:48:11 crc kubenswrapper[4975]: I0318 12:48:11.041709 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 18 12:48:11 crc kubenswrapper[4975]: I0318 12:48:11.055791 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-77rz6"
Mar 18 12:48:11 crc kubenswrapper[4975]: I0318 12:48:11.056304 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 18 12:48:11 crc kubenswrapper[4975]: I0318 12:48:11.116512 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bddaeca5-c768-4480-96cd-ef43fd303bc8-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb\" (UID: \"bddaeca5-c768-4480-96cd-ef43fd303bc8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb"
Mar 18 12:48:11 crc kubenswrapper[4975]: I0318 12:48:11.116581 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bddaeca5-c768-4480-96cd-ef43fd303bc8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb\" (UID: \"bddaeca5-c768-4480-96cd-ef43fd303bc8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb"
Mar 18 12:48:11 crc kubenswrapper[4975]: I0318 12:48:11.116606 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvk72\" (UniqueName: \"kubernetes.io/projected/bddaeca5-c768-4480-96cd-ef43fd303bc8-kube-api-access-tvk72\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb\" (UID: \"bddaeca5-c768-4480-96cd-ef43fd303bc8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb"
Mar 18 12:48:11 crc kubenswrapper[4975]: I0318 12:48:11.218348 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvk72\" (UniqueName: \"kubernetes.io/projected/bddaeca5-c768-4480-96cd-ef43fd303bc8-kube-api-access-tvk72\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb\" (UID: \"bddaeca5-c768-4480-96cd-ef43fd303bc8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb"
Mar 18 12:48:11 crc kubenswrapper[4975]: I0318 12:48:11.218960 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bddaeca5-c768-4480-96cd-ef43fd303bc8-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb\" (UID: \"bddaeca5-c768-4480-96cd-ef43fd303bc8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb"
Mar 18 12:48:11 crc kubenswrapper[4975]: I0318 12:48:11.219029 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bddaeca5-c768-4480-96cd-ef43fd303bc8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb\" (UID: \"bddaeca5-c768-4480-96cd-ef43fd303bc8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb"
Mar 18 12:48:11 crc kubenswrapper[4975]: I0318 12:48:11.223771 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bddaeca5-c768-4480-96cd-ef43fd303bc8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb\" (UID: \"bddaeca5-c768-4480-96cd-ef43fd303bc8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb"
Mar 18 12:48:11 crc kubenswrapper[4975]: I0318 12:48:11.225531 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bddaeca5-c768-4480-96cd-ef43fd303bc8-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb\" (UID: \"bddaeca5-c768-4480-96cd-ef43fd303bc8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb"
Mar 18 12:48:11 crc kubenswrapper[4975]: I0318 12:48:11.251481 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvk72\" (UniqueName: \"kubernetes.io/projected/bddaeca5-c768-4480-96cd-ef43fd303bc8-kube-api-access-tvk72\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb\" (UID: \"bddaeca5-c768-4480-96cd-ef43fd303bc8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb"
Mar 18 12:48:11 crc kubenswrapper[4975]: I0318 12:48:11.364381 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb"
Mar 18 12:48:11 crc kubenswrapper[4975]: I0318 12:48:11.969216 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb"]
Mar 18 12:48:12 crc kubenswrapper[4975]: I0318 12:48:12.787898 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb" event={"ID":"bddaeca5-c768-4480-96cd-ef43fd303bc8","Type":"ContainerStarted","Data":"a4d1b0e212baa45576a257a0538cae4fa13de63a714c64acd5782255c6f09904"}
Mar 18 12:48:12 crc kubenswrapper[4975]: I0318 12:48:12.788421 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb" event={"ID":"bddaeca5-c768-4480-96cd-ef43fd303bc8","Type":"ContainerStarted","Data":"1922e8e4959044414284ab82a5d4eb3a92ad411cf97defc36275e4d9b0787414"}
Mar 18 12:48:12 crc kubenswrapper[4975]: I0318 12:48:12.810289 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb" podStartSLOduration=2.618534904 podStartE2EDuration="2.810271192s" podCreationTimestamp="2026-03-18 12:48:10 +0000 UTC" firstStartedPulling="2026-03-18 12:48:11.99335604 +0000 UTC m=+2277.707756619" lastFinishedPulling="2026-03-18 12:48:12.185092288 +0000 UTC m=+2277.899492907" observedRunningTime="2026-03-18 12:48:12.803773984 +0000 UTC m=+2278.518174563" watchObservedRunningTime="2026-03-18 12:48:12.810271192 +0000 UTC m=+2278.524671771"
Mar 18 12:48:22 crc kubenswrapper[4975]: I0318 12:48:22.899368 4975 generic.go:334] "Generic (PLEG): container finished" podID="bddaeca5-c768-4480-96cd-ef43fd303bc8" containerID="a4d1b0e212baa45576a257a0538cae4fa13de63a714c64acd5782255c6f09904" exitCode=0
Mar 18 12:48:22 crc kubenswrapper[4975]: I0318 12:48:22.899427 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb" event={"ID":"bddaeca5-c768-4480-96cd-ef43fd303bc8","Type":"ContainerDied","Data":"a4d1b0e212baa45576a257a0538cae4fa13de63a714c64acd5782255c6f09904"}
Mar 18 12:48:24 crc kubenswrapper[4975]: I0318 12:48:24.017522 4975 scope.go:117] "RemoveContainer" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3"
Mar 18 12:48:24 crc kubenswrapper[4975]: E0318 12:48:24.018128 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 12:48:24 crc kubenswrapper[4975]: I0318 12:48:24.392762 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb"
Mar 18 12:48:24 crc kubenswrapper[4975]: I0318 12:48:24.512186 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bddaeca5-c768-4480-96cd-ef43fd303bc8-ssh-key-openstack-edpm-ipam\") pod \"bddaeca5-c768-4480-96cd-ef43fd303bc8\" (UID: \"bddaeca5-c768-4480-96cd-ef43fd303bc8\") "
Mar 18 12:48:24 crc kubenswrapper[4975]: I0318 12:48:24.512449 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvk72\" (UniqueName: \"kubernetes.io/projected/bddaeca5-c768-4480-96cd-ef43fd303bc8-kube-api-access-tvk72\") pod \"bddaeca5-c768-4480-96cd-ef43fd303bc8\" (UID: \"bddaeca5-c768-4480-96cd-ef43fd303bc8\") "
Mar 18 12:48:24 crc kubenswrapper[4975]: I0318 12:48:24.512490 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bddaeca5-c768-4480-96cd-ef43fd303bc8-inventory\") pod \"bddaeca5-c768-4480-96cd-ef43fd303bc8\" (UID: \"bddaeca5-c768-4480-96cd-ef43fd303bc8\") "
Mar 18 12:48:24 crc kubenswrapper[4975]: I0318 12:48:24.518326 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bddaeca5-c768-4480-96cd-ef43fd303bc8-kube-api-access-tvk72" (OuterVolumeSpecName: "kube-api-access-tvk72") pod "bddaeca5-c768-4480-96cd-ef43fd303bc8" (UID: "bddaeca5-c768-4480-96cd-ef43fd303bc8"). InnerVolumeSpecName "kube-api-access-tvk72". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:48:24 crc kubenswrapper[4975]: I0318 12:48:24.543218 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bddaeca5-c768-4480-96cd-ef43fd303bc8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bddaeca5-c768-4480-96cd-ef43fd303bc8" (UID: "bddaeca5-c768-4480-96cd-ef43fd303bc8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:48:24 crc kubenswrapper[4975]: I0318 12:48:24.556464 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bddaeca5-c768-4480-96cd-ef43fd303bc8-inventory" (OuterVolumeSpecName: "inventory") pod "bddaeca5-c768-4480-96cd-ef43fd303bc8" (UID: "bddaeca5-c768-4480-96cd-ef43fd303bc8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:48:24 crc kubenswrapper[4975]: I0318 12:48:24.616099 4975 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bddaeca5-c768-4480-96cd-ef43fd303bc8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 18 12:48:24 crc kubenswrapper[4975]: I0318 12:48:24.616159 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvk72\" (UniqueName: \"kubernetes.io/projected/bddaeca5-c768-4480-96cd-ef43fd303bc8-kube-api-access-tvk72\") on node \"crc\" DevicePath \"\""
Mar 18 12:48:24 crc kubenswrapper[4975]: I0318 12:48:24.616175 4975 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bddaeca5-c768-4480-96cd-ef43fd303bc8-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 12:48:24 crc kubenswrapper[4975]: I0318 12:48:24.934507 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb" event={"ID":"bddaeca5-c768-4480-96cd-ef43fd303bc8","Type":"ContainerDied","Data":"1922e8e4959044414284ab82a5d4eb3a92ad411cf97defc36275e4d9b0787414"}
Mar 18 12:48:24 crc kubenswrapper[4975]: I0318 12:48:24.934629 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1922e8e4959044414284ab82a5d4eb3a92ad411cf97defc36275e4d9b0787414"
Mar 18 12:48:24 crc kubenswrapper[4975]: I0318 12:48:24.934744 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.050396 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"]
Mar 18 12:48:25 crc kubenswrapper[4975]: E0318 12:48:25.050983 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bddaeca5-c768-4480-96cd-ef43fd303bc8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.051009 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="bddaeca5-c768-4480-96cd-ef43fd303bc8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.051355 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="bddaeca5-c768-4480-96cd-ef43fd303bc8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.052352 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.057493 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.057594 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-77rz6"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.058386 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.058774 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.058797 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.058922 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.058958 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.059177 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.066933 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"]
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.132963 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxn9b\" (UniqueName: \"kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-kube-api-access-xxn9b\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.133049 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.133091 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.133227 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.133344 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.133423 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.133462 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.133493 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.133528 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.133636 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.133783 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.133835 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.133913 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.133945 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.234847 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.234987 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.235019 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.235514 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.235562 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.235601 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxn9b\" (UniqueName: \"kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-kube-api-access-xxn9b\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.235634 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.235666 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.235714 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.235763 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.235801 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.235829 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.235855 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.235900 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.239442 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.240325 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.240704 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.241124 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.241124 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.241768 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.241867 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.242227 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.250688 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.251299 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.251558 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.251671 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.252283 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.254376 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxn9b\" (UniqueName: \"kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-kube-api-access-xxn9b\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.374086 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.762761 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt"]
Mar 18 12:48:25 crc kubenswrapper[4975]: I0318 12:48:25.952203 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt" event={"ID":"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8","Type":"ContainerStarted","Data":"73ae6b00527db6c7f1a099b7155944509ffec5319a0a1e47b858b1a6820dee64"}
Mar 18 12:48:26 crc kubenswrapper[4975]: I0318 12:48:26.962197 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt" event={"ID":"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8","Type":"ContainerStarted","Data":"160e4d3338af3b4e93a17fd52a420f74c26f130d92654bc31935916a8b13ae51"}
Mar 18 12:48:26 crc kubenswrapper[4975]: I0318 12:48:26.987331 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt" podStartSLOduration=1.792168937 podStartE2EDuration="1.987309869s" podCreationTimestamp="2026-03-18 12:48:25 +0000 UTC" firstStartedPulling="2026-03-18 12:48:25.771161789 +0000 UTC m=+2291.485562378" lastFinishedPulling="2026-03-18 12:48:25.966302701 +0000 UTC m=+2291.680703310" observedRunningTime="2026-03-18 12:48:26.983070753 +0000 UTC m=+2292.697471342" watchObservedRunningTime="2026-03-18 12:48:26.987309869 +0000 UTC m=+2292.701710448"
Mar 18 12:48:39 crc kubenswrapper[4975]: I0318 12:48:39.017564 4975 scope.go:117] "RemoveContainer" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3"
Mar 18 12:48:39 crc kubenswrapper[4975]: E0318 12:48:39.018933 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 12:48:47 crc kubenswrapper[4975]: I0318 12:48:47.288089 4975 scope.go:117] "RemoveContainer" containerID="14c2361ca3b33c8d60d613e9c1f4341986a62e181b9a187575e93598feda6da2"
Mar 18 12:48:51 crc kubenswrapper[4975]: I0318 12:48:51.017823 4975 scope.go:117] "RemoveContainer" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3"
Mar 18 12:48:51 crc kubenswrapper[4975]: E0318 12:48:51.019842 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 12:49:03 crc kubenswrapper[4975]: I0318 12:49:03.017159 4975 scope.go:117] "RemoveContainer" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3"
Mar 18 12:49:03 crc kubenswrapper[4975]: E0318 12:49:03.017907 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\""
pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:49:03 crc kubenswrapper[4975]: I0318 12:49:03.349848 4975 generic.go:334] "Generic (PLEG): container finished" podID="8a6f8f80-45d5-428b-ae5f-0b770f0aefd8" containerID="160e4d3338af3b4e93a17fd52a420f74c26f130d92654bc31935916a8b13ae51" exitCode=0 Mar 18 12:49:03 crc kubenswrapper[4975]: I0318 12:49:03.349925 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt" event={"ID":"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8","Type":"ContainerDied","Data":"160e4d3338af3b4e93a17fd52a420f74c26f130d92654bc31935916a8b13ae51"} Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.367287 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt" event={"ID":"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8","Type":"ContainerDied","Data":"73ae6b00527db6c7f1a099b7155944509ffec5319a0a1e47b858b1a6820dee64"} Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.367911 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73ae6b00527db6c7f1a099b7155944509ffec5319a0a1e47b858b1a6820dee64" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.373305 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.385804 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-libvirt-combined-ca-bundle\") pod \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.385919 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-ssh-key-openstack-edpm-ipam\") pod \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.386060 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-ovn-combined-ca-bundle\") pod \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.386105 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-telemetry-combined-ca-bundle\") pod \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.386195 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-repo-setup-combined-ca-bundle\") pod \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " Mar 18 12:49:05 crc 
kubenswrapper[4975]: I0318 12:49:05.386283 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-neutron-metadata-combined-ca-bundle\") pod \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.386353 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-inventory\") pod \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.386405 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxn9b\" (UniqueName: \"kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-kube-api-access-xxn9b\") pod \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.386511 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-nova-combined-ca-bundle\") pod \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.386537 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.386594 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.386675 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.386723 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.386783 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-bootstrap-combined-ca-bundle\") pod \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\" (UID: \"8a6f8f80-45d5-428b-ae5f-0b770f0aefd8\") " Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.432052 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8" (UID: "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.433186 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-kube-api-access-xxn9b" (OuterVolumeSpecName: "kube-api-access-xxn9b") pod "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8" (UID: "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8"). InnerVolumeSpecName "kube-api-access-xxn9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.433274 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8" (UID: "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.438226 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8" (UID: "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.439536 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8" (UID: "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.442750 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8" (UID: "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.443275 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8" (UID: "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.443285 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8" (UID: "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.443551 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8" (UID: "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8"). 
InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.447184 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8" (UID: "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.450736 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8" (UID: "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.462423 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-inventory" (OuterVolumeSpecName: "inventory") pod "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8" (UID: "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.465084 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8" (UID: "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.467737 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8" (UID: "8a6f8f80-45d5-428b-ae5f-0b770f0aefd8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.491344 4975 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.491382 4975 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.491394 4975 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.491405 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxn9b\" (UniqueName: \"kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-kube-api-access-xxn9b\") on node \"crc\" DevicePath \"\"" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.491416 4975 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 
12:49:05.491541 4975 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.491598 4975 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.491611 4975 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.491621 4975 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.491634 4975 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.491643 4975 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.491651 4975 reconciler_common.go:293] "Volume detached for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.491660 4975 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:49:05 crc kubenswrapper[4975]: I0318 12:49:05.491669 4975 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6f8f80-45d5-428b-ae5f-0b770f0aefd8-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:49:06 crc kubenswrapper[4975]: I0318 12:49:06.373308 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt" Mar 18 12:49:06 crc kubenswrapper[4975]: I0318 12:49:06.742347 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq"] Mar 18 12:49:06 crc kubenswrapper[4975]: E0318 12:49:06.742827 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a6f8f80-45d5-428b-ae5f-0b770f0aefd8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 18 12:49:06 crc kubenswrapper[4975]: I0318 12:49:06.742844 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a6f8f80-45d5-428b-ae5f-0b770f0aefd8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 18 12:49:06 crc kubenswrapper[4975]: I0318 12:49:06.743140 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a6f8f80-45d5-428b-ae5f-0b770f0aefd8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 18 12:49:06 crc kubenswrapper[4975]: I0318 12:49:06.743893 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq" Mar 18 12:49:06 crc kubenswrapper[4975]: I0318 12:49:06.747689 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:49:06 crc kubenswrapper[4975]: I0318 12:49:06.748057 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-77rz6" Mar 18 12:49:06 crc kubenswrapper[4975]: I0318 12:49:06.748221 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:49:06 crc kubenswrapper[4975]: I0318 12:49:06.748608 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 18 12:49:06 crc kubenswrapper[4975]: I0318 12:49:06.757275 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq"] Mar 18 12:49:06 crc kubenswrapper[4975]: I0318 12:49:06.765398 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:49:06 crc kubenswrapper[4975]: I0318 12:49:06.816601 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4z4n\" (UniqueName: \"kubernetes.io/projected/2603d47f-8543-4cc2-927e-5f7d2ad82acc-kube-api-access-h4z4n\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p6rfq\" (UID: \"2603d47f-8543-4cc2-927e-5f7d2ad82acc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq" Mar 18 12:49:06 crc kubenswrapper[4975]: I0318 12:49:06.816647 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2603d47f-8543-4cc2-927e-5f7d2ad82acc-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p6rfq\" (UID: \"2603d47f-8543-4cc2-927e-5f7d2ad82acc\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq" Mar 18 12:49:06 crc kubenswrapper[4975]: I0318 12:49:06.816680 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2603d47f-8543-4cc2-927e-5f7d2ad82acc-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p6rfq\" (UID: \"2603d47f-8543-4cc2-927e-5f7d2ad82acc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq" Mar 18 12:49:06 crc kubenswrapper[4975]: I0318 12:49:06.816698 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2603d47f-8543-4cc2-927e-5f7d2ad82acc-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p6rfq\" (UID: \"2603d47f-8543-4cc2-927e-5f7d2ad82acc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq" Mar 18 12:49:06 crc kubenswrapper[4975]: I0318 12:49:06.816922 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2603d47f-8543-4cc2-927e-5f7d2ad82acc-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p6rfq\" (UID: \"2603d47f-8543-4cc2-927e-5f7d2ad82acc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq" Mar 18 12:49:06 crc kubenswrapper[4975]: I0318 12:49:06.918867 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4z4n\" (UniqueName: \"kubernetes.io/projected/2603d47f-8543-4cc2-927e-5f7d2ad82acc-kube-api-access-h4z4n\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p6rfq\" (UID: \"2603d47f-8543-4cc2-927e-5f7d2ad82acc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq" Mar 18 12:49:06 crc kubenswrapper[4975]: I0318 12:49:06.918949 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2603d47f-8543-4cc2-927e-5f7d2ad82acc-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p6rfq\" (UID: \"2603d47f-8543-4cc2-927e-5f7d2ad82acc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq" Mar 18 12:49:06 crc kubenswrapper[4975]: I0318 12:49:06.918997 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2603d47f-8543-4cc2-927e-5f7d2ad82acc-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p6rfq\" (UID: \"2603d47f-8543-4cc2-927e-5f7d2ad82acc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq" Mar 18 12:49:06 crc kubenswrapper[4975]: I0318 12:49:06.919019 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2603d47f-8543-4cc2-927e-5f7d2ad82acc-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p6rfq\" (UID: \"2603d47f-8543-4cc2-927e-5f7d2ad82acc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq" Mar 18 12:49:06 crc kubenswrapper[4975]: I0318 12:49:06.919090 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2603d47f-8543-4cc2-927e-5f7d2ad82acc-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p6rfq\" (UID: \"2603d47f-8543-4cc2-927e-5f7d2ad82acc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq" Mar 18 12:49:06 crc kubenswrapper[4975]: I0318 12:49:06.920040 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2603d47f-8543-4cc2-927e-5f7d2ad82acc-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p6rfq\" (UID: 
\"2603d47f-8543-4cc2-927e-5f7d2ad82acc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq" Mar 18 12:49:06 crc kubenswrapper[4975]: I0318 12:49:06.924691 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2603d47f-8543-4cc2-927e-5f7d2ad82acc-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p6rfq\" (UID: \"2603d47f-8543-4cc2-927e-5f7d2ad82acc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq" Mar 18 12:49:06 crc kubenswrapper[4975]: I0318 12:49:06.924930 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2603d47f-8543-4cc2-927e-5f7d2ad82acc-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p6rfq\" (UID: \"2603d47f-8543-4cc2-927e-5f7d2ad82acc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq" Mar 18 12:49:06 crc kubenswrapper[4975]: I0318 12:49:06.926481 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2603d47f-8543-4cc2-927e-5f7d2ad82acc-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p6rfq\" (UID: \"2603d47f-8543-4cc2-927e-5f7d2ad82acc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq" Mar 18 12:49:06 crc kubenswrapper[4975]: I0318 12:49:06.941584 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4z4n\" (UniqueName: \"kubernetes.io/projected/2603d47f-8543-4cc2-927e-5f7d2ad82acc-kube-api-access-h4z4n\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p6rfq\" (UID: \"2603d47f-8543-4cc2-927e-5f7d2ad82acc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq" Mar 18 12:49:07 crc kubenswrapper[4975]: I0318 12:49:07.071269 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq" Mar 18 12:49:07 crc kubenswrapper[4975]: I0318 12:49:07.656086 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq"] Mar 18 12:49:08 crc kubenswrapper[4975]: I0318 12:49:08.391981 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq" event={"ID":"2603d47f-8543-4cc2-927e-5f7d2ad82acc","Type":"ContainerStarted","Data":"16e6b498da4c2d480703c31a099d0c25dec672eb2615bf0d6da5e1232b7f7280"} Mar 18 12:49:10 crc kubenswrapper[4975]: I0318 12:49:10.419529 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq" event={"ID":"2603d47f-8543-4cc2-927e-5f7d2ad82acc","Type":"ContainerStarted","Data":"9111f121439e56d506fc4fe5c259980ff59358573d17bc43f86be603a27a49dc"} Mar 18 12:49:10 crc kubenswrapper[4975]: I0318 12:49:10.440297 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq" podStartSLOduration=3.776381164 podStartE2EDuration="4.44026669s" podCreationTimestamp="2026-03-18 12:49:06 +0000 UTC" firstStartedPulling="2026-03-18 12:49:07.661811678 +0000 UTC m=+2333.376212257" lastFinishedPulling="2026-03-18 12:49:08.325697194 +0000 UTC m=+2334.040097783" observedRunningTime="2026-03-18 12:49:10.440094525 +0000 UTC m=+2336.154495114" watchObservedRunningTime="2026-03-18 12:49:10.44026669 +0000 UTC m=+2336.154667269" Mar 18 12:49:15 crc kubenswrapper[4975]: I0318 12:49:15.024737 4975 scope.go:117] "RemoveContainer" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3" Mar 18 12:49:15 crc kubenswrapper[4975]: E0318 12:49:15.025806 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:49:28 crc kubenswrapper[4975]: I0318 12:49:28.016831 4975 scope.go:117] "RemoveContainer" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3" Mar 18 12:49:28 crc kubenswrapper[4975]: E0318 12:49:28.017551 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:49:42 crc kubenswrapper[4975]: I0318 12:49:42.016813 4975 scope.go:117] "RemoveContainer" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3" Mar 18 12:49:42 crc kubenswrapper[4975]: E0318 12:49:42.018135 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:49:54 crc kubenswrapper[4975]: I0318 12:49:54.017142 4975 scope.go:117] "RemoveContainer" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3" Mar 18 12:49:54 crc kubenswrapper[4975]: E0318 12:49:54.019693 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:50:00 crc kubenswrapper[4975]: I0318 12:50:00.162848 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563970-c77tz"] Mar 18 12:50:00 crc kubenswrapper[4975]: I0318 12:50:00.165106 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563970-c77tz" Mar 18 12:50:00 crc kubenswrapper[4975]: I0318 12:50:00.170598 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 12:50:00 crc kubenswrapper[4975]: I0318 12:50:00.171817 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:50:00 crc kubenswrapper[4975]: I0318 12:50:00.172153 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:50:00 crc kubenswrapper[4975]: I0318 12:50:00.180503 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563970-c77tz"] Mar 18 12:50:00 crc kubenswrapper[4975]: I0318 12:50:00.227023 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djn8f\" (UniqueName: \"kubernetes.io/projected/3d98388e-202d-4034-bd27-b8e4b4ebfebc-kube-api-access-djn8f\") pod \"auto-csr-approver-29563970-c77tz\" (UID: \"3d98388e-202d-4034-bd27-b8e4b4ebfebc\") " pod="openshift-infra/auto-csr-approver-29563970-c77tz" Mar 18 12:50:00 crc kubenswrapper[4975]: I0318 12:50:00.329255 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djn8f\" (UniqueName: 
\"kubernetes.io/projected/3d98388e-202d-4034-bd27-b8e4b4ebfebc-kube-api-access-djn8f\") pod \"auto-csr-approver-29563970-c77tz\" (UID: \"3d98388e-202d-4034-bd27-b8e4b4ebfebc\") " pod="openshift-infra/auto-csr-approver-29563970-c77tz" Mar 18 12:50:00 crc kubenswrapper[4975]: I0318 12:50:00.356609 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djn8f\" (UniqueName: \"kubernetes.io/projected/3d98388e-202d-4034-bd27-b8e4b4ebfebc-kube-api-access-djn8f\") pod \"auto-csr-approver-29563970-c77tz\" (UID: \"3d98388e-202d-4034-bd27-b8e4b4ebfebc\") " pod="openshift-infra/auto-csr-approver-29563970-c77tz" Mar 18 12:50:00 crc kubenswrapper[4975]: I0318 12:50:00.527027 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563970-c77tz" Mar 18 12:50:01 crc kubenswrapper[4975]: I0318 12:50:01.034542 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563970-c77tz"] Mar 18 12:50:01 crc kubenswrapper[4975]: I0318 12:50:01.919704 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563970-c77tz" event={"ID":"3d98388e-202d-4034-bd27-b8e4b4ebfebc","Type":"ContainerStarted","Data":"c8ebf8073f59f3fdc073276945a3ee6c56df969b6861a6c5247febe2e723be24"} Mar 18 12:50:03 crc kubenswrapper[4975]: I0318 12:50:03.938471 4975 generic.go:334] "Generic (PLEG): container finished" podID="3d98388e-202d-4034-bd27-b8e4b4ebfebc" containerID="b4eacdbd7bacb7dc4379f35d520bb5e7d9c4da8bb2ad5589744766acb18c7e36" exitCode=0 Mar 18 12:50:03 crc kubenswrapper[4975]: I0318 12:50:03.938665 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563970-c77tz" event={"ID":"3d98388e-202d-4034-bd27-b8e4b4ebfebc","Type":"ContainerDied","Data":"b4eacdbd7bacb7dc4379f35d520bb5e7d9c4da8bb2ad5589744766acb18c7e36"} Mar 18 12:50:05 crc kubenswrapper[4975]: I0318 12:50:05.322277 4975 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563970-c77tz" Mar 18 12:50:05 crc kubenswrapper[4975]: I0318 12:50:05.442459 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djn8f\" (UniqueName: \"kubernetes.io/projected/3d98388e-202d-4034-bd27-b8e4b4ebfebc-kube-api-access-djn8f\") pod \"3d98388e-202d-4034-bd27-b8e4b4ebfebc\" (UID: \"3d98388e-202d-4034-bd27-b8e4b4ebfebc\") " Mar 18 12:50:05 crc kubenswrapper[4975]: I0318 12:50:05.450945 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d98388e-202d-4034-bd27-b8e4b4ebfebc-kube-api-access-djn8f" (OuterVolumeSpecName: "kube-api-access-djn8f") pod "3d98388e-202d-4034-bd27-b8e4b4ebfebc" (UID: "3d98388e-202d-4034-bd27-b8e4b4ebfebc"). InnerVolumeSpecName "kube-api-access-djn8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:50:05 crc kubenswrapper[4975]: I0318 12:50:05.545884 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djn8f\" (UniqueName: \"kubernetes.io/projected/3d98388e-202d-4034-bd27-b8e4b4ebfebc-kube-api-access-djn8f\") on node \"crc\" DevicePath \"\"" Mar 18 12:50:05 crc kubenswrapper[4975]: I0318 12:50:05.957974 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563970-c77tz" event={"ID":"3d98388e-202d-4034-bd27-b8e4b4ebfebc","Type":"ContainerDied","Data":"c8ebf8073f59f3fdc073276945a3ee6c56df969b6861a6c5247febe2e723be24"} Mar 18 12:50:05 crc kubenswrapper[4975]: I0318 12:50:05.958031 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8ebf8073f59f3fdc073276945a3ee6c56df969b6861a6c5247febe2e723be24" Mar 18 12:50:05 crc kubenswrapper[4975]: I0318 12:50:05.958032 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563970-c77tz" Mar 18 12:50:06 crc kubenswrapper[4975]: I0318 12:50:06.017159 4975 scope.go:117] "RemoveContainer" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3" Mar 18 12:50:06 crc kubenswrapper[4975]: E0318 12:50:06.017552 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:50:06 crc kubenswrapper[4975]: I0318 12:50:06.406836 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563964-sp8fb"] Mar 18 12:50:06 crc kubenswrapper[4975]: I0318 12:50:06.422222 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563964-sp8fb"] Mar 18 12:50:07 crc kubenswrapper[4975]: I0318 12:50:07.028983 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84d3a57d-c7cc-4690-8e5a-f68bc8eb4d1e" path="/var/lib/kubelet/pods/84d3a57d-c7cc-4690-8e5a-f68bc8eb4d1e/volumes" Mar 18 12:50:09 crc kubenswrapper[4975]: I0318 12:50:09.902033 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-khvmx"] Mar 18 12:50:09 crc kubenswrapper[4975]: E0318 12:50:09.902819 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d98388e-202d-4034-bd27-b8e4b4ebfebc" containerName="oc" Mar 18 12:50:09 crc kubenswrapper[4975]: I0318 12:50:09.902836 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d98388e-202d-4034-bd27-b8e4b4ebfebc" containerName="oc" Mar 18 12:50:09 crc kubenswrapper[4975]: I0318 12:50:09.903176 4975 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="3d98388e-202d-4034-bd27-b8e4b4ebfebc" containerName="oc" Mar 18 12:50:09 crc kubenswrapper[4975]: I0318 12:50:09.904951 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-khvmx" Mar 18 12:50:09 crc kubenswrapper[4975]: I0318 12:50:09.916324 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-khvmx"] Mar 18 12:50:10 crc kubenswrapper[4975]: I0318 12:50:10.033534 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/308e5e3a-6b04-4ec2-9b17-d2f1e9509f29-catalog-content\") pod \"community-operators-khvmx\" (UID: \"308e5e3a-6b04-4ec2-9b17-d2f1e9509f29\") " pod="openshift-marketplace/community-operators-khvmx" Mar 18 12:50:10 crc kubenswrapper[4975]: I0318 12:50:10.033629 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk6ft\" (UniqueName: \"kubernetes.io/projected/308e5e3a-6b04-4ec2-9b17-d2f1e9509f29-kube-api-access-wk6ft\") pod \"community-operators-khvmx\" (UID: \"308e5e3a-6b04-4ec2-9b17-d2f1e9509f29\") " pod="openshift-marketplace/community-operators-khvmx" Mar 18 12:50:10 crc kubenswrapper[4975]: I0318 12:50:10.033674 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/308e5e3a-6b04-4ec2-9b17-d2f1e9509f29-utilities\") pod \"community-operators-khvmx\" (UID: \"308e5e3a-6b04-4ec2-9b17-d2f1e9509f29\") " pod="openshift-marketplace/community-operators-khvmx" Mar 18 12:50:10 crc kubenswrapper[4975]: I0318 12:50:10.135257 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/308e5e3a-6b04-4ec2-9b17-d2f1e9509f29-utilities\") pod \"community-operators-khvmx\" (UID: 
\"308e5e3a-6b04-4ec2-9b17-d2f1e9509f29\") " pod="openshift-marketplace/community-operators-khvmx" Mar 18 12:50:10 crc kubenswrapper[4975]: I0318 12:50:10.135713 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/308e5e3a-6b04-4ec2-9b17-d2f1e9509f29-catalog-content\") pod \"community-operators-khvmx\" (UID: \"308e5e3a-6b04-4ec2-9b17-d2f1e9509f29\") " pod="openshift-marketplace/community-operators-khvmx" Mar 18 12:50:10 crc kubenswrapper[4975]: I0318 12:50:10.135879 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk6ft\" (UniqueName: \"kubernetes.io/projected/308e5e3a-6b04-4ec2-9b17-d2f1e9509f29-kube-api-access-wk6ft\") pod \"community-operators-khvmx\" (UID: \"308e5e3a-6b04-4ec2-9b17-d2f1e9509f29\") " pod="openshift-marketplace/community-operators-khvmx" Mar 18 12:50:10 crc kubenswrapper[4975]: I0318 12:50:10.136502 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/308e5e3a-6b04-4ec2-9b17-d2f1e9509f29-catalog-content\") pod \"community-operators-khvmx\" (UID: \"308e5e3a-6b04-4ec2-9b17-d2f1e9509f29\") " pod="openshift-marketplace/community-operators-khvmx" Mar 18 12:50:10 crc kubenswrapper[4975]: I0318 12:50:10.137855 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/308e5e3a-6b04-4ec2-9b17-d2f1e9509f29-utilities\") pod \"community-operators-khvmx\" (UID: \"308e5e3a-6b04-4ec2-9b17-d2f1e9509f29\") " pod="openshift-marketplace/community-operators-khvmx" Mar 18 12:50:10 crc kubenswrapper[4975]: I0318 12:50:10.161358 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk6ft\" (UniqueName: \"kubernetes.io/projected/308e5e3a-6b04-4ec2-9b17-d2f1e9509f29-kube-api-access-wk6ft\") pod \"community-operators-khvmx\" (UID: 
\"308e5e3a-6b04-4ec2-9b17-d2f1e9509f29\") " pod="openshift-marketplace/community-operators-khvmx" Mar 18 12:50:10 crc kubenswrapper[4975]: I0318 12:50:10.261747 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-khvmx" Mar 18 12:50:10 crc kubenswrapper[4975]: I0318 12:50:10.797241 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-khvmx"] Mar 18 12:50:11 crc kubenswrapper[4975]: I0318 12:50:11.000102 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khvmx" event={"ID":"308e5e3a-6b04-4ec2-9b17-d2f1e9509f29","Type":"ContainerStarted","Data":"c038b6f934a7ad9b39da63b481a061789fd86802823f0f85609841385551dca7"} Mar 18 12:50:12 crc kubenswrapper[4975]: I0318 12:50:12.014198 4975 generic.go:334] "Generic (PLEG): container finished" podID="308e5e3a-6b04-4ec2-9b17-d2f1e9509f29" containerID="317e80a04b5ca4d038da7b16a9395e4fb32a78c9e56df4a496da5aa16e739be0" exitCode=0 Mar 18 12:50:12 crc kubenswrapper[4975]: I0318 12:50:12.014250 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khvmx" event={"ID":"308e5e3a-6b04-4ec2-9b17-d2f1e9509f29","Type":"ContainerDied","Data":"317e80a04b5ca4d038da7b16a9395e4fb32a78c9e56df4a496da5aa16e739be0"} Mar 18 12:50:12 crc kubenswrapper[4975]: I0318 12:50:12.095050 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r9jks"] Mar 18 12:50:12 crc kubenswrapper[4975]: I0318 12:50:12.097585 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r9jks" Mar 18 12:50:12 crc kubenswrapper[4975]: I0318 12:50:12.104842 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r9jks"] Mar 18 12:50:12 crc kubenswrapper[4975]: I0318 12:50:12.215410 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c13265c6-9ee4-4911-823e-1219c13f7c68-catalog-content\") pod \"redhat-marketplace-r9jks\" (UID: \"c13265c6-9ee4-4911-823e-1219c13f7c68\") " pod="openshift-marketplace/redhat-marketplace-r9jks" Mar 18 12:50:12 crc kubenswrapper[4975]: I0318 12:50:12.215458 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr9m9\" (UniqueName: \"kubernetes.io/projected/c13265c6-9ee4-4911-823e-1219c13f7c68-kube-api-access-gr9m9\") pod \"redhat-marketplace-r9jks\" (UID: \"c13265c6-9ee4-4911-823e-1219c13f7c68\") " pod="openshift-marketplace/redhat-marketplace-r9jks" Mar 18 12:50:12 crc kubenswrapper[4975]: I0318 12:50:12.215720 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c13265c6-9ee4-4911-823e-1219c13f7c68-utilities\") pod \"redhat-marketplace-r9jks\" (UID: \"c13265c6-9ee4-4911-823e-1219c13f7c68\") " pod="openshift-marketplace/redhat-marketplace-r9jks" Mar 18 12:50:12 crc kubenswrapper[4975]: I0318 12:50:12.317969 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c13265c6-9ee4-4911-823e-1219c13f7c68-catalog-content\") pod \"redhat-marketplace-r9jks\" (UID: \"c13265c6-9ee4-4911-823e-1219c13f7c68\") " pod="openshift-marketplace/redhat-marketplace-r9jks" Mar 18 12:50:12 crc kubenswrapper[4975]: I0318 12:50:12.318011 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gr9m9\" (UniqueName: \"kubernetes.io/projected/c13265c6-9ee4-4911-823e-1219c13f7c68-kube-api-access-gr9m9\") pod \"redhat-marketplace-r9jks\" (UID: \"c13265c6-9ee4-4911-823e-1219c13f7c68\") " pod="openshift-marketplace/redhat-marketplace-r9jks" Mar 18 12:50:12 crc kubenswrapper[4975]: I0318 12:50:12.318079 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c13265c6-9ee4-4911-823e-1219c13f7c68-utilities\") pod \"redhat-marketplace-r9jks\" (UID: \"c13265c6-9ee4-4911-823e-1219c13f7c68\") " pod="openshift-marketplace/redhat-marketplace-r9jks" Mar 18 12:50:12 crc kubenswrapper[4975]: I0318 12:50:12.318571 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c13265c6-9ee4-4911-823e-1219c13f7c68-catalog-content\") pod \"redhat-marketplace-r9jks\" (UID: \"c13265c6-9ee4-4911-823e-1219c13f7c68\") " pod="openshift-marketplace/redhat-marketplace-r9jks" Mar 18 12:50:12 crc kubenswrapper[4975]: I0318 12:50:12.318586 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c13265c6-9ee4-4911-823e-1219c13f7c68-utilities\") pod \"redhat-marketplace-r9jks\" (UID: \"c13265c6-9ee4-4911-823e-1219c13f7c68\") " pod="openshift-marketplace/redhat-marketplace-r9jks" Mar 18 12:50:12 crc kubenswrapper[4975]: I0318 12:50:12.337573 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr9m9\" (UniqueName: \"kubernetes.io/projected/c13265c6-9ee4-4911-823e-1219c13f7c68-kube-api-access-gr9m9\") pod \"redhat-marketplace-r9jks\" (UID: \"c13265c6-9ee4-4911-823e-1219c13f7c68\") " pod="openshift-marketplace/redhat-marketplace-r9jks" Mar 18 12:50:12 crc kubenswrapper[4975]: I0318 12:50:12.473701 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r9jks" Mar 18 12:50:12 crc kubenswrapper[4975]: I0318 12:50:12.943450 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r9jks"] Mar 18 12:50:13 crc kubenswrapper[4975]: I0318 12:50:13.032199 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9jks" event={"ID":"c13265c6-9ee4-4911-823e-1219c13f7c68","Type":"ContainerStarted","Data":"05f46efaf23867d9b0f7c7246f88b9d001a302a5b9eaaae6ba1d06fdbae7bd9e"} Mar 18 12:50:14 crc kubenswrapper[4975]: I0318 12:50:14.041999 4975 generic.go:334] "Generic (PLEG): container finished" podID="308e5e3a-6b04-4ec2-9b17-d2f1e9509f29" containerID="ed73716ae73f942de0f59e0a041fb722fd7c5d9899bb6712bc44253c0dee4bfd" exitCode=0 Mar 18 12:50:14 crc kubenswrapper[4975]: I0318 12:50:14.042566 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khvmx" event={"ID":"308e5e3a-6b04-4ec2-9b17-d2f1e9509f29","Type":"ContainerDied","Data":"ed73716ae73f942de0f59e0a041fb722fd7c5d9899bb6712bc44253c0dee4bfd"} Mar 18 12:50:14 crc kubenswrapper[4975]: I0318 12:50:14.049852 4975 generic.go:334] "Generic (PLEG): container finished" podID="c13265c6-9ee4-4911-823e-1219c13f7c68" containerID="b770bd5530e54ca648900cbabbb44a8352f83a10265f5e5fd1cb74d870fcbacf" exitCode=0 Mar 18 12:50:14 crc kubenswrapper[4975]: I0318 12:50:14.049919 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9jks" event={"ID":"c13265c6-9ee4-4911-823e-1219c13f7c68","Type":"ContainerDied","Data":"b770bd5530e54ca648900cbabbb44a8352f83a10265f5e5fd1cb74d870fcbacf"} Mar 18 12:50:15 crc kubenswrapper[4975]: I0318 12:50:15.094161 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khvmx" 
event={"ID":"308e5e3a-6b04-4ec2-9b17-d2f1e9509f29","Type":"ContainerStarted","Data":"27e823e6cbfa68dc6784331590e67ad726637985bfb61479a47693193f085324"} Mar 18 12:50:15 crc kubenswrapper[4975]: I0318 12:50:15.097635 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9jks" event={"ID":"c13265c6-9ee4-4911-823e-1219c13f7c68","Type":"ContainerStarted","Data":"358141ca99a0dda6f728252850ebf9822b59c4a8871ac055f3911963a752acfd"} Mar 18 12:50:15 crc kubenswrapper[4975]: I0318 12:50:15.122765 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-khvmx" podStartSLOduration=3.663397819 podStartE2EDuration="6.122727289s" podCreationTimestamp="2026-03-18 12:50:09 +0000 UTC" firstStartedPulling="2026-03-18 12:50:12.016621307 +0000 UTC m=+2397.731021886" lastFinishedPulling="2026-03-18 12:50:14.475950777 +0000 UTC m=+2400.190351356" observedRunningTime="2026-03-18 12:50:15.118272017 +0000 UTC m=+2400.832672596" watchObservedRunningTime="2026-03-18 12:50:15.122727289 +0000 UTC m=+2400.837127868" Mar 18 12:50:16 crc kubenswrapper[4975]: I0318 12:50:16.107831 4975 generic.go:334] "Generic (PLEG): container finished" podID="c13265c6-9ee4-4911-823e-1219c13f7c68" containerID="358141ca99a0dda6f728252850ebf9822b59c4a8871ac055f3911963a752acfd" exitCode=0 Mar 18 12:50:16 crc kubenswrapper[4975]: I0318 12:50:16.107975 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9jks" event={"ID":"c13265c6-9ee4-4911-823e-1219c13f7c68","Type":"ContainerDied","Data":"358141ca99a0dda6f728252850ebf9822b59c4a8871ac055f3911963a752acfd"} Mar 18 12:50:17 crc kubenswrapper[4975]: I0318 12:50:17.122637 4975 generic.go:334] "Generic (PLEG): container finished" podID="2603d47f-8543-4cc2-927e-5f7d2ad82acc" containerID="9111f121439e56d506fc4fe5c259980ff59358573d17bc43f86be603a27a49dc" exitCode=0 Mar 18 12:50:17 crc kubenswrapper[4975]: I0318 
12:50:17.122698 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq" event={"ID":"2603d47f-8543-4cc2-927e-5f7d2ad82acc","Type":"ContainerDied","Data":"9111f121439e56d506fc4fe5c259980ff59358573d17bc43f86be603a27a49dc"} Mar 18 12:50:17 crc kubenswrapper[4975]: I0318 12:50:17.126481 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9jks" event={"ID":"c13265c6-9ee4-4911-823e-1219c13f7c68","Type":"ContainerStarted","Data":"f414a486273e0aa7f366813a1222f727bb0fbb0f4ee6e420c33e07f82eac5098"} Mar 18 12:50:17 crc kubenswrapper[4975]: I0318 12:50:17.173560 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r9jks" podStartSLOduration=2.998917367 podStartE2EDuration="5.173533034s" podCreationTimestamp="2026-03-18 12:50:12 +0000 UTC" firstStartedPulling="2026-03-18 12:50:14.053096542 +0000 UTC m=+2399.767497121" lastFinishedPulling="2026-03-18 12:50:16.227712209 +0000 UTC m=+2401.942112788" observedRunningTime="2026-03-18 12:50:17.167270184 +0000 UTC m=+2402.881670763" watchObservedRunningTime="2026-03-18 12:50:17.173533034 +0000 UTC m=+2402.887933623" Mar 18 12:50:18 crc kubenswrapper[4975]: I0318 12:50:18.572877 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq" Mar 18 12:50:18 crc kubenswrapper[4975]: I0318 12:50:18.642177 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2603d47f-8543-4cc2-927e-5f7d2ad82acc-inventory\") pod \"2603d47f-8543-4cc2-927e-5f7d2ad82acc\" (UID: \"2603d47f-8543-4cc2-927e-5f7d2ad82acc\") " Mar 18 12:50:18 crc kubenswrapper[4975]: I0318 12:50:18.642272 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2603d47f-8543-4cc2-927e-5f7d2ad82acc-ovncontroller-config-0\") pod \"2603d47f-8543-4cc2-927e-5f7d2ad82acc\" (UID: \"2603d47f-8543-4cc2-927e-5f7d2ad82acc\") " Mar 18 12:50:18 crc kubenswrapper[4975]: I0318 12:50:18.642306 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4z4n\" (UniqueName: \"kubernetes.io/projected/2603d47f-8543-4cc2-927e-5f7d2ad82acc-kube-api-access-h4z4n\") pod \"2603d47f-8543-4cc2-927e-5f7d2ad82acc\" (UID: \"2603d47f-8543-4cc2-927e-5f7d2ad82acc\") " Mar 18 12:50:18 crc kubenswrapper[4975]: I0318 12:50:18.642440 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2603d47f-8543-4cc2-927e-5f7d2ad82acc-ssh-key-openstack-edpm-ipam\") pod \"2603d47f-8543-4cc2-927e-5f7d2ad82acc\" (UID: \"2603d47f-8543-4cc2-927e-5f7d2ad82acc\") " Mar 18 12:50:18 crc kubenswrapper[4975]: I0318 12:50:18.642515 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2603d47f-8543-4cc2-927e-5f7d2ad82acc-ovn-combined-ca-bundle\") pod \"2603d47f-8543-4cc2-927e-5f7d2ad82acc\" (UID: \"2603d47f-8543-4cc2-927e-5f7d2ad82acc\") " Mar 18 12:50:18 crc kubenswrapper[4975]: I0318 12:50:18.647883 4975 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2603d47f-8543-4cc2-927e-5f7d2ad82acc-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "2603d47f-8543-4cc2-927e-5f7d2ad82acc" (UID: "2603d47f-8543-4cc2-927e-5f7d2ad82acc"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:50:18 crc kubenswrapper[4975]: I0318 12:50:18.654161 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2603d47f-8543-4cc2-927e-5f7d2ad82acc-kube-api-access-h4z4n" (OuterVolumeSpecName: "kube-api-access-h4z4n") pod "2603d47f-8543-4cc2-927e-5f7d2ad82acc" (UID: "2603d47f-8543-4cc2-927e-5f7d2ad82acc"). InnerVolumeSpecName "kube-api-access-h4z4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:50:18 crc kubenswrapper[4975]: I0318 12:50:18.672649 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2603d47f-8543-4cc2-927e-5f7d2ad82acc-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "2603d47f-8543-4cc2-927e-5f7d2ad82acc" (UID: "2603d47f-8543-4cc2-927e-5f7d2ad82acc"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 12:50:18 crc kubenswrapper[4975]: I0318 12:50:18.679450 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2603d47f-8543-4cc2-927e-5f7d2ad82acc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2603d47f-8543-4cc2-927e-5f7d2ad82acc" (UID: "2603d47f-8543-4cc2-927e-5f7d2ad82acc"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:50:18 crc kubenswrapper[4975]: I0318 12:50:18.688133 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2603d47f-8543-4cc2-927e-5f7d2ad82acc-inventory" (OuterVolumeSpecName: "inventory") pod "2603d47f-8543-4cc2-927e-5f7d2ad82acc" (UID: "2603d47f-8543-4cc2-927e-5f7d2ad82acc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:50:18 crc kubenswrapper[4975]: I0318 12:50:18.745528 4975 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2603d47f-8543-4cc2-927e-5f7d2ad82acc-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 12:50:18 crc kubenswrapper[4975]: I0318 12:50:18.745734 4975 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2603d47f-8543-4cc2-927e-5f7d2ad82acc-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:50:18 crc kubenswrapper[4975]: I0318 12:50:18.745799 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4z4n\" (UniqueName: \"kubernetes.io/projected/2603d47f-8543-4cc2-927e-5f7d2ad82acc-kube-api-access-h4z4n\") on node \"crc\" DevicePath \"\"" Mar 18 12:50:18 crc kubenswrapper[4975]: I0318 12:50:18.745918 4975 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2603d47f-8543-4cc2-927e-5f7d2ad82acc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:50:18 crc kubenswrapper[4975]: I0318 12:50:18.745995 4975 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2603d47f-8543-4cc2-927e-5f7d2ad82acc-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.150751 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq" event={"ID":"2603d47f-8543-4cc2-927e-5f7d2ad82acc","Type":"ContainerDied","Data":"16e6b498da4c2d480703c31a099d0c25dec672eb2615bf0d6da5e1232b7f7280"} Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.150789 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p6rfq" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.150801 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16e6b498da4c2d480703c31a099d0c25dec672eb2615bf0d6da5e1232b7f7280" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.270291 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl"] Mar 18 12:50:19 crc kubenswrapper[4975]: E0318 12:50:19.270793 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2603d47f-8543-4cc2-927e-5f7d2ad82acc" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.270821 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="2603d47f-8543-4cc2-927e-5f7d2ad82acc" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.271135 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="2603d47f-8543-4cc2-927e-5f7d2ad82acc" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.271935 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.279350 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.279377 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.279427 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.279342 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-77rz6" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.279347 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.279585 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl"] Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.280081 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.357302 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl\" (UID: \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.357439 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl\" (UID: \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.357489 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk924\" (UniqueName: \"kubernetes.io/projected/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-kube-api-access-qk924\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl\" (UID: \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.357547 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl\" (UID: \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.357588 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl\" (UID: \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.357674 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl\" (UID: \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.460107 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl\" (UID: \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.460247 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl\" (UID: \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.460385 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl\" (UID: \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.460432 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl\" (UID: \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.460913 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl\" (UID: \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.460977 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk924\" (UniqueName: \"kubernetes.io/projected/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-kube-api-access-qk924\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl\" (UID: \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.466693 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl\" (UID: \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.466759 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-neutron-metadata-combined-ca-bundle\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl\" (UID: \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.467014 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl\" (UID: \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.467611 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl\" (UID: \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.470621 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl\" (UID: \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.483421 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk924\" (UniqueName: \"kubernetes.io/projected/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-kube-api-access-qk924\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl\" (UID: \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" Mar 18 12:50:19 crc kubenswrapper[4975]: I0318 12:50:19.607485 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" Mar 18 12:50:20 crc kubenswrapper[4975]: E0318 12:50:20.162486 4975 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc13265c6_9ee4_4911_823e_1219c13f7c68.slice/crio-conmon-358141ca99a0dda6f728252850ebf9822b59c4a8871ac055f3911963a752acfd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc13265c6_9ee4_4911_823e_1219c13f7c68.slice/crio-358141ca99a0dda6f728252850ebf9822b59c4a8871ac055f3911963a752acfd.scope\": RecentStats: unable to find data in memory cache]" Mar 18 12:50:20 crc kubenswrapper[4975]: I0318 12:50:20.221586 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl"] Mar 18 12:50:20 crc kubenswrapper[4975]: I0318 12:50:20.262236 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-khvmx" Mar 18 12:50:20 crc kubenswrapper[4975]: I0318 12:50:20.262300 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-khvmx" Mar 18 12:50:20 crc kubenswrapper[4975]: I0318 12:50:20.328450 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-khvmx" Mar 18 12:50:21 crc kubenswrapper[4975]: I0318 12:50:21.016765 4975 scope.go:117] "RemoveContainer" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3" Mar 18 12:50:21 crc kubenswrapper[4975]: E0318 12:50:21.017056 4975 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:50:21 crc kubenswrapper[4975]: I0318 12:50:21.175333 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" event={"ID":"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac","Type":"ContainerStarted","Data":"1d719c61915725701b627e06aa1fe7629ec26660619a8519bd52371a84a259a5"} Mar 18 12:50:21 crc kubenswrapper[4975]: I0318 12:50:21.235012 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-khvmx" Mar 18 12:50:22 crc kubenswrapper[4975]: I0318 12:50:22.078740 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-khvmx"] Mar 18 12:50:22 crc kubenswrapper[4975]: I0318 12:50:22.188210 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" event={"ID":"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac","Type":"ContainerStarted","Data":"42ec307fd9a8d8e03e44ea5fecb1502e7a83350d7d377a4bfa1588fd981f0391"} Mar 18 12:50:22 crc kubenswrapper[4975]: I0318 12:50:22.220915 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" podStartSLOduration=2.111222411 podStartE2EDuration="3.220887008s" podCreationTimestamp="2026-03-18 12:50:19 +0000 UTC" firstStartedPulling="2026-03-18 12:50:20.224907836 +0000 UTC m=+2405.939308405" lastFinishedPulling="2026-03-18 12:50:21.334572423 +0000 UTC m=+2407.048973002" observedRunningTime="2026-03-18 12:50:22.209710383 +0000 UTC 
m=+2407.924110992" watchObservedRunningTime="2026-03-18 12:50:22.220887008 +0000 UTC m=+2407.935287597" Mar 18 12:50:22 crc kubenswrapper[4975]: I0318 12:50:22.474961 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r9jks" Mar 18 12:50:22 crc kubenswrapper[4975]: I0318 12:50:22.475149 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r9jks" Mar 18 12:50:22 crc kubenswrapper[4975]: I0318 12:50:22.530738 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r9jks" Mar 18 12:50:23 crc kubenswrapper[4975]: I0318 12:50:23.197970 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-khvmx" podUID="308e5e3a-6b04-4ec2-9b17-d2f1e9509f29" containerName="registry-server" containerID="cri-o://27e823e6cbfa68dc6784331590e67ad726637985bfb61479a47693193f085324" gracePeriod=2 Mar 18 12:50:23 crc kubenswrapper[4975]: I0318 12:50:23.263146 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r9jks" Mar 18 12:50:23 crc kubenswrapper[4975]: I0318 12:50:23.715649 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-khvmx" Mar 18 12:50:23 crc kubenswrapper[4975]: I0318 12:50:23.787445 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/308e5e3a-6b04-4ec2-9b17-d2f1e9509f29-utilities\") pod \"308e5e3a-6b04-4ec2-9b17-d2f1e9509f29\" (UID: \"308e5e3a-6b04-4ec2-9b17-d2f1e9509f29\") " Mar 18 12:50:23 crc kubenswrapper[4975]: I0318 12:50:23.787532 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk6ft\" (UniqueName: \"kubernetes.io/projected/308e5e3a-6b04-4ec2-9b17-d2f1e9509f29-kube-api-access-wk6ft\") pod \"308e5e3a-6b04-4ec2-9b17-d2f1e9509f29\" (UID: \"308e5e3a-6b04-4ec2-9b17-d2f1e9509f29\") " Mar 18 12:50:23 crc kubenswrapper[4975]: I0318 12:50:23.787817 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/308e5e3a-6b04-4ec2-9b17-d2f1e9509f29-catalog-content\") pod \"308e5e3a-6b04-4ec2-9b17-d2f1e9509f29\" (UID: \"308e5e3a-6b04-4ec2-9b17-d2f1e9509f29\") " Mar 18 12:50:23 crc kubenswrapper[4975]: I0318 12:50:23.788382 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308e5e3a-6b04-4ec2-9b17-d2f1e9509f29-utilities" (OuterVolumeSpecName: "utilities") pod "308e5e3a-6b04-4ec2-9b17-d2f1e9509f29" (UID: "308e5e3a-6b04-4ec2-9b17-d2f1e9509f29"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:50:23 crc kubenswrapper[4975]: I0318 12:50:23.788537 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/308e5e3a-6b04-4ec2-9b17-d2f1e9509f29-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:50:23 crc kubenswrapper[4975]: I0318 12:50:23.794197 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308e5e3a-6b04-4ec2-9b17-d2f1e9509f29-kube-api-access-wk6ft" (OuterVolumeSpecName: "kube-api-access-wk6ft") pod "308e5e3a-6b04-4ec2-9b17-d2f1e9509f29" (UID: "308e5e3a-6b04-4ec2-9b17-d2f1e9509f29"). InnerVolumeSpecName "kube-api-access-wk6ft". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:50:23 crc kubenswrapper[4975]: I0318 12:50:23.839034 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308e5e3a-6b04-4ec2-9b17-d2f1e9509f29-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "308e5e3a-6b04-4ec2-9b17-d2f1e9509f29" (UID: "308e5e3a-6b04-4ec2-9b17-d2f1e9509f29"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:50:23 crc kubenswrapper[4975]: I0318 12:50:23.890215 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk6ft\" (UniqueName: \"kubernetes.io/projected/308e5e3a-6b04-4ec2-9b17-d2f1e9509f29-kube-api-access-wk6ft\") on node \"crc\" DevicePath \"\"" Mar 18 12:50:23 crc kubenswrapper[4975]: I0318 12:50:23.890248 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/308e5e3a-6b04-4ec2-9b17-d2f1e9509f29-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:50:24 crc kubenswrapper[4975]: I0318 12:50:24.208296 4975 generic.go:334] "Generic (PLEG): container finished" podID="308e5e3a-6b04-4ec2-9b17-d2f1e9509f29" containerID="27e823e6cbfa68dc6784331590e67ad726637985bfb61479a47693193f085324" exitCode=0 Mar 18 12:50:24 crc kubenswrapper[4975]: I0318 12:50:24.208369 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-khvmx" Mar 18 12:50:24 crc kubenswrapper[4975]: I0318 12:50:24.208415 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khvmx" event={"ID":"308e5e3a-6b04-4ec2-9b17-d2f1e9509f29","Type":"ContainerDied","Data":"27e823e6cbfa68dc6784331590e67ad726637985bfb61479a47693193f085324"} Mar 18 12:50:24 crc kubenswrapper[4975]: I0318 12:50:24.208485 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khvmx" event={"ID":"308e5e3a-6b04-4ec2-9b17-d2f1e9509f29","Type":"ContainerDied","Data":"c038b6f934a7ad9b39da63b481a061789fd86802823f0f85609841385551dca7"} Mar 18 12:50:24 crc kubenswrapper[4975]: I0318 12:50:24.208512 4975 scope.go:117] "RemoveContainer" containerID="27e823e6cbfa68dc6784331590e67ad726637985bfb61479a47693193f085324" Mar 18 12:50:24 crc kubenswrapper[4975]: I0318 12:50:24.241313 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-khvmx"] Mar 18 12:50:24 crc kubenswrapper[4975]: I0318 12:50:24.242762 4975 scope.go:117] "RemoveContainer" containerID="ed73716ae73f942de0f59e0a041fb722fd7c5d9899bb6712bc44253c0dee4bfd" Mar 18 12:50:24 crc kubenswrapper[4975]: I0318 12:50:24.251236 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-khvmx"] Mar 18 12:50:24 crc kubenswrapper[4975]: I0318 12:50:24.270480 4975 scope.go:117] "RemoveContainer" containerID="317e80a04b5ca4d038da7b16a9395e4fb32a78c9e56df4a496da5aa16e739be0" Mar 18 12:50:24 crc kubenswrapper[4975]: I0318 12:50:24.315120 4975 scope.go:117] "RemoveContainer" containerID="27e823e6cbfa68dc6784331590e67ad726637985bfb61479a47693193f085324" Mar 18 12:50:24 crc kubenswrapper[4975]: E0318 12:50:24.315582 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27e823e6cbfa68dc6784331590e67ad726637985bfb61479a47693193f085324\": container with ID starting with 27e823e6cbfa68dc6784331590e67ad726637985bfb61479a47693193f085324 not found: ID does not exist" containerID="27e823e6cbfa68dc6784331590e67ad726637985bfb61479a47693193f085324" Mar 18 12:50:24 crc kubenswrapper[4975]: I0318 12:50:24.315621 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27e823e6cbfa68dc6784331590e67ad726637985bfb61479a47693193f085324"} err="failed to get container status \"27e823e6cbfa68dc6784331590e67ad726637985bfb61479a47693193f085324\": rpc error: code = NotFound desc = could not find container \"27e823e6cbfa68dc6784331590e67ad726637985bfb61479a47693193f085324\": container with ID starting with 27e823e6cbfa68dc6784331590e67ad726637985bfb61479a47693193f085324 not found: ID does not exist" Mar 18 12:50:24 crc kubenswrapper[4975]: I0318 12:50:24.315644 4975 scope.go:117] "RemoveContainer" 
containerID="ed73716ae73f942de0f59e0a041fb722fd7c5d9899bb6712bc44253c0dee4bfd" Mar 18 12:50:24 crc kubenswrapper[4975]: E0318 12:50:24.316090 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed73716ae73f942de0f59e0a041fb722fd7c5d9899bb6712bc44253c0dee4bfd\": container with ID starting with ed73716ae73f942de0f59e0a041fb722fd7c5d9899bb6712bc44253c0dee4bfd not found: ID does not exist" containerID="ed73716ae73f942de0f59e0a041fb722fd7c5d9899bb6712bc44253c0dee4bfd" Mar 18 12:50:24 crc kubenswrapper[4975]: I0318 12:50:24.316113 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed73716ae73f942de0f59e0a041fb722fd7c5d9899bb6712bc44253c0dee4bfd"} err="failed to get container status \"ed73716ae73f942de0f59e0a041fb722fd7c5d9899bb6712bc44253c0dee4bfd\": rpc error: code = NotFound desc = could not find container \"ed73716ae73f942de0f59e0a041fb722fd7c5d9899bb6712bc44253c0dee4bfd\": container with ID starting with ed73716ae73f942de0f59e0a041fb722fd7c5d9899bb6712bc44253c0dee4bfd not found: ID does not exist" Mar 18 12:50:24 crc kubenswrapper[4975]: I0318 12:50:24.316126 4975 scope.go:117] "RemoveContainer" containerID="317e80a04b5ca4d038da7b16a9395e4fb32a78c9e56df4a496da5aa16e739be0" Mar 18 12:50:24 crc kubenswrapper[4975]: E0318 12:50:24.316417 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"317e80a04b5ca4d038da7b16a9395e4fb32a78c9e56df4a496da5aa16e739be0\": container with ID starting with 317e80a04b5ca4d038da7b16a9395e4fb32a78c9e56df4a496da5aa16e739be0 not found: ID does not exist" containerID="317e80a04b5ca4d038da7b16a9395e4fb32a78c9e56df4a496da5aa16e739be0" Mar 18 12:50:24 crc kubenswrapper[4975]: I0318 12:50:24.316438 4975 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"317e80a04b5ca4d038da7b16a9395e4fb32a78c9e56df4a496da5aa16e739be0"} err="failed to get container status \"317e80a04b5ca4d038da7b16a9395e4fb32a78c9e56df4a496da5aa16e739be0\": rpc error: code = NotFound desc = could not find container \"317e80a04b5ca4d038da7b16a9395e4fb32a78c9e56df4a496da5aa16e739be0\": container with ID starting with 317e80a04b5ca4d038da7b16a9395e4fb32a78c9e56df4a496da5aa16e739be0 not found: ID does not exist" Mar 18 12:50:24 crc kubenswrapper[4975]: I0318 12:50:24.880814 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r9jks"] Mar 18 12:50:25 crc kubenswrapper[4975]: I0318 12:50:25.032045 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308e5e3a-6b04-4ec2-9b17-d2f1e9509f29" path="/var/lib/kubelet/pods/308e5e3a-6b04-4ec2-9b17-d2f1e9509f29/volumes" Mar 18 12:50:26 crc kubenswrapper[4975]: I0318 12:50:26.227574 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r9jks" podUID="c13265c6-9ee4-4911-823e-1219c13f7c68" containerName="registry-server" containerID="cri-o://f414a486273e0aa7f366813a1222f727bb0fbb0f4ee6e420c33e07f82eac5098" gracePeriod=2 Mar 18 12:50:26 crc kubenswrapper[4975]: I0318 12:50:26.682243 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r9jks" Mar 18 12:50:26 crc kubenswrapper[4975]: I0318 12:50:26.763041 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c13265c6-9ee4-4911-823e-1219c13f7c68-utilities\") pod \"c13265c6-9ee4-4911-823e-1219c13f7c68\" (UID: \"c13265c6-9ee4-4911-823e-1219c13f7c68\") " Mar 18 12:50:26 crc kubenswrapper[4975]: I0318 12:50:26.763278 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr9m9\" (UniqueName: \"kubernetes.io/projected/c13265c6-9ee4-4911-823e-1219c13f7c68-kube-api-access-gr9m9\") pod \"c13265c6-9ee4-4911-823e-1219c13f7c68\" (UID: \"c13265c6-9ee4-4911-823e-1219c13f7c68\") " Mar 18 12:50:26 crc kubenswrapper[4975]: I0318 12:50:26.763377 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c13265c6-9ee4-4911-823e-1219c13f7c68-catalog-content\") pod \"c13265c6-9ee4-4911-823e-1219c13f7c68\" (UID: \"c13265c6-9ee4-4911-823e-1219c13f7c68\") " Mar 18 12:50:26 crc kubenswrapper[4975]: I0318 12:50:26.764172 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c13265c6-9ee4-4911-823e-1219c13f7c68-utilities" (OuterVolumeSpecName: "utilities") pod "c13265c6-9ee4-4911-823e-1219c13f7c68" (UID: "c13265c6-9ee4-4911-823e-1219c13f7c68"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:50:26 crc kubenswrapper[4975]: I0318 12:50:26.772286 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c13265c6-9ee4-4911-823e-1219c13f7c68-kube-api-access-gr9m9" (OuterVolumeSpecName: "kube-api-access-gr9m9") pod "c13265c6-9ee4-4911-823e-1219c13f7c68" (UID: "c13265c6-9ee4-4911-823e-1219c13f7c68"). InnerVolumeSpecName "kube-api-access-gr9m9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:50:26 crc kubenswrapper[4975]: I0318 12:50:26.788800 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c13265c6-9ee4-4911-823e-1219c13f7c68-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c13265c6-9ee4-4911-823e-1219c13f7c68" (UID: "c13265c6-9ee4-4911-823e-1219c13f7c68"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:50:26 crc kubenswrapper[4975]: I0318 12:50:26.866132 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c13265c6-9ee4-4911-823e-1219c13f7c68-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:50:26 crc kubenswrapper[4975]: I0318 12:50:26.866174 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c13265c6-9ee4-4911-823e-1219c13f7c68-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:50:26 crc kubenswrapper[4975]: I0318 12:50:26.866189 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr9m9\" (UniqueName: \"kubernetes.io/projected/c13265c6-9ee4-4911-823e-1219c13f7c68-kube-api-access-gr9m9\") on node \"crc\" DevicePath \"\"" Mar 18 12:50:27 crc kubenswrapper[4975]: I0318 12:50:27.239255 4975 generic.go:334] "Generic (PLEG): container finished" podID="c13265c6-9ee4-4911-823e-1219c13f7c68" containerID="f414a486273e0aa7f366813a1222f727bb0fbb0f4ee6e420c33e07f82eac5098" exitCode=0 Mar 18 12:50:27 crc kubenswrapper[4975]: I0318 12:50:27.239324 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9jks" event={"ID":"c13265c6-9ee4-4911-823e-1219c13f7c68","Type":"ContainerDied","Data":"f414a486273e0aa7f366813a1222f727bb0fbb0f4ee6e420c33e07f82eac5098"} Mar 18 12:50:27 crc kubenswrapper[4975]: I0318 12:50:27.239366 4975 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-r9jks" event={"ID":"c13265c6-9ee4-4911-823e-1219c13f7c68","Type":"ContainerDied","Data":"05f46efaf23867d9b0f7c7246f88b9d001a302a5b9eaaae6ba1d06fdbae7bd9e"} Mar 18 12:50:27 crc kubenswrapper[4975]: I0318 12:50:27.239400 4975 scope.go:117] "RemoveContainer" containerID="f414a486273e0aa7f366813a1222f727bb0fbb0f4ee6e420c33e07f82eac5098" Mar 18 12:50:27 crc kubenswrapper[4975]: I0318 12:50:27.239331 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r9jks" Mar 18 12:50:27 crc kubenswrapper[4975]: I0318 12:50:27.267034 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r9jks"] Mar 18 12:50:27 crc kubenswrapper[4975]: I0318 12:50:27.273693 4975 scope.go:117] "RemoveContainer" containerID="358141ca99a0dda6f728252850ebf9822b59c4a8871ac055f3911963a752acfd" Mar 18 12:50:27 crc kubenswrapper[4975]: I0318 12:50:27.275659 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-r9jks"] Mar 18 12:50:27 crc kubenswrapper[4975]: I0318 12:50:27.295014 4975 scope.go:117] "RemoveContainer" containerID="b770bd5530e54ca648900cbabbb44a8352f83a10265f5e5fd1cb74d870fcbacf" Mar 18 12:50:27 crc kubenswrapper[4975]: I0318 12:50:27.348270 4975 scope.go:117] "RemoveContainer" containerID="f414a486273e0aa7f366813a1222f727bb0fbb0f4ee6e420c33e07f82eac5098" Mar 18 12:50:27 crc kubenswrapper[4975]: E0318 12:50:27.348834 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f414a486273e0aa7f366813a1222f727bb0fbb0f4ee6e420c33e07f82eac5098\": container with ID starting with f414a486273e0aa7f366813a1222f727bb0fbb0f4ee6e420c33e07f82eac5098 not found: ID does not exist" containerID="f414a486273e0aa7f366813a1222f727bb0fbb0f4ee6e420c33e07f82eac5098" Mar 18 12:50:27 crc kubenswrapper[4975]: I0318 12:50:27.348948 4975 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f414a486273e0aa7f366813a1222f727bb0fbb0f4ee6e420c33e07f82eac5098"} err="failed to get container status \"f414a486273e0aa7f366813a1222f727bb0fbb0f4ee6e420c33e07f82eac5098\": rpc error: code = NotFound desc = could not find container \"f414a486273e0aa7f366813a1222f727bb0fbb0f4ee6e420c33e07f82eac5098\": container with ID starting with f414a486273e0aa7f366813a1222f727bb0fbb0f4ee6e420c33e07f82eac5098 not found: ID does not exist" Mar 18 12:50:27 crc kubenswrapper[4975]: I0318 12:50:27.348985 4975 scope.go:117] "RemoveContainer" containerID="358141ca99a0dda6f728252850ebf9822b59c4a8871ac055f3911963a752acfd" Mar 18 12:50:27 crc kubenswrapper[4975]: E0318 12:50:27.349486 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"358141ca99a0dda6f728252850ebf9822b59c4a8871ac055f3911963a752acfd\": container with ID starting with 358141ca99a0dda6f728252850ebf9822b59c4a8871ac055f3911963a752acfd not found: ID does not exist" containerID="358141ca99a0dda6f728252850ebf9822b59c4a8871ac055f3911963a752acfd" Mar 18 12:50:27 crc kubenswrapper[4975]: I0318 12:50:27.349527 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"358141ca99a0dda6f728252850ebf9822b59c4a8871ac055f3911963a752acfd"} err="failed to get container status \"358141ca99a0dda6f728252850ebf9822b59c4a8871ac055f3911963a752acfd\": rpc error: code = NotFound desc = could not find container \"358141ca99a0dda6f728252850ebf9822b59c4a8871ac055f3911963a752acfd\": container with ID starting with 358141ca99a0dda6f728252850ebf9822b59c4a8871ac055f3911963a752acfd not found: ID does not exist" Mar 18 12:50:27 crc kubenswrapper[4975]: I0318 12:50:27.349560 4975 scope.go:117] "RemoveContainer" containerID="b770bd5530e54ca648900cbabbb44a8352f83a10265f5e5fd1cb74d870fcbacf" Mar 18 12:50:27 crc kubenswrapper[4975]: E0318 
12:50:27.349887 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b770bd5530e54ca648900cbabbb44a8352f83a10265f5e5fd1cb74d870fcbacf\": container with ID starting with b770bd5530e54ca648900cbabbb44a8352f83a10265f5e5fd1cb74d870fcbacf not found: ID does not exist" containerID="b770bd5530e54ca648900cbabbb44a8352f83a10265f5e5fd1cb74d870fcbacf" Mar 18 12:50:27 crc kubenswrapper[4975]: I0318 12:50:27.349916 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b770bd5530e54ca648900cbabbb44a8352f83a10265f5e5fd1cb74d870fcbacf"} err="failed to get container status \"b770bd5530e54ca648900cbabbb44a8352f83a10265f5e5fd1cb74d870fcbacf\": rpc error: code = NotFound desc = could not find container \"b770bd5530e54ca648900cbabbb44a8352f83a10265f5e5fd1cb74d870fcbacf\": container with ID starting with b770bd5530e54ca648900cbabbb44a8352f83a10265f5e5fd1cb74d870fcbacf not found: ID does not exist" Mar 18 12:50:29 crc kubenswrapper[4975]: I0318 12:50:29.030577 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c13265c6-9ee4-4911-823e-1219c13f7c68" path="/var/lib/kubelet/pods/c13265c6-9ee4-4911-823e-1219c13f7c68/volumes" Mar 18 12:50:30 crc kubenswrapper[4975]: E0318 12:50:30.426026 4975 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc13265c6_9ee4_4911_823e_1219c13f7c68.slice/crio-conmon-358141ca99a0dda6f728252850ebf9822b59c4a8871ac055f3911963a752acfd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc13265c6_9ee4_4911_823e_1219c13f7c68.slice/crio-358141ca99a0dda6f728252850ebf9822b59c4a8871ac055f3911963a752acfd.scope\": RecentStats: unable to find data in memory cache]" Mar 18 12:50:34 crc kubenswrapper[4975]: I0318 12:50:34.016530 4975 
scope.go:117] "RemoveContainer" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3" Mar 18 12:50:34 crc kubenswrapper[4975]: E0318 12:50:34.017498 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:50:40 crc kubenswrapper[4975]: E0318 12:50:40.657456 4975 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc13265c6_9ee4_4911_823e_1219c13f7c68.slice/crio-conmon-358141ca99a0dda6f728252850ebf9822b59c4a8871ac055f3911963a752acfd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc13265c6_9ee4_4911_823e_1219c13f7c68.slice/crio-358141ca99a0dda6f728252850ebf9822b59c4a8871ac055f3911963a752acfd.scope\": RecentStats: unable to find data in memory cache]" Mar 18 12:50:47 crc kubenswrapper[4975]: I0318 12:50:47.422188 4975 scope.go:117] "RemoveContainer" containerID="15d8a047563c19f878e2071f8ba3ec0280a5f58e1dc9b03018dc7f7de18d246d" Mar 18 12:50:48 crc kubenswrapper[4975]: I0318 12:50:48.016366 4975 scope.go:117] "RemoveContainer" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3" Mar 18 12:50:48 crc kubenswrapper[4975]: E0318 12:50:48.017072 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:50:50 crc kubenswrapper[4975]: E0318 12:50:50.931999 4975 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc13265c6_9ee4_4911_823e_1219c13f7c68.slice/crio-conmon-358141ca99a0dda6f728252850ebf9822b59c4a8871ac055f3911963a752acfd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc13265c6_9ee4_4911_823e_1219c13f7c68.slice/crio-358141ca99a0dda6f728252850ebf9822b59c4a8871ac055f3911963a752acfd.scope\": RecentStats: unable to find data in memory cache]" Mar 18 12:51:01 crc kubenswrapper[4975]: E0318 12:51:01.188314 4975 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc13265c6_9ee4_4911_823e_1219c13f7c68.slice/crio-358141ca99a0dda6f728252850ebf9822b59c4a8871ac055f3911963a752acfd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc13265c6_9ee4_4911_823e_1219c13f7c68.slice/crio-conmon-358141ca99a0dda6f728252850ebf9822b59c4a8871ac055f3911963a752acfd.scope\": RecentStats: unable to find data in memory cache]" Mar 18 12:51:03 crc kubenswrapper[4975]: I0318 12:51:03.017262 4975 scope.go:117] "RemoveContainer" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3" Mar 18 12:51:03 crc kubenswrapper[4975]: E0318 12:51:03.017929 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:51:09 crc kubenswrapper[4975]: I0318 12:51:09.677724 4975 generic.go:334] "Generic (PLEG): container finished" podID="ebc50dc0-b3e2-49ba-a6cd-958fc27292ac" containerID="42ec307fd9a8d8e03e44ea5fecb1502e7a83350d7d377a4bfa1588fd981f0391" exitCode=0 Mar 18 12:51:09 crc kubenswrapper[4975]: I0318 12:51:09.677843 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" event={"ID":"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac","Type":"ContainerDied","Data":"42ec307fd9a8d8e03e44ea5fecb1502e7a83350d7d377a4bfa1588fd981f0391"} Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.132800 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.292311 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qk924\" (UniqueName: \"kubernetes.io/projected/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-kube-api-access-qk924\") pod \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\" (UID: \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\") " Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.292953 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-inventory\") pod \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\" (UID: \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\") " Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.293522 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-nova-metadata-neutron-config-0\") pod \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\" (UID: 
\"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\") " Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.294010 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-ssh-key-openstack-edpm-ipam\") pod \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\" (UID: \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\") " Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.294319 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-neutron-ovn-metadata-agent-neutron-config-0\") pod \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\" (UID: \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\") " Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.295422 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-neutron-metadata-combined-ca-bundle\") pod \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\" (UID: \"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac\") " Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.300025 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-kube-api-access-qk924" (OuterVolumeSpecName: "kube-api-access-qk924") pod "ebc50dc0-b3e2-49ba-a6cd-958fc27292ac" (UID: "ebc50dc0-b3e2-49ba-a6cd-958fc27292ac"). InnerVolumeSpecName "kube-api-access-qk924". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.302774 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "ebc50dc0-b3e2-49ba-a6cd-958fc27292ac" (UID: "ebc50dc0-b3e2-49ba-a6cd-958fc27292ac"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.326184 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "ebc50dc0-b3e2-49ba-a6cd-958fc27292ac" (UID: "ebc50dc0-b3e2-49ba-a6cd-958fc27292ac"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.328502 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-inventory" (OuterVolumeSpecName: "inventory") pod "ebc50dc0-b3e2-49ba-a6cd-958fc27292ac" (UID: "ebc50dc0-b3e2-49ba-a6cd-958fc27292ac"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.337201 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "ebc50dc0-b3e2-49ba-a6cd-958fc27292ac" (UID: "ebc50dc0-b3e2-49ba-a6cd-958fc27292ac"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.355493 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ebc50dc0-b3e2-49ba-a6cd-958fc27292ac" (UID: "ebc50dc0-b3e2-49ba-a6cd-958fc27292ac"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.397988 4975 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.398020 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qk924\" (UniqueName: \"kubernetes.io/projected/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-kube-api-access-qk924\") on node \"crc\" DevicePath \"\"" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.398033 4975 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.398041 4975 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.398050 4975 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:51:11 crc 
kubenswrapper[4975]: I0318 12:51:11.398059 4975 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ebc50dc0-b3e2-49ba-a6cd-958fc27292ac-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:51:11 crc kubenswrapper[4975]: E0318 12:51:11.429839 4975 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc13265c6_9ee4_4911_823e_1219c13f7c68.slice/crio-conmon-358141ca99a0dda6f728252850ebf9822b59c4a8871ac055f3911963a752acfd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc13265c6_9ee4_4911_823e_1219c13f7c68.slice/crio-358141ca99a0dda6f728252850ebf9822b59c4a8871ac055f3911963a752acfd.scope\": RecentStats: unable to find data in memory cache]" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.699962 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" event={"ID":"ebc50dc0-b3e2-49ba-a6cd-958fc27292ac","Type":"ContainerDied","Data":"1d719c61915725701b627e06aa1fe7629ec26660619a8519bd52371a84a259a5"} Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.700358 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d719c61915725701b627e06aa1fe7629ec26660619a8519bd52371a84a259a5" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.700047 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.913492 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx"] Mar 18 12:51:11 crc kubenswrapper[4975]: E0318 12:51:11.914298 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="308e5e3a-6b04-4ec2-9b17-d2f1e9509f29" containerName="extract-content" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.914460 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="308e5e3a-6b04-4ec2-9b17-d2f1e9509f29" containerName="extract-content" Mar 18 12:51:11 crc kubenswrapper[4975]: E0318 12:51:11.914576 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c13265c6-9ee4-4911-823e-1219c13f7c68" containerName="registry-server" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.914659 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="c13265c6-9ee4-4911-823e-1219c13f7c68" containerName="registry-server" Mar 18 12:51:11 crc kubenswrapper[4975]: E0318 12:51:11.914741 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebc50dc0-b3e2-49ba-a6cd-958fc27292ac" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.914809 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebc50dc0-b3e2-49ba-a6cd-958fc27292ac" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 18 12:51:11 crc kubenswrapper[4975]: E0318 12:51:11.914934 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="308e5e3a-6b04-4ec2-9b17-d2f1e9509f29" containerName="extract-utilities" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.915041 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="308e5e3a-6b04-4ec2-9b17-d2f1e9509f29" containerName="extract-utilities" Mar 18 12:51:11 crc kubenswrapper[4975]: E0318 
12:51:11.915347 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c13265c6-9ee4-4911-823e-1219c13f7c68" containerName="extract-content" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.915440 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="c13265c6-9ee4-4911-823e-1219c13f7c68" containerName="extract-content" Mar 18 12:51:11 crc kubenswrapper[4975]: E0318 12:51:11.915523 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c13265c6-9ee4-4911-823e-1219c13f7c68" containerName="extract-utilities" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.915607 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="c13265c6-9ee4-4911-823e-1219c13f7c68" containerName="extract-utilities" Mar 18 12:51:11 crc kubenswrapper[4975]: E0318 12:51:11.915700 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="308e5e3a-6b04-4ec2-9b17-d2f1e9509f29" containerName="registry-server" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.915914 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="308e5e3a-6b04-4ec2-9b17-d2f1e9509f29" containerName="registry-server" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.916242 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="c13265c6-9ee4-4911-823e-1219c13f7c68" containerName="registry-server" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.916322 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebc50dc0-b3e2-49ba-a6cd-958fc27292ac" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.916398 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="308e5e3a-6b04-4ec2-9b17-d2f1e9509f29" containerName="registry-server" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.917150 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.922425 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx"] Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.923282 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.923295 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.923609 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.923651 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-77rz6" Mar 18 12:51:11 crc kubenswrapper[4975]: I0318 12:51:11.923828 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:51:12 crc kubenswrapper[4975]: I0318 12:51:12.007904 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63653681-d7c8-4201-9922-32bfe87bc28f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx\" (UID: \"63653681-d7c8-4201-9922-32bfe87bc28f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx" Mar 18 12:51:12 crc kubenswrapper[4975]: I0318 12:51:12.007975 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/63653681-d7c8-4201-9922-32bfe87bc28f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx\" (UID: \"63653681-d7c8-4201-9922-32bfe87bc28f\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx" Mar 18 12:51:12 crc kubenswrapper[4975]: I0318 12:51:12.008090 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h42lb\" (UniqueName: \"kubernetes.io/projected/63653681-d7c8-4201-9922-32bfe87bc28f-kube-api-access-h42lb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx\" (UID: \"63653681-d7c8-4201-9922-32bfe87bc28f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx" Mar 18 12:51:12 crc kubenswrapper[4975]: I0318 12:51:12.008120 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63653681-d7c8-4201-9922-32bfe87bc28f-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx\" (UID: \"63653681-d7c8-4201-9922-32bfe87bc28f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx" Mar 18 12:51:12 crc kubenswrapper[4975]: I0318 12:51:12.008175 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63653681-d7c8-4201-9922-32bfe87bc28f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx\" (UID: \"63653681-d7c8-4201-9922-32bfe87bc28f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx" Mar 18 12:51:12 crc kubenswrapper[4975]: I0318 12:51:12.109209 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63653681-d7c8-4201-9922-32bfe87bc28f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx\" (UID: \"63653681-d7c8-4201-9922-32bfe87bc28f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx" Mar 18 12:51:12 crc kubenswrapper[4975]: I0318 12:51:12.109641 4975 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63653681-d7c8-4201-9922-32bfe87bc28f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx\" (UID: \"63653681-d7c8-4201-9922-32bfe87bc28f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx" Mar 18 12:51:12 crc kubenswrapper[4975]: I0318 12:51:12.110051 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/63653681-d7c8-4201-9922-32bfe87bc28f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx\" (UID: \"63653681-d7c8-4201-9922-32bfe87bc28f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx" Mar 18 12:51:12 crc kubenswrapper[4975]: I0318 12:51:12.110627 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h42lb\" (UniqueName: \"kubernetes.io/projected/63653681-d7c8-4201-9922-32bfe87bc28f-kube-api-access-h42lb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx\" (UID: \"63653681-d7c8-4201-9922-32bfe87bc28f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx" Mar 18 12:51:12 crc kubenswrapper[4975]: I0318 12:51:12.110712 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63653681-d7c8-4201-9922-32bfe87bc28f-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx\" (UID: \"63653681-d7c8-4201-9922-32bfe87bc28f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx" Mar 18 12:51:12 crc kubenswrapper[4975]: I0318 12:51:12.114431 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/63653681-d7c8-4201-9922-32bfe87bc28f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx\" (UID: 
\"63653681-d7c8-4201-9922-32bfe87bc28f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx" Mar 18 12:51:12 crc kubenswrapper[4975]: I0318 12:51:12.115457 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63653681-d7c8-4201-9922-32bfe87bc28f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx\" (UID: \"63653681-d7c8-4201-9922-32bfe87bc28f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx" Mar 18 12:51:12 crc kubenswrapper[4975]: I0318 12:51:12.118381 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63653681-d7c8-4201-9922-32bfe87bc28f-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx\" (UID: \"63653681-d7c8-4201-9922-32bfe87bc28f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx" Mar 18 12:51:12 crc kubenswrapper[4975]: I0318 12:51:12.118438 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63653681-d7c8-4201-9922-32bfe87bc28f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx\" (UID: \"63653681-d7c8-4201-9922-32bfe87bc28f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx" Mar 18 12:51:12 crc kubenswrapper[4975]: I0318 12:51:12.129684 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h42lb\" (UniqueName: \"kubernetes.io/projected/63653681-d7c8-4201-9922-32bfe87bc28f-kube-api-access-h42lb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx\" (UID: \"63653681-d7c8-4201-9922-32bfe87bc28f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx" Mar 18 12:51:12 crc kubenswrapper[4975]: I0318 12:51:12.245774 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx" Mar 18 12:51:12 crc kubenswrapper[4975]: I0318 12:51:12.742321 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx"] Mar 18 12:51:13 crc kubenswrapper[4975]: I0318 12:51:13.720177 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx" event={"ID":"63653681-d7c8-4201-9922-32bfe87bc28f","Type":"ContainerStarted","Data":"7f032cb30f7daea23026400fd850a94e5c2738e737868b4cfced42f612c6eeb3"} Mar 18 12:51:13 crc kubenswrapper[4975]: I0318 12:51:13.720226 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx" event={"ID":"63653681-d7c8-4201-9922-32bfe87bc28f","Type":"ContainerStarted","Data":"2ef94e2e3a6e079e7b0e5e430d8e5d67eba646112edb16eb4c1b449284b61d3f"} Mar 18 12:51:13 crc kubenswrapper[4975]: I0318 12:51:13.747690 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx" podStartSLOduration=2.436837003 podStartE2EDuration="2.747644887s" podCreationTimestamp="2026-03-18 12:51:11 +0000 UTC" firstStartedPulling="2026-03-18 12:51:12.765847251 +0000 UTC m=+2458.480247830" lastFinishedPulling="2026-03-18 12:51:13.076655135 +0000 UTC m=+2458.791055714" observedRunningTime="2026-03-18 12:51:13.738521479 +0000 UTC m=+2459.452922068" watchObservedRunningTime="2026-03-18 12:51:13.747644887 +0000 UTC m=+2459.462045476" Mar 18 12:51:14 crc kubenswrapper[4975]: I0318 12:51:14.016266 4975 scope.go:117] "RemoveContainer" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3" Mar 18 12:51:14 crc kubenswrapper[4975]: E0318 12:51:14.016530 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:51:27 crc kubenswrapper[4975]: I0318 12:51:27.016784 4975 scope.go:117] "RemoveContainer" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3" Mar 18 12:51:27 crc kubenswrapper[4975]: E0318 12:51:27.017581 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:51:41 crc kubenswrapper[4975]: I0318 12:51:41.016845 4975 scope.go:117] "RemoveContainer" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3" Mar 18 12:51:41 crc kubenswrapper[4975]: E0318 12:51:41.017598 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:51:52 crc kubenswrapper[4975]: I0318 12:51:52.016810 4975 scope.go:117] "RemoveContainer" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3" Mar 18 12:51:52 crc kubenswrapper[4975]: E0318 12:51:52.017646 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:51:54 crc kubenswrapper[4975]: I0318 12:51:54.112699 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pzg4x"] Mar 18 12:51:54 crc kubenswrapper[4975]: I0318 12:51:54.115686 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pzg4x" Mar 18 12:51:54 crc kubenswrapper[4975]: I0318 12:51:54.124698 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pzg4x"] Mar 18 12:51:54 crc kubenswrapper[4975]: I0318 12:51:54.201658 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c608e142-2e3d-4551-858b-771b250b3b12-utilities\") pod \"certified-operators-pzg4x\" (UID: \"c608e142-2e3d-4551-858b-771b250b3b12\") " pod="openshift-marketplace/certified-operators-pzg4x" Mar 18 12:51:54 crc kubenswrapper[4975]: I0318 12:51:54.201901 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd6x2\" (UniqueName: \"kubernetes.io/projected/c608e142-2e3d-4551-858b-771b250b3b12-kube-api-access-kd6x2\") pod \"certified-operators-pzg4x\" (UID: \"c608e142-2e3d-4551-858b-771b250b3b12\") " pod="openshift-marketplace/certified-operators-pzg4x" Mar 18 12:51:54 crc kubenswrapper[4975]: I0318 12:51:54.202009 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c608e142-2e3d-4551-858b-771b250b3b12-catalog-content\") pod \"certified-operators-pzg4x\" (UID: 
\"c608e142-2e3d-4551-858b-771b250b3b12\") " pod="openshift-marketplace/certified-operators-pzg4x" Mar 18 12:51:54 crc kubenswrapper[4975]: I0318 12:51:54.303460 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c608e142-2e3d-4551-858b-771b250b3b12-utilities\") pod \"certified-operators-pzg4x\" (UID: \"c608e142-2e3d-4551-858b-771b250b3b12\") " pod="openshift-marketplace/certified-operators-pzg4x" Mar 18 12:51:54 crc kubenswrapper[4975]: I0318 12:51:54.303549 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd6x2\" (UniqueName: \"kubernetes.io/projected/c608e142-2e3d-4551-858b-771b250b3b12-kube-api-access-kd6x2\") pod \"certified-operators-pzg4x\" (UID: \"c608e142-2e3d-4551-858b-771b250b3b12\") " pod="openshift-marketplace/certified-operators-pzg4x" Mar 18 12:51:54 crc kubenswrapper[4975]: I0318 12:51:54.303578 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c608e142-2e3d-4551-858b-771b250b3b12-catalog-content\") pod \"certified-operators-pzg4x\" (UID: \"c608e142-2e3d-4551-858b-771b250b3b12\") " pod="openshift-marketplace/certified-operators-pzg4x" Mar 18 12:51:54 crc kubenswrapper[4975]: I0318 12:51:54.304000 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c608e142-2e3d-4551-858b-771b250b3b12-utilities\") pod \"certified-operators-pzg4x\" (UID: \"c608e142-2e3d-4551-858b-771b250b3b12\") " pod="openshift-marketplace/certified-operators-pzg4x" Mar 18 12:51:54 crc kubenswrapper[4975]: I0318 12:51:54.304033 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c608e142-2e3d-4551-858b-771b250b3b12-catalog-content\") pod \"certified-operators-pzg4x\" (UID: \"c608e142-2e3d-4551-858b-771b250b3b12\") 
" pod="openshift-marketplace/certified-operators-pzg4x" Mar 18 12:51:54 crc kubenswrapper[4975]: I0318 12:51:54.328130 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd6x2\" (UniqueName: \"kubernetes.io/projected/c608e142-2e3d-4551-858b-771b250b3b12-kube-api-access-kd6x2\") pod \"certified-operators-pzg4x\" (UID: \"c608e142-2e3d-4551-858b-771b250b3b12\") " pod="openshift-marketplace/certified-operators-pzg4x" Mar 18 12:51:54 crc kubenswrapper[4975]: I0318 12:51:54.436490 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pzg4x" Mar 18 12:51:54 crc kubenswrapper[4975]: I0318 12:51:54.971152 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pzg4x"] Mar 18 12:51:55 crc kubenswrapper[4975]: I0318 12:51:55.112680 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pzg4x" event={"ID":"c608e142-2e3d-4551-858b-771b250b3b12","Type":"ContainerStarted","Data":"12e1c151e1ace8199a43852112c6261b8b6426786e6b43ad786a573e456dcad1"} Mar 18 12:51:56 crc kubenswrapper[4975]: I0318 12:51:56.125431 4975 generic.go:334] "Generic (PLEG): container finished" podID="c608e142-2e3d-4551-858b-771b250b3b12" containerID="71854f3b1963372fffda3a3ac1d6407bb648fb144065684587eaa5da21f9bece" exitCode=0 Mar 18 12:51:56 crc kubenswrapper[4975]: I0318 12:51:56.125513 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pzg4x" event={"ID":"c608e142-2e3d-4551-858b-771b250b3b12","Type":"ContainerDied","Data":"71854f3b1963372fffda3a3ac1d6407bb648fb144065684587eaa5da21f9bece"} Mar 18 12:51:56 crc kubenswrapper[4975]: I0318 12:51:56.128718 4975 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 12:51:58 crc kubenswrapper[4975]: I0318 12:51:58.144342 4975 generic.go:334] "Generic (PLEG): container 
finished" podID="c608e142-2e3d-4551-858b-771b250b3b12" containerID="9b5893fde0fa58145791a241d8d35f910990e969b19a5096dae4ad6213738c49" exitCode=0 Mar 18 12:51:58 crc kubenswrapper[4975]: I0318 12:51:58.144402 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pzg4x" event={"ID":"c608e142-2e3d-4551-858b-771b250b3b12","Type":"ContainerDied","Data":"9b5893fde0fa58145791a241d8d35f910990e969b19a5096dae4ad6213738c49"} Mar 18 12:51:59 crc kubenswrapper[4975]: I0318 12:51:59.156002 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pzg4x" event={"ID":"c608e142-2e3d-4551-858b-771b250b3b12","Type":"ContainerStarted","Data":"c92988aab2066e8ef26cf0a3427498680761a8fcc5191c9a89f0198612af66fc"} Mar 18 12:52:00 crc kubenswrapper[4975]: I0318 12:52:00.144036 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pzg4x" podStartSLOduration=3.53264494 podStartE2EDuration="6.14399999s" podCreationTimestamp="2026-03-18 12:51:54 +0000 UTC" firstStartedPulling="2026-03-18 12:51:56.12836157 +0000 UTC m=+2501.842762169" lastFinishedPulling="2026-03-18 12:51:58.73971664 +0000 UTC m=+2504.454117219" observedRunningTime="2026-03-18 12:51:59.17855688 +0000 UTC m=+2504.892957459" watchObservedRunningTime="2026-03-18 12:52:00.14399999 +0000 UTC m=+2505.858400579" Mar 18 12:52:00 crc kubenswrapper[4975]: I0318 12:52:00.155631 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563972-s4w58"] Mar 18 12:52:00 crc kubenswrapper[4975]: I0318 12:52:00.157749 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563972-s4w58" Mar 18 12:52:00 crc kubenswrapper[4975]: I0318 12:52:00.163655 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:52:00 crc kubenswrapper[4975]: I0318 12:52:00.163977 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 12:52:00 crc kubenswrapper[4975]: I0318 12:52:00.164033 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:52:00 crc kubenswrapper[4975]: I0318 12:52:00.165611 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563972-s4w58"] Mar 18 12:52:00 crc kubenswrapper[4975]: I0318 12:52:00.223557 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjrxq\" (UniqueName: \"kubernetes.io/projected/9d350b93-c744-4449-bb20-35210bb3a1f2-kube-api-access-fjrxq\") pod \"auto-csr-approver-29563972-s4w58\" (UID: \"9d350b93-c744-4449-bb20-35210bb3a1f2\") " pod="openshift-infra/auto-csr-approver-29563972-s4w58" Mar 18 12:52:00 crc kubenswrapper[4975]: I0318 12:52:00.325799 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjrxq\" (UniqueName: \"kubernetes.io/projected/9d350b93-c744-4449-bb20-35210bb3a1f2-kube-api-access-fjrxq\") pod \"auto-csr-approver-29563972-s4w58\" (UID: \"9d350b93-c744-4449-bb20-35210bb3a1f2\") " pod="openshift-infra/auto-csr-approver-29563972-s4w58" Mar 18 12:52:00 crc kubenswrapper[4975]: I0318 12:52:00.354394 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjrxq\" (UniqueName: \"kubernetes.io/projected/9d350b93-c744-4449-bb20-35210bb3a1f2-kube-api-access-fjrxq\") pod \"auto-csr-approver-29563972-s4w58\" (UID: \"9d350b93-c744-4449-bb20-35210bb3a1f2\") " 
pod="openshift-infra/auto-csr-approver-29563972-s4w58" Mar 18 12:52:00 crc kubenswrapper[4975]: I0318 12:52:00.555558 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563972-s4w58" Mar 18 12:52:01 crc kubenswrapper[4975]: W0318 12:52:01.047092 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d350b93_c744_4449_bb20_35210bb3a1f2.slice/crio-2f2c001887376ea7e91c2c51deae9e403f095165f52e815a3bf5a22f24f16603 WatchSource:0}: Error finding container 2f2c001887376ea7e91c2c51deae9e403f095165f52e815a3bf5a22f24f16603: Status 404 returned error can't find the container with id 2f2c001887376ea7e91c2c51deae9e403f095165f52e815a3bf5a22f24f16603 Mar 18 12:52:01 crc kubenswrapper[4975]: I0318 12:52:01.053402 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563972-s4w58"] Mar 18 12:52:01 crc kubenswrapper[4975]: I0318 12:52:01.176070 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563972-s4w58" event={"ID":"9d350b93-c744-4449-bb20-35210bb3a1f2","Type":"ContainerStarted","Data":"2f2c001887376ea7e91c2c51deae9e403f095165f52e815a3bf5a22f24f16603"} Mar 18 12:52:04 crc kubenswrapper[4975]: I0318 12:52:04.207988 4975 generic.go:334] "Generic (PLEG): container finished" podID="9d350b93-c744-4449-bb20-35210bb3a1f2" containerID="5ce72b4fd6712fc1dddf3ba25c70fa2c32076464941898322f14dfb1a7696560" exitCode=0 Mar 18 12:52:04 crc kubenswrapper[4975]: I0318 12:52:04.208385 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563972-s4w58" event={"ID":"9d350b93-c744-4449-bb20-35210bb3a1f2","Type":"ContainerDied","Data":"5ce72b4fd6712fc1dddf3ba25c70fa2c32076464941898322f14dfb1a7696560"} Mar 18 12:52:04 crc kubenswrapper[4975]: I0318 12:52:04.436645 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-pzg4x" Mar 18 12:52:04 crc kubenswrapper[4975]: I0318 12:52:04.436764 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pzg4x" Mar 18 12:52:04 crc kubenswrapper[4975]: I0318 12:52:04.488203 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pzg4x" Mar 18 12:52:05 crc kubenswrapper[4975]: I0318 12:52:05.277714 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pzg4x" Mar 18 12:52:05 crc kubenswrapper[4975]: I0318 12:52:05.352414 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pzg4x"] Mar 18 12:52:05 crc kubenswrapper[4975]: I0318 12:52:05.565703 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563972-s4w58" Mar 18 12:52:05 crc kubenswrapper[4975]: I0318 12:52:05.660051 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjrxq\" (UniqueName: \"kubernetes.io/projected/9d350b93-c744-4449-bb20-35210bb3a1f2-kube-api-access-fjrxq\") pod \"9d350b93-c744-4449-bb20-35210bb3a1f2\" (UID: \"9d350b93-c744-4449-bb20-35210bb3a1f2\") " Mar 18 12:52:05 crc kubenswrapper[4975]: I0318 12:52:05.665876 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d350b93-c744-4449-bb20-35210bb3a1f2-kube-api-access-fjrxq" (OuterVolumeSpecName: "kube-api-access-fjrxq") pod "9d350b93-c744-4449-bb20-35210bb3a1f2" (UID: "9d350b93-c744-4449-bb20-35210bb3a1f2"). InnerVolumeSpecName "kube-api-access-fjrxq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:52:05 crc kubenswrapper[4975]: I0318 12:52:05.762462 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjrxq\" (UniqueName: \"kubernetes.io/projected/9d350b93-c744-4449-bb20-35210bb3a1f2-kube-api-access-fjrxq\") on node \"crc\" DevicePath \"\"" Mar 18 12:52:06 crc kubenswrapper[4975]: I0318 12:52:06.016434 4975 scope.go:117] "RemoveContainer" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3" Mar 18 12:52:06 crc kubenswrapper[4975]: E0318 12:52:06.016804 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:52:06 crc kubenswrapper[4975]: I0318 12:52:06.227578 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563972-s4w58" event={"ID":"9d350b93-c744-4449-bb20-35210bb3a1f2","Type":"ContainerDied","Data":"2f2c001887376ea7e91c2c51deae9e403f095165f52e815a3bf5a22f24f16603"} Mar 18 12:52:06 crc kubenswrapper[4975]: I0318 12:52:06.227629 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563972-s4w58" Mar 18 12:52:06 crc kubenswrapper[4975]: I0318 12:52:06.227815 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f2c001887376ea7e91c2c51deae9e403f095165f52e815a3bf5a22f24f16603" Mar 18 12:52:06 crc kubenswrapper[4975]: I0318 12:52:06.646671 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563966-g2gkq"] Mar 18 12:52:06 crc kubenswrapper[4975]: I0318 12:52:06.653350 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563966-g2gkq"] Mar 18 12:52:07 crc kubenswrapper[4975]: I0318 12:52:07.025954 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcb54f59-ccc7-48db-a359-7eb65c270001" path="/var/lib/kubelet/pods/fcb54f59-ccc7-48db-a359-7eb65c270001/volumes" Mar 18 12:52:07 crc kubenswrapper[4975]: I0318 12:52:07.235994 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pzg4x" podUID="c608e142-2e3d-4551-858b-771b250b3b12" containerName="registry-server" containerID="cri-o://c92988aab2066e8ef26cf0a3427498680761a8fcc5191c9a89f0198612af66fc" gracePeriod=2 Mar 18 12:52:08 crc kubenswrapper[4975]: I0318 12:52:08.263641 4975 generic.go:334] "Generic (PLEG): container finished" podID="c608e142-2e3d-4551-858b-771b250b3b12" containerID="c92988aab2066e8ef26cf0a3427498680761a8fcc5191c9a89f0198612af66fc" exitCode=0 Mar 18 12:52:08 crc kubenswrapper[4975]: I0318 12:52:08.263812 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pzg4x" event={"ID":"c608e142-2e3d-4551-858b-771b250b3b12","Type":"ContainerDied","Data":"c92988aab2066e8ef26cf0a3427498680761a8fcc5191c9a89f0198612af66fc"} Mar 18 12:52:08 crc kubenswrapper[4975]: I0318 12:52:08.440292 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pzg4x" Mar 18 12:52:08 crc kubenswrapper[4975]: I0318 12:52:08.637715 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c608e142-2e3d-4551-858b-771b250b3b12-utilities\") pod \"c608e142-2e3d-4551-858b-771b250b3b12\" (UID: \"c608e142-2e3d-4551-858b-771b250b3b12\") " Mar 18 12:52:08 crc kubenswrapper[4975]: I0318 12:52:08.637776 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c608e142-2e3d-4551-858b-771b250b3b12-catalog-content\") pod \"c608e142-2e3d-4551-858b-771b250b3b12\" (UID: \"c608e142-2e3d-4551-858b-771b250b3b12\") " Mar 18 12:52:08 crc kubenswrapper[4975]: I0318 12:52:08.637808 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd6x2\" (UniqueName: \"kubernetes.io/projected/c608e142-2e3d-4551-858b-771b250b3b12-kube-api-access-kd6x2\") pod \"c608e142-2e3d-4551-858b-771b250b3b12\" (UID: \"c608e142-2e3d-4551-858b-771b250b3b12\") " Mar 18 12:52:08 crc kubenswrapper[4975]: I0318 12:52:08.639624 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c608e142-2e3d-4551-858b-771b250b3b12-utilities" (OuterVolumeSpecName: "utilities") pod "c608e142-2e3d-4551-858b-771b250b3b12" (UID: "c608e142-2e3d-4551-858b-771b250b3b12"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:52:08 crc kubenswrapper[4975]: I0318 12:52:08.645359 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c608e142-2e3d-4551-858b-771b250b3b12-kube-api-access-kd6x2" (OuterVolumeSpecName: "kube-api-access-kd6x2") pod "c608e142-2e3d-4551-858b-771b250b3b12" (UID: "c608e142-2e3d-4551-858b-771b250b3b12"). InnerVolumeSpecName "kube-api-access-kd6x2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:52:08 crc kubenswrapper[4975]: I0318 12:52:08.740557 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd6x2\" (UniqueName: \"kubernetes.io/projected/c608e142-2e3d-4551-858b-771b250b3b12-kube-api-access-kd6x2\") on node \"crc\" DevicePath \"\"" Mar 18 12:52:08 crc kubenswrapper[4975]: I0318 12:52:08.740615 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c608e142-2e3d-4551-858b-771b250b3b12-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:52:09 crc kubenswrapper[4975]: I0318 12:52:09.081832 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c608e142-2e3d-4551-858b-771b250b3b12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c608e142-2e3d-4551-858b-771b250b3b12" (UID: "c608e142-2e3d-4551-858b-771b250b3b12"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:52:09 crc kubenswrapper[4975]: I0318 12:52:09.149657 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c608e142-2e3d-4551-858b-771b250b3b12-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:52:09 crc kubenswrapper[4975]: I0318 12:52:09.277274 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pzg4x" event={"ID":"c608e142-2e3d-4551-858b-771b250b3b12","Type":"ContainerDied","Data":"12e1c151e1ace8199a43852112c6261b8b6426786e6b43ad786a573e456dcad1"} Mar 18 12:52:09 crc kubenswrapper[4975]: I0318 12:52:09.277351 4975 scope.go:117] "RemoveContainer" containerID="c92988aab2066e8ef26cf0a3427498680761a8fcc5191c9a89f0198612af66fc" Mar 18 12:52:09 crc kubenswrapper[4975]: I0318 12:52:09.277552 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pzg4x" Mar 18 12:52:09 crc kubenswrapper[4975]: I0318 12:52:09.309070 4975 scope.go:117] "RemoveContainer" containerID="9b5893fde0fa58145791a241d8d35f910990e969b19a5096dae4ad6213738c49" Mar 18 12:52:09 crc kubenswrapper[4975]: I0318 12:52:09.336020 4975 scope.go:117] "RemoveContainer" containerID="71854f3b1963372fffda3a3ac1d6407bb648fb144065684587eaa5da21f9bece" Mar 18 12:52:09 crc kubenswrapper[4975]: I0318 12:52:09.345937 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pzg4x"] Mar 18 12:52:09 crc kubenswrapper[4975]: I0318 12:52:09.357271 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pzg4x"] Mar 18 12:52:11 crc kubenswrapper[4975]: I0318 12:52:11.034209 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c608e142-2e3d-4551-858b-771b250b3b12" path="/var/lib/kubelet/pods/c608e142-2e3d-4551-858b-771b250b3b12/volumes" Mar 18 12:52:19 crc kubenswrapper[4975]: I0318 12:52:19.017209 4975 scope.go:117] "RemoveContainer" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3" Mar 18 12:52:19 crc kubenswrapper[4975]: E0318 12:52:19.018144 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:52:30 crc kubenswrapper[4975]: I0318 12:52:30.016931 4975 scope.go:117] "RemoveContainer" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3" Mar 18 12:52:30 crc kubenswrapper[4975]: E0318 12:52:30.017922 4975 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:52:42 crc kubenswrapper[4975]: I0318 12:52:42.016842 4975 scope.go:117] "RemoveContainer" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3" Mar 18 12:52:42 crc kubenswrapper[4975]: E0318 12:52:42.019179 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:52:47 crc kubenswrapper[4975]: I0318 12:52:47.111166 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hc2sz"] Mar 18 12:52:47 crc kubenswrapper[4975]: E0318 12:52:47.112584 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d350b93-c744-4449-bb20-35210bb3a1f2" containerName="oc" Mar 18 12:52:47 crc kubenswrapper[4975]: I0318 12:52:47.112609 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d350b93-c744-4449-bb20-35210bb3a1f2" containerName="oc" Mar 18 12:52:47 crc kubenswrapper[4975]: E0318 12:52:47.112634 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c608e142-2e3d-4551-858b-771b250b3b12" containerName="extract-content" Mar 18 12:52:47 crc kubenswrapper[4975]: I0318 12:52:47.112647 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="c608e142-2e3d-4551-858b-771b250b3b12" containerName="extract-content" Mar 18 
12:52:47 crc kubenswrapper[4975]: E0318 12:52:47.112674 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c608e142-2e3d-4551-858b-771b250b3b12" containerName="registry-server" Mar 18 12:52:47 crc kubenswrapper[4975]: I0318 12:52:47.112687 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="c608e142-2e3d-4551-858b-771b250b3b12" containerName="registry-server" Mar 18 12:52:47 crc kubenswrapper[4975]: E0318 12:52:47.112729 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c608e142-2e3d-4551-858b-771b250b3b12" containerName="extract-utilities" Mar 18 12:52:47 crc kubenswrapper[4975]: I0318 12:52:47.112741 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="c608e142-2e3d-4551-858b-771b250b3b12" containerName="extract-utilities" Mar 18 12:52:47 crc kubenswrapper[4975]: I0318 12:52:47.113225 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="c608e142-2e3d-4551-858b-771b250b3b12" containerName="registry-server" Mar 18 12:52:47 crc kubenswrapper[4975]: I0318 12:52:47.113273 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d350b93-c744-4449-bb20-35210bb3a1f2" containerName="oc" Mar 18 12:52:47 crc kubenswrapper[4975]: I0318 12:52:47.115660 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hc2sz" Mar 18 12:52:47 crc kubenswrapper[4975]: I0318 12:52:47.125075 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hc2sz"] Mar 18 12:52:47 crc kubenswrapper[4975]: I0318 12:52:47.184608 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldpf6\" (UniqueName: \"kubernetes.io/projected/f1008e81-b4e6-4c45-b0dc-0568e44a9df4-kube-api-access-ldpf6\") pod \"redhat-operators-hc2sz\" (UID: \"f1008e81-b4e6-4c45-b0dc-0568e44a9df4\") " pod="openshift-marketplace/redhat-operators-hc2sz" Mar 18 12:52:47 crc kubenswrapper[4975]: I0318 12:52:47.184705 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1008e81-b4e6-4c45-b0dc-0568e44a9df4-catalog-content\") pod \"redhat-operators-hc2sz\" (UID: \"f1008e81-b4e6-4c45-b0dc-0568e44a9df4\") " pod="openshift-marketplace/redhat-operators-hc2sz" Mar 18 12:52:47 crc kubenswrapper[4975]: I0318 12:52:47.184747 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1008e81-b4e6-4c45-b0dc-0568e44a9df4-utilities\") pod \"redhat-operators-hc2sz\" (UID: \"f1008e81-b4e6-4c45-b0dc-0568e44a9df4\") " pod="openshift-marketplace/redhat-operators-hc2sz" Mar 18 12:52:47 crc kubenswrapper[4975]: I0318 12:52:47.286060 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldpf6\" (UniqueName: \"kubernetes.io/projected/f1008e81-b4e6-4c45-b0dc-0568e44a9df4-kube-api-access-ldpf6\") pod \"redhat-operators-hc2sz\" (UID: \"f1008e81-b4e6-4c45-b0dc-0568e44a9df4\") " pod="openshift-marketplace/redhat-operators-hc2sz" Mar 18 12:52:47 crc kubenswrapper[4975]: I0318 12:52:47.286175 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1008e81-b4e6-4c45-b0dc-0568e44a9df4-catalog-content\") pod \"redhat-operators-hc2sz\" (UID: \"f1008e81-b4e6-4c45-b0dc-0568e44a9df4\") " pod="openshift-marketplace/redhat-operators-hc2sz" Mar 18 12:52:47 crc kubenswrapper[4975]: I0318 12:52:47.286212 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1008e81-b4e6-4c45-b0dc-0568e44a9df4-utilities\") pod \"redhat-operators-hc2sz\" (UID: \"f1008e81-b4e6-4c45-b0dc-0568e44a9df4\") " pod="openshift-marketplace/redhat-operators-hc2sz" Mar 18 12:52:47 crc kubenswrapper[4975]: I0318 12:52:47.286822 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1008e81-b4e6-4c45-b0dc-0568e44a9df4-utilities\") pod \"redhat-operators-hc2sz\" (UID: \"f1008e81-b4e6-4c45-b0dc-0568e44a9df4\") " pod="openshift-marketplace/redhat-operators-hc2sz" Mar 18 12:52:47 crc kubenswrapper[4975]: I0318 12:52:47.286848 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1008e81-b4e6-4c45-b0dc-0568e44a9df4-catalog-content\") pod \"redhat-operators-hc2sz\" (UID: \"f1008e81-b4e6-4c45-b0dc-0568e44a9df4\") " pod="openshift-marketplace/redhat-operators-hc2sz" Mar 18 12:52:47 crc kubenswrapper[4975]: I0318 12:52:47.306214 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldpf6\" (UniqueName: \"kubernetes.io/projected/f1008e81-b4e6-4c45-b0dc-0568e44a9df4-kube-api-access-ldpf6\") pod \"redhat-operators-hc2sz\" (UID: \"f1008e81-b4e6-4c45-b0dc-0568e44a9df4\") " pod="openshift-marketplace/redhat-operators-hc2sz" Mar 18 12:52:47 crc kubenswrapper[4975]: I0318 12:52:47.483642 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hc2sz" Mar 18 12:52:47 crc kubenswrapper[4975]: I0318 12:52:47.582074 4975 scope.go:117] "RemoveContainer" containerID="6a5f5008b343d134c1e07a951ef72843112e2d6a6f79b89508a2639e6d63f409" Mar 18 12:52:47 crc kubenswrapper[4975]: I0318 12:52:47.940890 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hc2sz"] Mar 18 12:52:48 crc kubenswrapper[4975]: I0318 12:52:48.688050 4975 generic.go:334] "Generic (PLEG): container finished" podID="f1008e81-b4e6-4c45-b0dc-0568e44a9df4" containerID="9533840aa4c14cf9b8807b8c5fdde57dbfa63b8a2930dc470e1394aefacb3acf" exitCode=0 Mar 18 12:52:48 crc kubenswrapper[4975]: I0318 12:52:48.688140 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hc2sz" event={"ID":"f1008e81-b4e6-4c45-b0dc-0568e44a9df4","Type":"ContainerDied","Data":"9533840aa4c14cf9b8807b8c5fdde57dbfa63b8a2930dc470e1394aefacb3acf"} Mar 18 12:52:48 crc kubenswrapper[4975]: I0318 12:52:48.688324 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hc2sz" event={"ID":"f1008e81-b4e6-4c45-b0dc-0568e44a9df4","Type":"ContainerStarted","Data":"d340a82f0b4b8d5bb1c1581c5509a53e99d64849d376b14980ac822982ff9fd0"} Mar 18 12:52:50 crc kubenswrapper[4975]: I0318 12:52:50.721200 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hc2sz" event={"ID":"f1008e81-b4e6-4c45-b0dc-0568e44a9df4","Type":"ContainerStarted","Data":"2ba0b8560732fde56244bf473839ac90c99fe0e3655a01c01cd2eb14752f2bcf"} Mar 18 12:52:51 crc kubenswrapper[4975]: I0318 12:52:51.733003 4975 generic.go:334] "Generic (PLEG): container finished" podID="f1008e81-b4e6-4c45-b0dc-0568e44a9df4" containerID="2ba0b8560732fde56244bf473839ac90c99fe0e3655a01c01cd2eb14752f2bcf" exitCode=0 Mar 18 12:52:51 crc kubenswrapper[4975]: I0318 12:52:51.733057 4975 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-hc2sz" event={"ID":"f1008e81-b4e6-4c45-b0dc-0568e44a9df4","Type":"ContainerDied","Data":"2ba0b8560732fde56244bf473839ac90c99fe0e3655a01c01cd2eb14752f2bcf"} Mar 18 12:52:53 crc kubenswrapper[4975]: I0318 12:52:53.752847 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hc2sz" event={"ID":"f1008e81-b4e6-4c45-b0dc-0568e44a9df4","Type":"ContainerStarted","Data":"2099fc6c4f82e63e790ffe3494f86dd8d40aaee58727298634e7e780ff18d097"} Mar 18 12:52:53 crc kubenswrapper[4975]: I0318 12:52:53.775508 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hc2sz" podStartSLOduration=2.21075311 podStartE2EDuration="6.775485571s" podCreationTimestamp="2026-03-18 12:52:47 +0000 UTC" firstStartedPulling="2026-03-18 12:52:48.689955088 +0000 UTC m=+2554.404355667" lastFinishedPulling="2026-03-18 12:52:53.254687549 +0000 UTC m=+2558.969088128" observedRunningTime="2026-03-18 12:52:53.773456656 +0000 UTC m=+2559.487857245" watchObservedRunningTime="2026-03-18 12:52:53.775485571 +0000 UTC m=+2559.489886150" Mar 18 12:52:54 crc kubenswrapper[4975]: I0318 12:52:54.016831 4975 scope.go:117] "RemoveContainer" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3" Mar 18 12:52:54 crc kubenswrapper[4975]: E0318 12:52:54.017185 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 12:52:57 crc kubenswrapper[4975]: I0318 12:52:57.483794 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-hc2sz" Mar 18 12:52:57 crc kubenswrapper[4975]: I0318 12:52:57.484533 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hc2sz" Mar 18 12:52:58 crc kubenswrapper[4975]: I0318 12:52:58.527600 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hc2sz" podUID="f1008e81-b4e6-4c45-b0dc-0568e44a9df4" containerName="registry-server" probeResult="failure" output=< Mar 18 12:52:58 crc kubenswrapper[4975]: timeout: failed to connect service ":50051" within 1s Mar 18 12:52:58 crc kubenswrapper[4975]: > Mar 18 12:53:07 crc kubenswrapper[4975]: I0318 12:53:07.557384 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hc2sz" Mar 18 12:53:07 crc kubenswrapper[4975]: I0318 12:53:07.635484 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hc2sz" Mar 18 12:53:07 crc kubenswrapper[4975]: I0318 12:53:07.817374 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hc2sz"] Mar 18 12:53:08 crc kubenswrapper[4975]: I0318 12:53:08.016747 4975 scope.go:117] "RemoveContainer" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3" Mar 18 12:53:08 crc kubenswrapper[4975]: I0318 12:53:08.950501 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hc2sz" podUID="f1008e81-b4e6-4c45-b0dc-0568e44a9df4" containerName="registry-server" containerID="cri-o://2099fc6c4f82e63e790ffe3494f86dd8d40aaee58727298634e7e780ff18d097" gracePeriod=2 Mar 18 12:53:08 crc kubenswrapper[4975]: I0318 12:53:08.950935 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" 
event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerStarted","Data":"ffa675b5d15e50ee5b5a9fe85b51627382b6e36a8ee201962ea64a4c8b81d353"} Mar 18 12:53:09 crc kubenswrapper[4975]: I0318 12:53:09.489224 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hc2sz" Mar 18 12:53:09 crc kubenswrapper[4975]: I0318 12:53:09.606773 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1008e81-b4e6-4c45-b0dc-0568e44a9df4-catalog-content\") pod \"f1008e81-b4e6-4c45-b0dc-0568e44a9df4\" (UID: \"f1008e81-b4e6-4c45-b0dc-0568e44a9df4\") " Mar 18 12:53:09 crc kubenswrapper[4975]: I0318 12:53:09.606982 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1008e81-b4e6-4c45-b0dc-0568e44a9df4-utilities\") pod \"f1008e81-b4e6-4c45-b0dc-0568e44a9df4\" (UID: \"f1008e81-b4e6-4c45-b0dc-0568e44a9df4\") " Mar 18 12:53:09 crc kubenswrapper[4975]: I0318 12:53:09.607275 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldpf6\" (UniqueName: \"kubernetes.io/projected/f1008e81-b4e6-4c45-b0dc-0568e44a9df4-kube-api-access-ldpf6\") pod \"f1008e81-b4e6-4c45-b0dc-0568e44a9df4\" (UID: \"f1008e81-b4e6-4c45-b0dc-0568e44a9df4\") " Mar 18 12:53:09 crc kubenswrapper[4975]: I0318 12:53:09.610026 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1008e81-b4e6-4c45-b0dc-0568e44a9df4-utilities" (OuterVolumeSpecName: "utilities") pod "f1008e81-b4e6-4c45-b0dc-0568e44a9df4" (UID: "f1008e81-b4e6-4c45-b0dc-0568e44a9df4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:53:09 crc kubenswrapper[4975]: I0318 12:53:09.612420 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1008e81-b4e6-4c45-b0dc-0568e44a9df4-kube-api-access-ldpf6" (OuterVolumeSpecName: "kube-api-access-ldpf6") pod "f1008e81-b4e6-4c45-b0dc-0568e44a9df4" (UID: "f1008e81-b4e6-4c45-b0dc-0568e44a9df4"). InnerVolumeSpecName "kube-api-access-ldpf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:53:09 crc kubenswrapper[4975]: I0318 12:53:09.711398 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1008e81-b4e6-4c45-b0dc-0568e44a9df4-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 12:53:09 crc kubenswrapper[4975]: I0318 12:53:09.711481 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldpf6\" (UniqueName: \"kubernetes.io/projected/f1008e81-b4e6-4c45-b0dc-0568e44a9df4-kube-api-access-ldpf6\") on node \"crc\" DevicePath \"\"" Mar 18 12:53:09 crc kubenswrapper[4975]: I0318 12:53:09.742066 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1008e81-b4e6-4c45-b0dc-0568e44a9df4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1008e81-b4e6-4c45-b0dc-0568e44a9df4" (UID: "f1008e81-b4e6-4c45-b0dc-0568e44a9df4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 12:53:09 crc kubenswrapper[4975]: I0318 12:53:09.814081 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1008e81-b4e6-4c45-b0dc-0568e44a9df4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 12:53:09 crc kubenswrapper[4975]: I0318 12:53:09.963422 4975 generic.go:334] "Generic (PLEG): container finished" podID="f1008e81-b4e6-4c45-b0dc-0568e44a9df4" containerID="2099fc6c4f82e63e790ffe3494f86dd8d40aaee58727298634e7e780ff18d097" exitCode=0 Mar 18 12:53:09 crc kubenswrapper[4975]: I0318 12:53:09.963562 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hc2sz" Mar 18 12:53:09 crc kubenswrapper[4975]: I0318 12:53:09.964700 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hc2sz" event={"ID":"f1008e81-b4e6-4c45-b0dc-0568e44a9df4","Type":"ContainerDied","Data":"2099fc6c4f82e63e790ffe3494f86dd8d40aaee58727298634e7e780ff18d097"} Mar 18 12:53:09 crc kubenswrapper[4975]: I0318 12:53:09.965112 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hc2sz" event={"ID":"f1008e81-b4e6-4c45-b0dc-0568e44a9df4","Type":"ContainerDied","Data":"d340a82f0b4b8d5bb1c1581c5509a53e99d64849d376b14980ac822982ff9fd0"} Mar 18 12:53:09 crc kubenswrapper[4975]: I0318 12:53:09.965204 4975 scope.go:117] "RemoveContainer" containerID="2099fc6c4f82e63e790ffe3494f86dd8d40aaee58727298634e7e780ff18d097" Mar 18 12:53:09 crc kubenswrapper[4975]: I0318 12:53:09.992528 4975 scope.go:117] "RemoveContainer" containerID="2ba0b8560732fde56244bf473839ac90c99fe0e3655a01c01cd2eb14752f2bcf" Mar 18 12:53:10 crc kubenswrapper[4975]: I0318 12:53:10.018147 4975 scope.go:117] "RemoveContainer" containerID="9533840aa4c14cf9b8807b8c5fdde57dbfa63b8a2930dc470e1394aefacb3acf" Mar 18 12:53:10 crc kubenswrapper[4975]: I0318 
12:53:10.083634 4975 scope.go:117] "RemoveContainer" containerID="2099fc6c4f82e63e790ffe3494f86dd8d40aaee58727298634e7e780ff18d097" Mar 18 12:53:10 crc kubenswrapper[4975]: E0318 12:53:10.090177 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2099fc6c4f82e63e790ffe3494f86dd8d40aaee58727298634e7e780ff18d097\": container with ID starting with 2099fc6c4f82e63e790ffe3494f86dd8d40aaee58727298634e7e780ff18d097 not found: ID does not exist" containerID="2099fc6c4f82e63e790ffe3494f86dd8d40aaee58727298634e7e780ff18d097" Mar 18 12:53:10 crc kubenswrapper[4975]: I0318 12:53:10.090225 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2099fc6c4f82e63e790ffe3494f86dd8d40aaee58727298634e7e780ff18d097"} err="failed to get container status \"2099fc6c4f82e63e790ffe3494f86dd8d40aaee58727298634e7e780ff18d097\": rpc error: code = NotFound desc = could not find container \"2099fc6c4f82e63e790ffe3494f86dd8d40aaee58727298634e7e780ff18d097\": container with ID starting with 2099fc6c4f82e63e790ffe3494f86dd8d40aaee58727298634e7e780ff18d097 not found: ID does not exist" Mar 18 12:53:10 crc kubenswrapper[4975]: I0318 12:53:10.090248 4975 scope.go:117] "RemoveContainer" containerID="2ba0b8560732fde56244bf473839ac90c99fe0e3655a01c01cd2eb14752f2bcf" Mar 18 12:53:10 crc kubenswrapper[4975]: E0318 12:53:10.091666 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ba0b8560732fde56244bf473839ac90c99fe0e3655a01c01cd2eb14752f2bcf\": container with ID starting with 2ba0b8560732fde56244bf473839ac90c99fe0e3655a01c01cd2eb14752f2bcf not found: ID does not exist" containerID="2ba0b8560732fde56244bf473839ac90c99fe0e3655a01c01cd2eb14752f2bcf" Mar 18 12:53:10 crc kubenswrapper[4975]: I0318 12:53:10.091691 4975 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2ba0b8560732fde56244bf473839ac90c99fe0e3655a01c01cd2eb14752f2bcf"} err="failed to get container status \"2ba0b8560732fde56244bf473839ac90c99fe0e3655a01c01cd2eb14752f2bcf\": rpc error: code = NotFound desc = could not find container \"2ba0b8560732fde56244bf473839ac90c99fe0e3655a01c01cd2eb14752f2bcf\": container with ID starting with 2ba0b8560732fde56244bf473839ac90c99fe0e3655a01c01cd2eb14752f2bcf not found: ID does not exist" Mar 18 12:53:10 crc kubenswrapper[4975]: I0318 12:53:10.091705 4975 scope.go:117] "RemoveContainer" containerID="9533840aa4c14cf9b8807b8c5fdde57dbfa63b8a2930dc470e1394aefacb3acf" Mar 18 12:53:10 crc kubenswrapper[4975]: E0318 12:53:10.092709 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9533840aa4c14cf9b8807b8c5fdde57dbfa63b8a2930dc470e1394aefacb3acf\": container with ID starting with 9533840aa4c14cf9b8807b8c5fdde57dbfa63b8a2930dc470e1394aefacb3acf not found: ID does not exist" containerID="9533840aa4c14cf9b8807b8c5fdde57dbfa63b8a2930dc470e1394aefacb3acf" Mar 18 12:53:10 crc kubenswrapper[4975]: I0318 12:53:10.092733 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9533840aa4c14cf9b8807b8c5fdde57dbfa63b8a2930dc470e1394aefacb3acf"} err="failed to get container status \"9533840aa4c14cf9b8807b8c5fdde57dbfa63b8a2930dc470e1394aefacb3acf\": rpc error: code = NotFound desc = could not find container \"9533840aa4c14cf9b8807b8c5fdde57dbfa63b8a2930dc470e1394aefacb3acf\": container with ID starting with 9533840aa4c14cf9b8807b8c5fdde57dbfa63b8a2930dc470e1394aefacb3acf not found: ID does not exist" Mar 18 12:53:10 crc kubenswrapper[4975]: I0318 12:53:10.094707 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hc2sz"] Mar 18 12:53:10 crc kubenswrapper[4975]: I0318 12:53:10.105276 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-operators-hc2sz"] Mar 18 12:53:11 crc kubenswrapper[4975]: I0318 12:53:11.035360 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1008e81-b4e6-4c45-b0dc-0568e44a9df4" path="/var/lib/kubelet/pods/f1008e81-b4e6-4c45-b0dc-0568e44a9df4/volumes" Mar 18 12:54:00 crc kubenswrapper[4975]: I0318 12:54:00.158099 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563974-qfqld"] Mar 18 12:54:00 crc kubenswrapper[4975]: E0318 12:54:00.159182 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1008e81-b4e6-4c45-b0dc-0568e44a9df4" containerName="registry-server" Mar 18 12:54:00 crc kubenswrapper[4975]: I0318 12:54:00.159198 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1008e81-b4e6-4c45-b0dc-0568e44a9df4" containerName="registry-server" Mar 18 12:54:00 crc kubenswrapper[4975]: E0318 12:54:00.159231 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1008e81-b4e6-4c45-b0dc-0568e44a9df4" containerName="extract-utilities" Mar 18 12:54:00 crc kubenswrapper[4975]: I0318 12:54:00.159309 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1008e81-b4e6-4c45-b0dc-0568e44a9df4" containerName="extract-utilities" Mar 18 12:54:00 crc kubenswrapper[4975]: E0318 12:54:00.159330 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1008e81-b4e6-4c45-b0dc-0568e44a9df4" containerName="extract-content" Mar 18 12:54:00 crc kubenswrapper[4975]: I0318 12:54:00.159336 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1008e81-b4e6-4c45-b0dc-0568e44a9df4" containerName="extract-content" Mar 18 12:54:00 crc kubenswrapper[4975]: I0318 12:54:00.159555 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1008e81-b4e6-4c45-b0dc-0568e44a9df4" containerName="registry-server" Mar 18 12:54:00 crc kubenswrapper[4975]: I0318 12:54:00.160270 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563974-qfqld" Mar 18 12:54:00 crc kubenswrapper[4975]: I0318 12:54:00.162547 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 12:54:00 crc kubenswrapper[4975]: I0318 12:54:00.162746 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:54:00 crc kubenswrapper[4975]: I0318 12:54:00.163124 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:54:00 crc kubenswrapper[4975]: I0318 12:54:00.174427 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563974-qfqld"] Mar 18 12:54:00 crc kubenswrapper[4975]: I0318 12:54:00.192360 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqdgq\" (UniqueName: \"kubernetes.io/projected/9d1b19c8-648a-4152-937f-93b1a5aea846-kube-api-access-cqdgq\") pod \"auto-csr-approver-29563974-qfqld\" (UID: \"9d1b19c8-648a-4152-937f-93b1a5aea846\") " pod="openshift-infra/auto-csr-approver-29563974-qfqld" Mar 18 12:54:00 crc kubenswrapper[4975]: I0318 12:54:00.293653 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqdgq\" (UniqueName: \"kubernetes.io/projected/9d1b19c8-648a-4152-937f-93b1a5aea846-kube-api-access-cqdgq\") pod \"auto-csr-approver-29563974-qfqld\" (UID: \"9d1b19c8-648a-4152-937f-93b1a5aea846\") " pod="openshift-infra/auto-csr-approver-29563974-qfqld" Mar 18 12:54:00 crc kubenswrapper[4975]: I0318 12:54:00.323621 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqdgq\" (UniqueName: \"kubernetes.io/projected/9d1b19c8-648a-4152-937f-93b1a5aea846-kube-api-access-cqdgq\") pod \"auto-csr-approver-29563974-qfqld\" (UID: \"9d1b19c8-648a-4152-937f-93b1a5aea846\") " 
pod="openshift-infra/auto-csr-approver-29563974-qfqld" Mar 18 12:54:00 crc kubenswrapper[4975]: I0318 12:54:00.494383 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563974-qfqld" Mar 18 12:54:00 crc kubenswrapper[4975]: I0318 12:54:00.914212 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563974-qfqld"] Mar 18 12:54:01 crc kubenswrapper[4975]: I0318 12:54:01.456500 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563974-qfqld" event={"ID":"9d1b19c8-648a-4152-937f-93b1a5aea846","Type":"ContainerStarted","Data":"c115d51a6c4aa527c4937aa942fbddbc6b6c91b4a2e8456343da38999d78985f"} Mar 18 12:54:03 crc kubenswrapper[4975]: I0318 12:54:03.477899 4975 generic.go:334] "Generic (PLEG): container finished" podID="9d1b19c8-648a-4152-937f-93b1a5aea846" containerID="c17f49e77a8b084025194c527f869359d4e5e79b9fe21442370a2f004d4e9d6f" exitCode=0 Mar 18 12:54:03 crc kubenswrapper[4975]: I0318 12:54:03.478012 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563974-qfqld" event={"ID":"9d1b19c8-648a-4152-937f-93b1a5aea846","Type":"ContainerDied","Data":"c17f49e77a8b084025194c527f869359d4e5e79b9fe21442370a2f004d4e9d6f"} Mar 18 12:54:04 crc kubenswrapper[4975]: I0318 12:54:04.905325 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563974-qfqld" Mar 18 12:54:05 crc kubenswrapper[4975]: I0318 12:54:05.005253 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqdgq\" (UniqueName: \"kubernetes.io/projected/9d1b19c8-648a-4152-937f-93b1a5aea846-kube-api-access-cqdgq\") pod \"9d1b19c8-648a-4152-937f-93b1a5aea846\" (UID: \"9d1b19c8-648a-4152-937f-93b1a5aea846\") " Mar 18 12:54:05 crc kubenswrapper[4975]: I0318 12:54:05.015447 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d1b19c8-648a-4152-937f-93b1a5aea846-kube-api-access-cqdgq" (OuterVolumeSpecName: "kube-api-access-cqdgq") pod "9d1b19c8-648a-4152-937f-93b1a5aea846" (UID: "9d1b19c8-648a-4152-937f-93b1a5aea846"). InnerVolumeSpecName "kube-api-access-cqdgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:54:05 crc kubenswrapper[4975]: I0318 12:54:05.107679 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqdgq\" (UniqueName: \"kubernetes.io/projected/9d1b19c8-648a-4152-937f-93b1a5aea846-kube-api-access-cqdgq\") on node \"crc\" DevicePath \"\"" Mar 18 12:54:05 crc kubenswrapper[4975]: I0318 12:54:05.498052 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563974-qfqld" event={"ID":"9d1b19c8-648a-4152-937f-93b1a5aea846","Type":"ContainerDied","Data":"c115d51a6c4aa527c4937aa942fbddbc6b6c91b4a2e8456343da38999d78985f"} Mar 18 12:54:05 crc kubenswrapper[4975]: I0318 12:54:05.498093 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c115d51a6c4aa527c4937aa942fbddbc6b6c91b4a2e8456343da38999d78985f" Mar 18 12:54:05 crc kubenswrapper[4975]: I0318 12:54:05.498189 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563974-qfqld" Mar 18 12:54:05 crc kubenswrapper[4975]: I0318 12:54:05.966374 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563968-jzvj4"] Mar 18 12:54:05 crc kubenswrapper[4975]: I0318 12:54:05.974332 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563968-jzvj4"] Mar 18 12:54:07 crc kubenswrapper[4975]: I0318 12:54:07.028253 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0f0cc91-cbf4-40dd-aef8-e75eac7fe944" path="/var/lib/kubelet/pods/d0f0cc91-cbf4-40dd-aef8-e75eac7fe944/volumes" Mar 18 12:54:40 crc kubenswrapper[4975]: I0318 12:54:40.853431 4975 generic.go:334] "Generic (PLEG): container finished" podID="63653681-d7c8-4201-9922-32bfe87bc28f" containerID="7f032cb30f7daea23026400fd850a94e5c2738e737868b4cfced42f612c6eeb3" exitCode=2 Mar 18 12:54:40 crc kubenswrapper[4975]: I0318 12:54:40.853563 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx" event={"ID":"63653681-d7c8-4201-9922-32bfe87bc28f","Type":"ContainerDied","Data":"7f032cb30f7daea23026400fd850a94e5c2738e737868b4cfced42f612c6eeb3"} Mar 18 12:54:42 crc kubenswrapper[4975]: I0318 12:54:42.261949 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx" Mar 18 12:54:42 crc kubenswrapper[4975]: I0318 12:54:42.358242 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63653681-d7c8-4201-9922-32bfe87bc28f-inventory\") pod \"63653681-d7c8-4201-9922-32bfe87bc28f\" (UID: \"63653681-d7c8-4201-9922-32bfe87bc28f\") " Mar 18 12:54:42 crc kubenswrapper[4975]: I0318 12:54:42.358585 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/63653681-d7c8-4201-9922-32bfe87bc28f-libvirt-secret-0\") pod \"63653681-d7c8-4201-9922-32bfe87bc28f\" (UID: \"63653681-d7c8-4201-9922-32bfe87bc28f\") " Mar 18 12:54:42 crc kubenswrapper[4975]: I0318 12:54:42.358681 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63653681-d7c8-4201-9922-32bfe87bc28f-ssh-key-openstack-edpm-ipam\") pod \"63653681-d7c8-4201-9922-32bfe87bc28f\" (UID: \"63653681-d7c8-4201-9922-32bfe87bc28f\") " Mar 18 12:54:42 crc kubenswrapper[4975]: I0318 12:54:42.359662 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63653681-d7c8-4201-9922-32bfe87bc28f-libvirt-combined-ca-bundle\") pod \"63653681-d7c8-4201-9922-32bfe87bc28f\" (UID: \"63653681-d7c8-4201-9922-32bfe87bc28f\") " Mar 18 12:54:42 crc kubenswrapper[4975]: I0318 12:54:42.359890 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h42lb\" (UniqueName: \"kubernetes.io/projected/63653681-d7c8-4201-9922-32bfe87bc28f-kube-api-access-h42lb\") pod \"63653681-d7c8-4201-9922-32bfe87bc28f\" (UID: \"63653681-d7c8-4201-9922-32bfe87bc28f\") " Mar 18 12:54:42 crc kubenswrapper[4975]: I0318 12:54:42.364155 4975 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63653681-d7c8-4201-9922-32bfe87bc28f-kube-api-access-h42lb" (OuterVolumeSpecName: "kube-api-access-h42lb") pod "63653681-d7c8-4201-9922-32bfe87bc28f" (UID: "63653681-d7c8-4201-9922-32bfe87bc28f"). InnerVolumeSpecName "kube-api-access-h42lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:54:42 crc kubenswrapper[4975]: I0318 12:54:42.364405 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63653681-d7c8-4201-9922-32bfe87bc28f-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "63653681-d7c8-4201-9922-32bfe87bc28f" (UID: "63653681-d7c8-4201-9922-32bfe87bc28f"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:54:42 crc kubenswrapper[4975]: I0318 12:54:42.395549 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63653681-d7c8-4201-9922-32bfe87bc28f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "63653681-d7c8-4201-9922-32bfe87bc28f" (UID: "63653681-d7c8-4201-9922-32bfe87bc28f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:54:42 crc kubenswrapper[4975]: I0318 12:54:42.396551 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63653681-d7c8-4201-9922-32bfe87bc28f-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "63653681-d7c8-4201-9922-32bfe87bc28f" (UID: "63653681-d7c8-4201-9922-32bfe87bc28f"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:54:42 crc kubenswrapper[4975]: I0318 12:54:42.399805 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63653681-d7c8-4201-9922-32bfe87bc28f-inventory" (OuterVolumeSpecName: "inventory") pod "63653681-d7c8-4201-9922-32bfe87bc28f" (UID: "63653681-d7c8-4201-9922-32bfe87bc28f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:54:42 crc kubenswrapper[4975]: I0318 12:54:42.461479 4975 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63653681-d7c8-4201-9922-32bfe87bc28f-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:54:42 crc kubenswrapper[4975]: I0318 12:54:42.461511 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h42lb\" (UniqueName: \"kubernetes.io/projected/63653681-d7c8-4201-9922-32bfe87bc28f-kube-api-access-h42lb\") on node \"crc\" DevicePath \"\"" Mar 18 12:54:42 crc kubenswrapper[4975]: I0318 12:54:42.461521 4975 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63653681-d7c8-4201-9922-32bfe87bc28f-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 12:54:42 crc kubenswrapper[4975]: I0318 12:54:42.461529 4975 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/63653681-d7c8-4201-9922-32bfe87bc28f-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:54:42 crc kubenswrapper[4975]: I0318 12:54:42.461539 4975 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63653681-d7c8-4201-9922-32bfe87bc28f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:54:42 crc kubenswrapper[4975]: I0318 12:54:42.876502 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx" event={"ID":"63653681-d7c8-4201-9922-32bfe87bc28f","Type":"ContainerDied","Data":"2ef94e2e3a6e079e7b0e5e430d8e5d67eba646112edb16eb4c1b449284b61d3f"} Mar 18 12:54:42 crc kubenswrapper[4975]: I0318 12:54:42.876560 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ef94e2e3a6e079e7b0e5e430d8e5d67eba646112edb16eb4c1b449284b61d3f" Mar 18 12:54:42 crc kubenswrapper[4975]: I0318 12:54:42.876645 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx" Mar 18 12:54:47 crc kubenswrapper[4975]: I0318 12:54:47.764541 4975 scope.go:117] "RemoveContainer" containerID="f51893106ca186c5877fb52ab49d03caa19bc36ffb0ac3f34440efab822c6e40" Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.034090 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9"] Mar 18 12:54:49 crc kubenswrapper[4975]: E0318 12:54:49.034942 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63653681-d7c8-4201-9922-32bfe87bc28f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.034962 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="63653681-d7c8-4201-9922-32bfe87bc28f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 12:54:49 crc kubenswrapper[4975]: E0318 12:54:49.035008 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d1b19c8-648a-4152-937f-93b1a5aea846" containerName="oc" Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.035017 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d1b19c8-648a-4152-937f-93b1a5aea846" containerName="oc" Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.035231 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d1b19c8-648a-4152-937f-93b1a5aea846" 
containerName="oc" Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.035249 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="63653681-d7c8-4201-9922-32bfe87bc28f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.036043 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9" Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.038835 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.039027 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.039223 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.041526 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-77rz6" Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.099998 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.115125 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9"] Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.206984 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9\" (UID: \"06a6be48-3e59-499f-b3aa-f0a6f9bbe812\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9" Mar 18 
12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.207051 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9\" (UID: \"06a6be48-3e59-499f-b3aa-f0a6f9bbe812\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9" Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.207083 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9\" (UID: \"06a6be48-3e59-499f-b3aa-f0a6f9bbe812\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9" Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.207160 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9\" (UID: \"06a6be48-3e59-499f-b3aa-f0a6f9bbe812\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9" Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.207185 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfvds\" (UniqueName: \"kubernetes.io/projected/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-kube-api-access-tfvds\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9\" (UID: \"06a6be48-3e59-499f-b3aa-f0a6f9bbe812\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9" Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.308594 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9\" (UID: \"06a6be48-3e59-499f-b3aa-f0a6f9bbe812\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9" Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.308683 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9\" (UID: \"06a6be48-3e59-499f-b3aa-f0a6f9bbe812\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9" Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.308732 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9\" (UID: \"06a6be48-3e59-499f-b3aa-f0a6f9bbe812\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9" Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.308950 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9\" (UID: \"06a6be48-3e59-499f-b3aa-f0a6f9bbe812\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9" Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.309107 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfvds\" (UniqueName: \"kubernetes.io/projected/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-kube-api-access-tfvds\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9\" (UID: \"06a6be48-3e59-499f-b3aa-f0a6f9bbe812\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9" Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.315249 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9\" (UID: \"06a6be48-3e59-499f-b3aa-f0a6f9bbe812\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9" Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.315464 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9\" (UID: \"06a6be48-3e59-499f-b3aa-f0a6f9bbe812\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9" Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.316023 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9\" (UID: \"06a6be48-3e59-499f-b3aa-f0a6f9bbe812\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9" Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.317020 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9\" (UID: \"06a6be48-3e59-499f-b3aa-f0a6f9bbe812\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9" Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.325353 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfvds\" (UniqueName: 
\"kubernetes.io/projected/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-kube-api-access-tfvds\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9\" (UID: \"06a6be48-3e59-499f-b3aa-f0a6f9bbe812\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9" Mar 18 12:54:49 crc kubenswrapper[4975]: I0318 12:54:49.411784 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9" Mar 18 12:54:50 crc kubenswrapper[4975]: I0318 12:54:50.047653 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9"] Mar 18 12:54:50 crc kubenswrapper[4975]: I0318 12:54:50.952830 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9" event={"ID":"06a6be48-3e59-499f-b3aa-f0a6f9bbe812","Type":"ContainerStarted","Data":"4d9844a2fd7b0cf4d88afde4c3ae3edf8df5b64f76ef24c061c73da91d551220"} Mar 18 12:54:51 crc kubenswrapper[4975]: I0318 12:54:51.972911 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9" event={"ID":"06a6be48-3e59-499f-b3aa-f0a6f9bbe812","Type":"ContainerStarted","Data":"eefe0e4fc8bd21a8b1842de312a074e740549893b41de3c3d34c086c58115275"} Mar 18 12:54:51 crc kubenswrapper[4975]: I0318 12:54:51.994734 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9" podStartSLOduration=1.9744567 podStartE2EDuration="2.994698812s" podCreationTimestamp="2026-03-18 12:54:49 +0000 UTC" firstStartedPulling="2026-03-18 12:54:50.047429586 +0000 UTC m=+2675.761830165" lastFinishedPulling="2026-03-18 12:54:51.067671698 +0000 UTC m=+2676.782072277" observedRunningTime="2026-03-18 12:54:51.989148601 +0000 UTC m=+2677.703549190" watchObservedRunningTime="2026-03-18 12:54:51.994698812 +0000 UTC m=+2677.709099391" Mar 18 12:55:25 crc 
kubenswrapper[4975]: I0318 12:55:25.538959 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:55:25 crc kubenswrapper[4975]: I0318 12:55:25.539551 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:55:55 crc kubenswrapper[4975]: I0318 12:55:55.539238 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:55:55 crc kubenswrapper[4975]: I0318 12:55:55.539814 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:56:00 crc kubenswrapper[4975]: I0318 12:56:00.152268 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563976-2dbcd"] Mar 18 12:56:00 crc kubenswrapper[4975]: I0318 12:56:00.154143 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563976-2dbcd" Mar 18 12:56:00 crc kubenswrapper[4975]: I0318 12:56:00.158342 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 12:56:00 crc kubenswrapper[4975]: I0318 12:56:00.158452 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 12:56:00 crc kubenswrapper[4975]: I0318 12:56:00.160833 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 12:56:00 crc kubenswrapper[4975]: I0318 12:56:00.166745 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563976-2dbcd"] Mar 18 12:56:00 crc kubenswrapper[4975]: I0318 12:56:00.249995 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6x5x\" (UniqueName: \"kubernetes.io/projected/ccb66d3e-3be8-4208-8e0b-0fbc50cf26ea-kube-api-access-p6x5x\") pod \"auto-csr-approver-29563976-2dbcd\" (UID: \"ccb66d3e-3be8-4208-8e0b-0fbc50cf26ea\") " pod="openshift-infra/auto-csr-approver-29563976-2dbcd" Mar 18 12:56:00 crc kubenswrapper[4975]: I0318 12:56:00.352148 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6x5x\" (UniqueName: \"kubernetes.io/projected/ccb66d3e-3be8-4208-8e0b-0fbc50cf26ea-kube-api-access-p6x5x\") pod \"auto-csr-approver-29563976-2dbcd\" (UID: \"ccb66d3e-3be8-4208-8e0b-0fbc50cf26ea\") " pod="openshift-infra/auto-csr-approver-29563976-2dbcd" Mar 18 12:56:00 crc kubenswrapper[4975]: I0318 12:56:00.369608 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6x5x\" (UniqueName: \"kubernetes.io/projected/ccb66d3e-3be8-4208-8e0b-0fbc50cf26ea-kube-api-access-p6x5x\") pod \"auto-csr-approver-29563976-2dbcd\" (UID: \"ccb66d3e-3be8-4208-8e0b-0fbc50cf26ea\") " 
pod="openshift-infra/auto-csr-approver-29563976-2dbcd" Mar 18 12:56:00 crc kubenswrapper[4975]: I0318 12:56:00.475513 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563976-2dbcd" Mar 18 12:56:00 crc kubenswrapper[4975]: I0318 12:56:00.964606 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563976-2dbcd"] Mar 18 12:56:01 crc kubenswrapper[4975]: I0318 12:56:01.635530 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563976-2dbcd" event={"ID":"ccb66d3e-3be8-4208-8e0b-0fbc50cf26ea","Type":"ContainerStarted","Data":"3776ed626b067658a64897b8d73140d5898de0c4090a45d203f2bc27083879b7"} Mar 18 12:56:02 crc kubenswrapper[4975]: I0318 12:56:02.644307 4975 generic.go:334] "Generic (PLEG): container finished" podID="ccb66d3e-3be8-4208-8e0b-0fbc50cf26ea" containerID="6105825e8f538b7cd6cb7329ba3b6bf68588699dd26992d82765640f2071d46f" exitCode=0 Mar 18 12:56:02 crc kubenswrapper[4975]: I0318 12:56:02.644377 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563976-2dbcd" event={"ID":"ccb66d3e-3be8-4208-8e0b-0fbc50cf26ea","Type":"ContainerDied","Data":"6105825e8f538b7cd6cb7329ba3b6bf68588699dd26992d82765640f2071d46f"} Mar 18 12:56:03 crc kubenswrapper[4975]: I0318 12:56:03.952974 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563976-2dbcd" Mar 18 12:56:04 crc kubenswrapper[4975]: I0318 12:56:04.129665 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6x5x\" (UniqueName: \"kubernetes.io/projected/ccb66d3e-3be8-4208-8e0b-0fbc50cf26ea-kube-api-access-p6x5x\") pod \"ccb66d3e-3be8-4208-8e0b-0fbc50cf26ea\" (UID: \"ccb66d3e-3be8-4208-8e0b-0fbc50cf26ea\") " Mar 18 12:56:04 crc kubenswrapper[4975]: I0318 12:56:04.136444 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccb66d3e-3be8-4208-8e0b-0fbc50cf26ea-kube-api-access-p6x5x" (OuterVolumeSpecName: "kube-api-access-p6x5x") pod "ccb66d3e-3be8-4208-8e0b-0fbc50cf26ea" (UID: "ccb66d3e-3be8-4208-8e0b-0fbc50cf26ea"). InnerVolumeSpecName "kube-api-access-p6x5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:56:04 crc kubenswrapper[4975]: I0318 12:56:04.231956 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6x5x\" (UniqueName: \"kubernetes.io/projected/ccb66d3e-3be8-4208-8e0b-0fbc50cf26ea-kube-api-access-p6x5x\") on node \"crc\" DevicePath \"\"" Mar 18 12:56:04 crc kubenswrapper[4975]: I0318 12:56:04.665240 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563976-2dbcd" event={"ID":"ccb66d3e-3be8-4208-8e0b-0fbc50cf26ea","Type":"ContainerDied","Data":"3776ed626b067658a64897b8d73140d5898de0c4090a45d203f2bc27083879b7"} Mar 18 12:56:04 crc kubenswrapper[4975]: I0318 12:56:04.665318 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3776ed626b067658a64897b8d73140d5898de0c4090a45d203f2bc27083879b7" Mar 18 12:56:04 crc kubenswrapper[4975]: I0318 12:56:04.665335 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563976-2dbcd" Mar 18 12:56:05 crc kubenswrapper[4975]: I0318 12:56:05.041511 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563970-c77tz"] Mar 18 12:56:05 crc kubenswrapper[4975]: I0318 12:56:05.051628 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563970-c77tz"] Mar 18 12:56:07 crc kubenswrapper[4975]: I0318 12:56:07.043944 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d98388e-202d-4034-bd27-b8e4b4ebfebc" path="/var/lib/kubelet/pods/3d98388e-202d-4034-bd27-b8e4b4ebfebc/volumes" Mar 18 12:56:25 crc kubenswrapper[4975]: I0318 12:56:25.539030 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 12:56:25 crc kubenswrapper[4975]: I0318 12:56:25.539734 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 12:56:25 crc kubenswrapper[4975]: I0318 12:56:25.539803 4975 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 12:56:25 crc kubenswrapper[4975]: I0318 12:56:25.540384 4975 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ffa675b5d15e50ee5b5a9fe85b51627382b6e36a8ee201962ea64a4c8b81d353"} pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 12:56:25 crc kubenswrapper[4975]: I0318 12:56:25.540458 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" containerID="cri-o://ffa675b5d15e50ee5b5a9fe85b51627382b6e36a8ee201962ea64a4c8b81d353" gracePeriod=600 Mar 18 12:56:25 crc kubenswrapper[4975]: I0318 12:56:25.892167 4975 generic.go:334] "Generic (PLEG): container finished" podID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerID="ffa675b5d15e50ee5b5a9fe85b51627382b6e36a8ee201962ea64a4c8b81d353" exitCode=0 Mar 18 12:56:25 crc kubenswrapper[4975]: I0318 12:56:25.892267 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerDied","Data":"ffa675b5d15e50ee5b5a9fe85b51627382b6e36a8ee201962ea64a4c8b81d353"} Mar 18 12:56:25 crc kubenswrapper[4975]: I0318 12:56:25.892439 4975 scope.go:117] "RemoveContainer" containerID="282c0d2edce48a696b8f2f552d9912c1b1cb20ec612f4adca385469f9794e2d3" Mar 18 12:56:26 crc kubenswrapper[4975]: I0318 12:56:26.904056 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerStarted","Data":"2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46"} Mar 18 12:56:37 crc kubenswrapper[4975]: I0318 12:56:37.002567 4975 generic.go:334] "Generic (PLEG): container finished" podID="06a6be48-3e59-499f-b3aa-f0a6f9bbe812" containerID="eefe0e4fc8bd21a8b1842de312a074e740549893b41de3c3d34c086c58115275" exitCode=2 Mar 18 12:56:37 crc kubenswrapper[4975]: I0318 12:56:37.002640 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9" event={"ID":"06a6be48-3e59-499f-b3aa-f0a6f9bbe812","Type":"ContainerDied","Data":"eefe0e4fc8bd21a8b1842de312a074e740549893b41de3c3d34c086c58115275"} Mar 18 12:56:40 crc kubenswrapper[4975]: I0318 12:56:40.224267 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9" Mar 18 12:56:40 crc kubenswrapper[4975]: I0318 12:56:40.325820 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-ssh-key-openstack-edpm-ipam\") pod \"06a6be48-3e59-499f-b3aa-f0a6f9bbe812\" (UID: \"06a6be48-3e59-499f-b3aa-f0a6f9bbe812\") " Mar 18 12:56:40 crc kubenswrapper[4975]: I0318 12:56:40.325984 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-inventory\") pod \"06a6be48-3e59-499f-b3aa-f0a6f9bbe812\" (UID: \"06a6be48-3e59-499f-b3aa-f0a6f9bbe812\") " Mar 18 12:56:40 crc kubenswrapper[4975]: I0318 12:56:40.326111 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfvds\" (UniqueName: \"kubernetes.io/projected/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-kube-api-access-tfvds\") pod \"06a6be48-3e59-499f-b3aa-f0a6f9bbe812\" (UID: \"06a6be48-3e59-499f-b3aa-f0a6f9bbe812\") " Mar 18 12:56:40 crc kubenswrapper[4975]: I0318 12:56:40.326189 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-libvirt-secret-0\") pod \"06a6be48-3e59-499f-b3aa-f0a6f9bbe812\" (UID: \"06a6be48-3e59-499f-b3aa-f0a6f9bbe812\") " Mar 18 12:56:40 crc kubenswrapper[4975]: I0318 12:56:40.326229 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-libvirt-combined-ca-bundle\") pod \"06a6be48-3e59-499f-b3aa-f0a6f9bbe812\" (UID: \"06a6be48-3e59-499f-b3aa-f0a6f9bbe812\") " Mar 18 12:56:40 crc kubenswrapper[4975]: I0318 12:56:40.332898 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-kube-api-access-tfvds" (OuterVolumeSpecName: "kube-api-access-tfvds") pod "06a6be48-3e59-499f-b3aa-f0a6f9bbe812" (UID: "06a6be48-3e59-499f-b3aa-f0a6f9bbe812"). InnerVolumeSpecName "kube-api-access-tfvds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 12:56:40 crc kubenswrapper[4975]: I0318 12:56:40.334124 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "06a6be48-3e59-499f-b3aa-f0a6f9bbe812" (UID: "06a6be48-3e59-499f-b3aa-f0a6f9bbe812"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:56:40 crc kubenswrapper[4975]: I0318 12:56:40.356753 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-inventory" (OuterVolumeSpecName: "inventory") pod "06a6be48-3e59-499f-b3aa-f0a6f9bbe812" (UID: "06a6be48-3e59-499f-b3aa-f0a6f9bbe812"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:56:40 crc kubenswrapper[4975]: I0318 12:56:40.359473 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "06a6be48-3e59-499f-b3aa-f0a6f9bbe812" (UID: "06a6be48-3e59-499f-b3aa-f0a6f9bbe812"). 
InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:56:40 crc kubenswrapper[4975]: I0318 12:56:40.376080 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "06a6be48-3e59-499f-b3aa-f0a6f9bbe812" (UID: "06a6be48-3e59-499f-b3aa-f0a6f9bbe812"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 12:56:40 crc kubenswrapper[4975]: I0318 12:56:40.429184 4975 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 12:56:40 crc kubenswrapper[4975]: I0318 12:56:40.429218 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfvds\" (UniqueName: \"kubernetes.io/projected/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-kube-api-access-tfvds\") on node \"crc\" DevicePath \"\"" Mar 18 12:56:40 crc kubenswrapper[4975]: I0318 12:56:40.429229 4975 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 18 12:56:40 crc kubenswrapper[4975]: I0318 12:56:40.429239 4975 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 12:56:40 crc kubenswrapper[4975]: I0318 12:56:40.429248 4975 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/06a6be48-3e59-499f-b3aa-f0a6f9bbe812-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 12:56:41 crc kubenswrapper[4975]: I0318 12:56:41.036594 4975 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9" event={"ID":"06a6be48-3e59-499f-b3aa-f0a6f9bbe812","Type":"ContainerDied","Data":"4d9844a2fd7b0cf4d88afde4c3ae3edf8df5b64f76ef24c061c73da91d551220"} Mar 18 12:56:41 crc kubenswrapper[4975]: I0318 12:56:41.036910 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d9844a2fd7b0cf4d88afde4c3ae3edf8df5b64f76ef24c061c73da91d551220" Mar 18 12:56:41 crc kubenswrapper[4975]: I0318 12:56:41.036700 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9" Mar 18 12:56:47 crc kubenswrapper[4975]: I0318 12:56:47.879217 4975 scope.go:117] "RemoveContainer" containerID="b4eacdbd7bacb7dc4379f35d520bb5e7d9c4da8bb2ad5589744766acb18c7e36" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.032036 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc"] Mar 18 12:56:56 crc kubenswrapper[4975]: E0318 12:56:56.033076 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06a6be48-3e59-499f-b3aa-f0a6f9bbe812" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.033096 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="06a6be48-3e59-499f-b3aa-f0a6f9bbe812" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 12:56:56 crc kubenswrapper[4975]: E0318 12:56:56.033117 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb66d3e-3be8-4208-8e0b-0fbc50cf26ea" containerName="oc" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.033125 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb66d3e-3be8-4208-8e0b-0fbc50cf26ea" containerName="oc" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.033372 4975 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="06a6be48-3e59-499f-b3aa-f0a6f9bbe812" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.033400 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccb66d3e-3be8-4208-8e0b-0fbc50cf26ea" containerName="oc" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.034221 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.045293 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc"] Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.046587 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.046809 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.046861 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.046955 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.047067 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-77rz6" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.220777 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc\" (UID: \"b1c9f3d9-ff18-40c6-81b4-9eed81219d55\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.220908 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc\" (UID: \"b1c9f3d9-ff18-40c6-81b4-9eed81219d55\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.221064 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp9zj\" (UniqueName: \"kubernetes.io/projected/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-kube-api-access-tp9zj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc\" (UID: \"b1c9f3d9-ff18-40c6-81b4-9eed81219d55\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.221163 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc\" (UID: \"b1c9f3d9-ff18-40c6-81b4-9eed81219d55\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.221534 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc\" (UID: \"b1c9f3d9-ff18-40c6-81b4-9eed81219d55\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.323296 4975 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc\" (UID: \"b1c9f3d9-ff18-40c6-81b4-9eed81219d55\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.323343 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc\" (UID: \"b1c9f3d9-ff18-40c6-81b4-9eed81219d55\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.323389 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc\" (UID: \"b1c9f3d9-ff18-40c6-81b4-9eed81219d55\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.323419 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp9zj\" (UniqueName: \"kubernetes.io/projected/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-kube-api-access-tp9zj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc\" (UID: \"b1c9f3d9-ff18-40c6-81b4-9eed81219d55\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.323453 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-ssh-key-openstack-edpm-ipam\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc\" (UID: \"b1c9f3d9-ff18-40c6-81b4-9eed81219d55\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.329530 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc\" (UID: \"b1c9f3d9-ff18-40c6-81b4-9eed81219d55\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.329905 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc\" (UID: \"b1c9f3d9-ff18-40c6-81b4-9eed81219d55\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.330334 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc\" (UID: \"b1c9f3d9-ff18-40c6-81b4-9eed81219d55\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.330667 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc\" (UID: \"b1c9f3d9-ff18-40c6-81b4-9eed81219d55\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.340081 4975 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tp9zj\" (UniqueName: \"kubernetes.io/projected/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-kube-api-access-tp9zj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc\" (UID: \"b1c9f3d9-ff18-40c6-81b4-9eed81219d55\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.427227 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc" Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.942446 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc"] Mar 18 12:56:56 crc kubenswrapper[4975]: W0318 12:56:56.952991 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1c9f3d9_ff18_40c6_81b4_9eed81219d55.slice/crio-6707026450fc3a302619ab9dae40dacba6883e4ebe43aea61ab55445ebdea978 WatchSource:0}: Error finding container 6707026450fc3a302619ab9dae40dacba6883e4ebe43aea61ab55445ebdea978: Status 404 returned error can't find the container with id 6707026450fc3a302619ab9dae40dacba6883e4ebe43aea61ab55445ebdea978 Mar 18 12:56:56 crc kubenswrapper[4975]: I0318 12:56:56.954807 4975 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 12:56:57 crc kubenswrapper[4975]: I0318 12:56:57.204238 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc" event={"ID":"b1c9f3d9-ff18-40c6-81b4-9eed81219d55","Type":"ContainerStarted","Data":"6707026450fc3a302619ab9dae40dacba6883e4ebe43aea61ab55445ebdea978"} Mar 18 12:56:58 crc kubenswrapper[4975]: I0318 12:56:58.213117 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc" 
event={"ID":"b1c9f3d9-ff18-40c6-81b4-9eed81219d55","Type":"ContainerStarted","Data":"0b95601e858b20ea4cfd8b4ddc27cf2b94af595dc3b5c170894c3fafc9e0b197"} Mar 18 12:58:00 crc kubenswrapper[4975]: I0318 12:58:00.169968 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc" podStartSLOduration=63.990799647 podStartE2EDuration="1m4.169938005s" podCreationTimestamp="2026-03-18 12:56:56 +0000 UTC" firstStartedPulling="2026-03-18 12:56:56.954532304 +0000 UTC m=+2802.668932883" lastFinishedPulling="2026-03-18 12:56:57.133670662 +0000 UTC m=+2802.848071241" observedRunningTime="2026-03-18 12:56:58.232838293 +0000 UTC m=+2803.947238892" watchObservedRunningTime="2026-03-18 12:58:00.169938005 +0000 UTC m=+2865.884338584" Mar 18 12:58:00 crc kubenswrapper[4975]: I0318 12:58:00.174709 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563978-5g5qb"] Mar 18 12:58:00 crc kubenswrapper[4975]: I0318 12:58:00.177661 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563978-5g5qb"
Mar 18 12:58:00 crc kubenswrapper[4975]: I0318 12:58:00.180753 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 12:58:00 crc kubenswrapper[4975]: I0318 12:58:00.180755 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 12:58:00 crc kubenswrapper[4975]: I0318 12:58:00.181465 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8"
Mar 18 12:58:00 crc kubenswrapper[4975]: I0318 12:58:00.187608 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563978-5g5qb"]
Mar 18 12:58:00 crc kubenswrapper[4975]: I0318 12:58:00.318946 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw76t\" (UniqueName: \"kubernetes.io/projected/5e9aebc1-ef2c-497e-97e1-793bdbacd425-kube-api-access-lw76t\") pod \"auto-csr-approver-29563978-5g5qb\" (UID: \"5e9aebc1-ef2c-497e-97e1-793bdbacd425\") " pod="openshift-infra/auto-csr-approver-29563978-5g5qb"
Mar 18 12:58:00 crc kubenswrapper[4975]: I0318 12:58:00.420748 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw76t\" (UniqueName: \"kubernetes.io/projected/5e9aebc1-ef2c-497e-97e1-793bdbacd425-kube-api-access-lw76t\") pod \"auto-csr-approver-29563978-5g5qb\" (UID: \"5e9aebc1-ef2c-497e-97e1-793bdbacd425\") " pod="openshift-infra/auto-csr-approver-29563978-5g5qb"
Mar 18 12:58:00 crc kubenswrapper[4975]: I0318 12:58:00.439844 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw76t\" (UniqueName: \"kubernetes.io/projected/5e9aebc1-ef2c-497e-97e1-793bdbacd425-kube-api-access-lw76t\") pod \"auto-csr-approver-29563978-5g5qb\" (UID: \"5e9aebc1-ef2c-497e-97e1-793bdbacd425\") " pod="openshift-infra/auto-csr-approver-29563978-5g5qb"
Mar 18 12:58:00 crc kubenswrapper[4975]: I0318 12:58:00.502091 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563978-5g5qb"
Mar 18 12:58:00 crc kubenswrapper[4975]: I0318 12:58:00.955804 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563978-5g5qb"]
Mar 18 12:58:01 crc kubenswrapper[4975]: I0318 12:58:01.838260 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563978-5g5qb" event={"ID":"5e9aebc1-ef2c-497e-97e1-793bdbacd425","Type":"ContainerStarted","Data":"f6cca3c05530b768fdb7a97bb481411014aeef0f998165d122cc21c1a107533e"}
Mar 18 12:58:02 crc kubenswrapper[4975]: I0318 12:58:02.848439 4975 generic.go:334] "Generic (PLEG): container finished" podID="5e9aebc1-ef2c-497e-97e1-793bdbacd425" containerID="bdb336c5a5a94a634be016759623a8a39151e69be61cf4e069d329bcc93d7fc9" exitCode=0
Mar 18 12:58:02 crc kubenswrapper[4975]: I0318 12:58:02.848493 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563978-5g5qb" event={"ID":"5e9aebc1-ef2c-497e-97e1-793bdbacd425","Type":"ContainerDied","Data":"bdb336c5a5a94a634be016759623a8a39151e69be61cf4e069d329bcc93d7fc9"}
Mar 18 12:58:04 crc kubenswrapper[4975]: I0318 12:58:04.299057 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563978-5g5qb"
Mar 18 12:58:04 crc kubenswrapper[4975]: I0318 12:58:04.422797 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw76t\" (UniqueName: \"kubernetes.io/projected/5e9aebc1-ef2c-497e-97e1-793bdbacd425-kube-api-access-lw76t\") pod \"5e9aebc1-ef2c-497e-97e1-793bdbacd425\" (UID: \"5e9aebc1-ef2c-497e-97e1-793bdbacd425\") "
Mar 18 12:58:04 crc kubenswrapper[4975]: I0318 12:58:04.434505 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e9aebc1-ef2c-497e-97e1-793bdbacd425-kube-api-access-lw76t" (OuterVolumeSpecName: "kube-api-access-lw76t") pod "5e9aebc1-ef2c-497e-97e1-793bdbacd425" (UID: "5e9aebc1-ef2c-497e-97e1-793bdbacd425"). InnerVolumeSpecName "kube-api-access-lw76t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:58:04 crc kubenswrapper[4975]: I0318 12:58:04.526518 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw76t\" (UniqueName: \"kubernetes.io/projected/5e9aebc1-ef2c-497e-97e1-793bdbacd425-kube-api-access-lw76t\") on node \"crc\" DevicePath \"\""
Mar 18 12:58:04 crc kubenswrapper[4975]: I0318 12:58:04.874762 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563978-5g5qb" event={"ID":"5e9aebc1-ef2c-497e-97e1-793bdbacd425","Type":"ContainerDied","Data":"f6cca3c05530b768fdb7a97bb481411014aeef0f998165d122cc21c1a107533e"}
Mar 18 12:58:04 crc kubenswrapper[4975]: I0318 12:58:04.874837 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6cca3c05530b768fdb7a97bb481411014aeef0f998165d122cc21c1a107533e"
Mar 18 12:58:04 crc kubenswrapper[4975]: I0318 12:58:04.874947 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563978-5g5qb"
Mar 18 12:58:05 crc kubenswrapper[4975]: I0318 12:58:05.384939 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563972-s4w58"]
Mar 18 12:58:05 crc kubenswrapper[4975]: I0318 12:58:05.395250 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563972-s4w58"]
Mar 18 12:58:07 crc kubenswrapper[4975]: I0318 12:58:07.026954 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d350b93-c744-4449-bb20-35210bb3a1f2" path="/var/lib/kubelet/pods/9d350b93-c744-4449-bb20-35210bb3a1f2/volumes"
Mar 18 12:58:25 crc kubenswrapper[4975]: I0318 12:58:25.539130 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 12:58:25 crc kubenswrapper[4975]: I0318 12:58:25.539802 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 12:58:38 crc kubenswrapper[4975]: I0318 12:58:38.227714 4975 generic.go:334] "Generic (PLEG): container finished" podID="b1c9f3d9-ff18-40c6-81b4-9eed81219d55" containerID="0b95601e858b20ea4cfd8b4ddc27cf2b94af595dc3b5c170894c3fafc9e0b197" exitCode=2
Mar 18 12:58:38 crc kubenswrapper[4975]: I0318 12:58:38.227816 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc" event={"ID":"b1c9f3d9-ff18-40c6-81b4-9eed81219d55","Type":"ContainerDied","Data":"0b95601e858b20ea4cfd8b4ddc27cf2b94af595dc3b5c170894c3fafc9e0b197"}
Mar 18 12:58:39 crc kubenswrapper[4975]: I0318 12:58:39.657369 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc"
Mar 18 12:58:39 crc kubenswrapper[4975]: I0318 12:58:39.849788 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-ssh-key-openstack-edpm-ipam\") pod \"b1c9f3d9-ff18-40c6-81b4-9eed81219d55\" (UID: \"b1c9f3d9-ff18-40c6-81b4-9eed81219d55\") "
Mar 18 12:58:39 crc kubenswrapper[4975]: I0318 12:58:39.849945 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp9zj\" (UniqueName: \"kubernetes.io/projected/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-kube-api-access-tp9zj\") pod \"b1c9f3d9-ff18-40c6-81b4-9eed81219d55\" (UID: \"b1c9f3d9-ff18-40c6-81b4-9eed81219d55\") "
Mar 18 12:58:39 crc kubenswrapper[4975]: I0318 12:58:39.850049 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-libvirt-combined-ca-bundle\") pod \"b1c9f3d9-ff18-40c6-81b4-9eed81219d55\" (UID: \"b1c9f3d9-ff18-40c6-81b4-9eed81219d55\") "
Mar 18 12:58:39 crc kubenswrapper[4975]: I0318 12:58:39.850161 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-inventory\") pod \"b1c9f3d9-ff18-40c6-81b4-9eed81219d55\" (UID: \"b1c9f3d9-ff18-40c6-81b4-9eed81219d55\") "
Mar 18 12:58:39 crc kubenswrapper[4975]: I0318 12:58:39.850192 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-libvirt-secret-0\") pod \"b1c9f3d9-ff18-40c6-81b4-9eed81219d55\" (UID: \"b1c9f3d9-ff18-40c6-81b4-9eed81219d55\") "
Mar 18 12:58:39 crc kubenswrapper[4975]: I0318 12:58:39.856512 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "b1c9f3d9-ff18-40c6-81b4-9eed81219d55" (UID: "b1c9f3d9-ff18-40c6-81b4-9eed81219d55"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:58:39 crc kubenswrapper[4975]: I0318 12:58:39.865156 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-kube-api-access-tp9zj" (OuterVolumeSpecName: "kube-api-access-tp9zj") pod "b1c9f3d9-ff18-40c6-81b4-9eed81219d55" (UID: "b1c9f3d9-ff18-40c6-81b4-9eed81219d55"). InnerVolumeSpecName "kube-api-access-tp9zj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 12:58:39 crc kubenswrapper[4975]: I0318 12:58:39.884262 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-inventory" (OuterVolumeSpecName: "inventory") pod "b1c9f3d9-ff18-40c6-81b4-9eed81219d55" (UID: "b1c9f3d9-ff18-40c6-81b4-9eed81219d55"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:58:39 crc kubenswrapper[4975]: I0318 12:58:39.886212 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "b1c9f3d9-ff18-40c6-81b4-9eed81219d55" (UID: "b1c9f3d9-ff18-40c6-81b4-9eed81219d55"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:58:39 crc kubenswrapper[4975]: I0318 12:58:39.886582 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b1c9f3d9-ff18-40c6-81b4-9eed81219d55" (UID: "b1c9f3d9-ff18-40c6-81b4-9eed81219d55"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 12:58:39 crc kubenswrapper[4975]: I0318 12:58:39.952764 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp9zj\" (UniqueName: \"kubernetes.io/projected/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-kube-api-access-tp9zj\") on node \"crc\" DevicePath \"\""
Mar 18 12:58:39 crc kubenswrapper[4975]: I0318 12:58:39.952806 4975 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 12:58:39 crc kubenswrapper[4975]: I0318 12:58:39.952820 4975 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-inventory\") on node \"crc\" DevicePath \"\""
Mar 18 12:58:39 crc kubenswrapper[4975]: I0318 12:58:39.952836 4975 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Mar 18 12:58:39 crc kubenswrapper[4975]: I0318 12:58:39.952852 4975 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1c9f3d9-ff18-40c6-81b4-9eed81219d55-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 18 12:58:40 crc kubenswrapper[4975]: I0318 12:58:40.251319 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc" event={"ID":"b1c9f3d9-ff18-40c6-81b4-9eed81219d55","Type":"ContainerDied","Data":"6707026450fc3a302619ab9dae40dacba6883e4ebe43aea61ab55445ebdea978"}
Mar 18 12:58:40 crc kubenswrapper[4975]: I0318 12:58:40.251365 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6707026450fc3a302619ab9dae40dacba6883e4ebe43aea61ab55445ebdea978"
Mar 18 12:58:40 crc kubenswrapper[4975]: I0318 12:58:40.251393 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc"
Mar 18 12:58:48 crc kubenswrapper[4975]: I0318 12:58:48.002349 4975 scope.go:117] "RemoveContainer" containerID="5ce72b4fd6712fc1dddf3ba25c70fa2c32076464941898322f14dfb1a7696560"
Mar 18 12:58:55 crc kubenswrapper[4975]: I0318 12:58:55.538690 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 12:58:55 crc kubenswrapper[4975]: I0318 12:58:55.539443 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.041536 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29"]
Mar 18 12:59:17 crc kubenswrapper[4975]: E0318 12:59:17.042662 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1c9f3d9-ff18-40c6-81b4-9eed81219d55" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.042681 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1c9f3d9-ff18-40c6-81b4-9eed81219d55" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 18 12:59:17 crc kubenswrapper[4975]: E0318 12:59:17.042693 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9aebc1-ef2c-497e-97e1-793bdbacd425" containerName="oc"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.042700 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9aebc1-ef2c-497e-97e1-793bdbacd425" containerName="oc"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.042952 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1c9f3d9-ff18-40c6-81b4-9eed81219d55" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.042981 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e9aebc1-ef2c-497e-97e1-793bdbacd425" containerName="oc"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.043806 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.046103 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.046152 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.047740 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.048692 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.049032 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-77rz6"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.059502 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29"]
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.152728 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec93f0f6-3753-4db2-a239-859abb202fdc-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vlk29\" (UID: \"ec93f0f6-3753-4db2-a239-859abb202fdc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.153135 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ec93f0f6-3753-4db2-a239-859abb202fdc-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vlk29\" (UID: \"ec93f0f6-3753-4db2-a239-859abb202fdc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.153333 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec93f0f6-3753-4db2-a239-859abb202fdc-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vlk29\" (UID: \"ec93f0f6-3753-4db2-a239-859abb202fdc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.153520 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgmhn\" (UniqueName: \"kubernetes.io/projected/ec93f0f6-3753-4db2-a239-859abb202fdc-kube-api-access-wgmhn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vlk29\" (UID: \"ec93f0f6-3753-4db2-a239-859abb202fdc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.153703 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec93f0f6-3753-4db2-a239-859abb202fdc-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vlk29\" (UID: \"ec93f0f6-3753-4db2-a239-859abb202fdc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.255389 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec93f0f6-3753-4db2-a239-859abb202fdc-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vlk29\" (UID: \"ec93f0f6-3753-4db2-a239-859abb202fdc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.255532 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec93f0f6-3753-4db2-a239-859abb202fdc-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vlk29\" (UID: \"ec93f0f6-3753-4db2-a239-859abb202fdc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.255668 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ec93f0f6-3753-4db2-a239-859abb202fdc-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vlk29\" (UID: \"ec93f0f6-3753-4db2-a239-859abb202fdc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.255752 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec93f0f6-3753-4db2-a239-859abb202fdc-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vlk29\" (UID: \"ec93f0f6-3753-4db2-a239-859abb202fdc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.255912 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgmhn\" (UniqueName: \"kubernetes.io/projected/ec93f0f6-3753-4db2-a239-859abb202fdc-kube-api-access-wgmhn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vlk29\" (UID: \"ec93f0f6-3753-4db2-a239-859abb202fdc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.263191 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec93f0f6-3753-4db2-a239-859abb202fdc-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vlk29\" (UID: \"ec93f0f6-3753-4db2-a239-859abb202fdc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.264047 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec93f0f6-3753-4db2-a239-859abb202fdc-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vlk29\" (UID: \"ec93f0f6-3753-4db2-a239-859abb202fdc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.264705 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ec93f0f6-3753-4db2-a239-859abb202fdc-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vlk29\" (UID: \"ec93f0f6-3753-4db2-a239-859abb202fdc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.265627 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec93f0f6-3753-4db2-a239-859abb202fdc-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vlk29\" (UID: \"ec93f0f6-3753-4db2-a239-859abb202fdc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.300127 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgmhn\" (UniqueName: \"kubernetes.io/projected/ec93f0f6-3753-4db2-a239-859abb202fdc-kube-api-access-wgmhn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vlk29\" (UID: \"ec93f0f6-3753-4db2-a239-859abb202fdc\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29"
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.364722 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29"
Mar 18 12:59:17 crc kubenswrapper[4975]: W0318 12:59:17.929448 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec93f0f6_3753_4db2_a239_859abb202fdc.slice/crio-6a3cabf9a00c53c688d59ad0b504687949d9cde8b6eef950ac4463297788ab3e WatchSource:0}: Error finding container 6a3cabf9a00c53c688d59ad0b504687949d9cde8b6eef950ac4463297788ab3e: Status 404 returned error can't find the container with id 6a3cabf9a00c53c688d59ad0b504687949d9cde8b6eef950ac4463297788ab3e
Mar 18 12:59:17 crc kubenswrapper[4975]: I0318 12:59:17.931903 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29"]
Mar 18 12:59:18 crc kubenswrapper[4975]: I0318 12:59:18.700743 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29" event={"ID":"ec93f0f6-3753-4db2-a239-859abb202fdc","Type":"ContainerStarted","Data":"621dba3c4c975b5f7be39f6ec7b83d7f9b7809aa5b413fc05f0e423354318ddb"}
Mar 18 12:59:18 crc kubenswrapper[4975]: I0318 12:59:18.701322 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29" event={"ID":"ec93f0f6-3753-4db2-a239-859abb202fdc","Type":"ContainerStarted","Data":"6a3cabf9a00c53c688d59ad0b504687949d9cde8b6eef950ac4463297788ab3e"}
Mar 18 12:59:18 crc kubenswrapper[4975]: I0318 12:59:18.732214 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29" podStartSLOduration=1.511837799 podStartE2EDuration="1.732187297s" podCreationTimestamp="2026-03-18 12:59:17 +0000 UTC" firstStartedPulling="2026-03-18 12:59:17.934298137 +0000 UTC m=+2943.648698756" lastFinishedPulling="2026-03-18 12:59:18.154647655 +0000 UTC m=+2943.869048254" observedRunningTime="2026-03-18 12:59:18.723736137 +0000 UTC m=+2944.438136736" watchObservedRunningTime="2026-03-18 12:59:18.732187297 +0000 UTC m=+2944.446587886"
Mar 18 12:59:25 crc kubenswrapper[4975]: I0318 12:59:25.539628 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 12:59:25 crc kubenswrapper[4975]: I0318 12:59:25.540451 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 12:59:25 crc kubenswrapper[4975]: I0318 12:59:25.540555 4975 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt"
Mar 18 12:59:25 crc kubenswrapper[4975]: I0318 12:59:25.541476 4975 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46"} pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 18 12:59:25 crc kubenswrapper[4975]: I0318 12:59:25.541624 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" containerID="cri-o://2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46" gracePeriod=600
Mar 18 12:59:25 crc kubenswrapper[4975]: E0318 12:59:25.674740 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 12:59:25 crc kubenswrapper[4975]: I0318 12:59:25.772044 4975 generic.go:334] "Generic (PLEG): container finished" podID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerID="2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46" exitCode=0
Mar 18 12:59:25 crc kubenswrapper[4975]: I0318 12:59:25.772116 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerDied","Data":"2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46"}
Mar 18 12:59:25 crc kubenswrapper[4975]: I0318 12:59:25.772158 4975 scope.go:117] "RemoveContainer" containerID="ffa675b5d15e50ee5b5a9fe85b51627382b6e36a8ee201962ea64a4c8b81d353"
Mar 18 12:59:25 crc kubenswrapper[4975]: I0318 12:59:25.773342 4975 scope.go:117] "RemoveContainer" containerID="2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46"
Mar 18 12:59:25 crc kubenswrapper[4975]: E0318 12:59:25.774001 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 12:59:38 crc kubenswrapper[4975]: I0318 12:59:38.017166 4975 scope.go:117] "RemoveContainer" containerID="2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46"
Mar 18 12:59:38 crc kubenswrapper[4975]: E0318 12:59:38.017894 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 12:59:53 crc kubenswrapper[4975]: I0318 12:59:53.016661 4975 scope.go:117] "RemoveContainer" containerID="2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46"
Mar 18 12:59:53 crc kubenswrapper[4975]: E0318 12:59:53.017423 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 13:00:00 crc kubenswrapper[4975]: I0318 13:00:00.190216 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563980-kg8bn"]
Mar 18 13:00:00 crc kubenswrapper[4975]: I0318 13:00:00.192240 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563980-kg8bn"
Mar 18 13:00:00 crc kubenswrapper[4975]: I0318 13:00:00.194878 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 13:00:00 crc kubenswrapper[4975]: I0318 13:00:00.195643 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 13:00:00 crc kubenswrapper[4975]: I0318 13:00:00.197761 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8"
Mar 18 13:00:00 crc kubenswrapper[4975]: I0318 13:00:00.215977 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563980-485pp"]
Mar 18 13:00:00 crc kubenswrapper[4975]: I0318 13:00:00.217419 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-485pp"
Mar 18 13:00:00 crc kubenswrapper[4975]: I0318 13:00:00.224067 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 18 13:00:00 crc kubenswrapper[4975]: I0318 13:00:00.224331 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 18 13:00:00 crc kubenswrapper[4975]: I0318 13:00:00.230010 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563980-485pp"]
Mar 18 13:00:00 crc kubenswrapper[4975]: I0318 13:00:00.240940 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563980-kg8bn"]
Mar 18 13:00:00 crc kubenswrapper[4975]: I0318 13:00:00.299455 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndfd7\" (UniqueName: \"kubernetes.io/projected/35843f4f-db77-4a4b-9a62-173a4884d774-kube-api-access-ndfd7\") pod \"collect-profiles-29563980-485pp\" (UID: \"35843f4f-db77-4a4b-9a62-173a4884d774\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-485pp"
Mar 18 13:00:00 crc kubenswrapper[4975]: I0318 13:00:00.299821 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35843f4f-db77-4a4b-9a62-173a4884d774-config-volume\") pod \"collect-profiles-29563980-485pp\" (UID: \"35843f4f-db77-4a4b-9a62-173a4884d774\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-485pp"
Mar 18 13:00:00 crc kubenswrapper[4975]: I0318 13:00:00.299992 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35843f4f-db77-4a4b-9a62-173a4884d774-secret-volume\") pod \"collect-profiles-29563980-485pp\" (UID: \"35843f4f-db77-4a4b-9a62-173a4884d774\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-485pp"
Mar 18 13:00:00 crc kubenswrapper[4975]: I0318 13:00:00.300092 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfsbv\" (UniqueName: \"kubernetes.io/projected/8c54475a-e898-4247-bcca-111f5072e571-kube-api-access-wfsbv\") pod \"auto-csr-approver-29563980-kg8bn\" (UID: \"8c54475a-e898-4247-bcca-111f5072e571\") " pod="openshift-infra/auto-csr-approver-29563980-kg8bn"
Mar 18 13:00:00 crc kubenswrapper[4975]: I0318 13:00:00.401799 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35843f4f-db77-4a4b-9a62-173a4884d774-secret-volume\") pod \"collect-profiles-29563980-485pp\" (UID: \"35843f4f-db77-4a4b-9a62-173a4884d774\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-485pp"
Mar 18 13:00:00 crc kubenswrapper[4975]: I0318 13:00:00.401926 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfsbv\" (UniqueName: \"kubernetes.io/projected/8c54475a-e898-4247-bcca-111f5072e571-kube-api-access-wfsbv\") pod \"auto-csr-approver-29563980-kg8bn\" (UID: \"8c54475a-e898-4247-bcca-111f5072e571\") " pod="openshift-infra/auto-csr-approver-29563980-kg8bn"
Mar 18 13:00:00 crc kubenswrapper[4975]: I0318 13:00:00.402009 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndfd7\" (UniqueName: \"kubernetes.io/projected/35843f4f-db77-4a4b-9a62-173a4884d774-kube-api-access-ndfd7\") pod \"collect-profiles-29563980-485pp\" (UID: \"35843f4f-db77-4a4b-9a62-173a4884d774\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-485pp"
Mar 18 13:00:00 crc kubenswrapper[4975]: I0318 13:00:00.402038 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35843f4f-db77-4a4b-9a62-173a4884d774-config-volume\") pod \"collect-profiles-29563980-485pp\" (UID: \"35843f4f-db77-4a4b-9a62-173a4884d774\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-485pp"
Mar 18 13:00:00 crc kubenswrapper[4975]: I0318 13:00:00.403036 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35843f4f-db77-4a4b-9a62-173a4884d774-config-volume\") pod \"collect-profiles-29563980-485pp\" (UID: \"35843f4f-db77-4a4b-9a62-173a4884d774\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-485pp"
Mar 18 13:00:00 crc kubenswrapper[4975]: I0318 13:00:00.410170 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35843f4f-db77-4a4b-9a62-173a4884d774-secret-volume\") pod \"collect-profiles-29563980-485pp\" (UID: \"35843f4f-db77-4a4b-9a62-173a4884d774\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-485pp"
Mar 18 13:00:00 crc kubenswrapper[4975]: I0318 13:00:00.422597 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndfd7\" (UniqueName: \"kubernetes.io/projected/35843f4f-db77-4a4b-9a62-173a4884d774-kube-api-access-ndfd7\") pod \"collect-profiles-29563980-485pp\" (UID: \"35843f4f-db77-4a4b-9a62-173a4884d774\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-485pp"
Mar 18 13:00:00 crc kubenswrapper[4975]: I0318 13:00:00.423114 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfsbv\" (UniqueName: \"kubernetes.io/projected/8c54475a-e898-4247-bcca-111f5072e571-kube-api-access-wfsbv\") pod \"auto-csr-approver-29563980-kg8bn\" (UID: \"8c54475a-e898-4247-bcca-111f5072e571\") " pod="openshift-infra/auto-csr-approver-29563980-kg8bn"
Mar 18 13:00:00 crc kubenswrapper[4975]: I0318 13:00:00.520357 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563980-kg8bn"
Mar 18 13:00:00 crc kubenswrapper[4975]: I0318 13:00:00.546458 4975 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-485pp" Mar 18 13:00:01 crc kubenswrapper[4975]: I0318 13:00:01.052306 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563980-kg8bn"] Mar 18 13:00:01 crc kubenswrapper[4975]: I0318 13:00:01.116355 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563980-485pp"] Mar 18 13:00:01 crc kubenswrapper[4975]: I0318 13:00:01.193475 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563980-kg8bn" event={"ID":"8c54475a-e898-4247-bcca-111f5072e571","Type":"ContainerStarted","Data":"b8770929fdcbc6c6a1d13fcf9ce56b35e09eae5e3f1ec354c8c572387aaa683b"} Mar 18 13:00:01 crc kubenswrapper[4975]: I0318 13:00:01.196274 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-485pp" event={"ID":"35843f4f-db77-4a4b-9a62-173a4884d774","Type":"ContainerStarted","Data":"53035ca93b41a5fba5929408f7699511e4c46c2b7b5044b6c0522da9b072a2f7"} Mar 18 13:00:02 crc kubenswrapper[4975]: I0318 13:00:02.208382 4975 generic.go:334] "Generic (PLEG): container finished" podID="35843f4f-db77-4a4b-9a62-173a4884d774" containerID="00375d6e0b9af6a4bf1ca4fafc2527b475fa0ed8cf7c2c036829780dc71f5136" exitCode=0 Mar 18 13:00:02 crc kubenswrapper[4975]: I0318 13:00:02.208460 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-485pp" event={"ID":"35843f4f-db77-4a4b-9a62-173a4884d774","Type":"ContainerDied","Data":"00375d6e0b9af6a4bf1ca4fafc2527b475fa0ed8cf7c2c036829780dc71f5136"} Mar 18 13:00:03 crc kubenswrapper[4975]: I0318 13:00:03.550990 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-485pp" Mar 18 13:00:03 crc kubenswrapper[4975]: I0318 13:00:03.661705 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35843f4f-db77-4a4b-9a62-173a4884d774-secret-volume\") pod \"35843f4f-db77-4a4b-9a62-173a4884d774\" (UID: \"35843f4f-db77-4a4b-9a62-173a4884d774\") " Mar 18 13:00:03 crc kubenswrapper[4975]: I0318 13:00:03.661830 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35843f4f-db77-4a4b-9a62-173a4884d774-config-volume\") pod \"35843f4f-db77-4a4b-9a62-173a4884d774\" (UID: \"35843f4f-db77-4a4b-9a62-173a4884d774\") " Mar 18 13:00:03 crc kubenswrapper[4975]: I0318 13:00:03.661909 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndfd7\" (UniqueName: \"kubernetes.io/projected/35843f4f-db77-4a4b-9a62-173a4884d774-kube-api-access-ndfd7\") pod \"35843f4f-db77-4a4b-9a62-173a4884d774\" (UID: \"35843f4f-db77-4a4b-9a62-173a4884d774\") " Mar 18 13:00:03 crc kubenswrapper[4975]: I0318 13:00:03.662514 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35843f4f-db77-4a4b-9a62-173a4884d774-config-volume" (OuterVolumeSpecName: "config-volume") pod "35843f4f-db77-4a4b-9a62-173a4884d774" (UID: "35843f4f-db77-4a4b-9a62-173a4884d774"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:00:03 crc kubenswrapper[4975]: I0318 13:00:03.668058 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35843f4f-db77-4a4b-9a62-173a4884d774-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "35843f4f-db77-4a4b-9a62-173a4884d774" (UID: "35843f4f-db77-4a4b-9a62-173a4884d774"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:00:03 crc kubenswrapper[4975]: I0318 13:00:03.668719 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35843f4f-db77-4a4b-9a62-173a4884d774-kube-api-access-ndfd7" (OuterVolumeSpecName: "kube-api-access-ndfd7") pod "35843f4f-db77-4a4b-9a62-173a4884d774" (UID: "35843f4f-db77-4a4b-9a62-173a4884d774"). InnerVolumeSpecName "kube-api-access-ndfd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:00:03 crc kubenswrapper[4975]: I0318 13:00:03.764722 4975 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35843f4f-db77-4a4b-9a62-173a4884d774-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:00:03 crc kubenswrapper[4975]: I0318 13:00:03.764767 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndfd7\" (UniqueName: \"kubernetes.io/projected/35843f4f-db77-4a4b-9a62-173a4884d774-kube-api-access-ndfd7\") on node \"crc\" DevicePath \"\"" Mar 18 13:00:03 crc kubenswrapper[4975]: I0318 13:00:03.764779 4975 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35843f4f-db77-4a4b-9a62-173a4884d774-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:00:04 crc kubenswrapper[4975]: I0318 13:00:04.041353 4975 scope.go:117] "RemoveContainer" containerID="2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46" Mar 18 13:00:04 crc kubenswrapper[4975]: E0318 13:00:04.041598 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" 
Mar 18 13:00:04 crc kubenswrapper[4975]: I0318 13:00:04.227778 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-485pp" event={"ID":"35843f4f-db77-4a4b-9a62-173a4884d774","Type":"ContainerDied","Data":"53035ca93b41a5fba5929408f7699511e4c46c2b7b5044b6c0522da9b072a2f7"} Mar 18 13:00:04 crc kubenswrapper[4975]: I0318 13:00:04.228210 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53035ca93b41a5fba5929408f7699511e4c46c2b7b5044b6c0522da9b072a2f7" Mar 18 13:00:04 crc kubenswrapper[4975]: I0318 13:00:04.227915 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563980-485pp" Mar 18 13:00:04 crc kubenswrapper[4975]: E0318 13:00:04.339305 4975 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35843f4f_db77_4a4b_9a62_173a4884d774.slice\": RecentStats: unable to find data in memory cache]" Mar 18 13:00:04 crc kubenswrapper[4975]: I0318 13:00:04.647773 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563935-48zj8"] Mar 18 13:00:04 crc kubenswrapper[4975]: I0318 13:00:04.658471 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563935-48zj8"] Mar 18 13:00:05 crc kubenswrapper[4975]: I0318 13:00:05.027387 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85462245-ccc9-46f5-8bcb-a2648e9f1488" path="/var/lib/kubelet/pods/85462245-ccc9-46f5-8bcb-a2648e9f1488/volumes" Mar 18 13:00:05 crc kubenswrapper[4975]: I0318 13:00:05.236895 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563980-kg8bn" 
event={"ID":"8c54475a-e898-4247-bcca-111f5072e571","Type":"ContainerStarted","Data":"71f3f594e4ee934fdbde53fec5669ee35feccbb04839b9be53fe40928b76c99f"} Mar 18 13:00:05 crc kubenswrapper[4975]: I0318 13:00:05.254517 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563980-kg8bn" podStartSLOduration=1.490818919 podStartE2EDuration="5.25448873s" podCreationTimestamp="2026-03-18 13:00:00 +0000 UTC" firstStartedPulling="2026-03-18 13:00:01.030670533 +0000 UTC m=+2986.745071112" lastFinishedPulling="2026-03-18 13:00:04.794340334 +0000 UTC m=+2990.508740923" observedRunningTime="2026-03-18 13:00:05.253078932 +0000 UTC m=+2990.967479521" watchObservedRunningTime="2026-03-18 13:00:05.25448873 +0000 UTC m=+2990.968889309" Mar 18 13:00:06 crc kubenswrapper[4975]: I0318 13:00:06.245772 4975 generic.go:334] "Generic (PLEG): container finished" podID="8c54475a-e898-4247-bcca-111f5072e571" containerID="71f3f594e4ee934fdbde53fec5669ee35feccbb04839b9be53fe40928b76c99f" exitCode=0 Mar 18 13:00:06 crc kubenswrapper[4975]: I0318 13:00:06.245850 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563980-kg8bn" event={"ID":"8c54475a-e898-4247-bcca-111f5072e571","Type":"ContainerDied","Data":"71f3f594e4ee934fdbde53fec5669ee35feccbb04839b9be53fe40928b76c99f"} Mar 18 13:00:07 crc kubenswrapper[4975]: I0318 13:00:07.571464 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563980-kg8bn" Mar 18 13:00:07 crc kubenswrapper[4975]: I0318 13:00:07.640602 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfsbv\" (UniqueName: \"kubernetes.io/projected/8c54475a-e898-4247-bcca-111f5072e571-kube-api-access-wfsbv\") pod \"8c54475a-e898-4247-bcca-111f5072e571\" (UID: \"8c54475a-e898-4247-bcca-111f5072e571\") " Mar 18 13:00:07 crc kubenswrapper[4975]: I0318 13:00:07.655206 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c54475a-e898-4247-bcca-111f5072e571-kube-api-access-wfsbv" (OuterVolumeSpecName: "kube-api-access-wfsbv") pod "8c54475a-e898-4247-bcca-111f5072e571" (UID: "8c54475a-e898-4247-bcca-111f5072e571"). InnerVolumeSpecName "kube-api-access-wfsbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:00:07 crc kubenswrapper[4975]: I0318 13:00:07.743126 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfsbv\" (UniqueName: \"kubernetes.io/projected/8c54475a-e898-4247-bcca-111f5072e571-kube-api-access-wfsbv\") on node \"crc\" DevicePath \"\"" Mar 18 13:00:08 crc kubenswrapper[4975]: I0318 13:00:08.095635 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563974-qfqld"] Mar 18 13:00:08 crc kubenswrapper[4975]: I0318 13:00:08.102940 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563974-qfqld"] Mar 18 13:00:08 crc kubenswrapper[4975]: I0318 13:00:08.266149 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563980-kg8bn" event={"ID":"8c54475a-e898-4247-bcca-111f5072e571","Type":"ContainerDied","Data":"b8770929fdcbc6c6a1d13fcf9ce56b35e09eae5e3f1ec354c8c572387aaa683b"} Mar 18 13:00:08 crc kubenswrapper[4975]: I0318 13:00:08.266186 4975 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="b8770929fdcbc6c6a1d13fcf9ce56b35e09eae5e3f1ec354c8c572387aaa683b" Mar 18 13:00:08 crc kubenswrapper[4975]: I0318 13:00:08.266212 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563980-kg8bn" Mar 18 13:00:09 crc kubenswrapper[4975]: I0318 13:00:09.027361 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d1b19c8-648a-4152-937f-93b1a5aea846" path="/var/lib/kubelet/pods/9d1b19c8-648a-4152-937f-93b1a5aea846/volumes" Mar 18 13:00:15 crc kubenswrapper[4975]: I0318 13:00:15.044181 4975 scope.go:117] "RemoveContainer" containerID="2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46" Mar 18 13:00:15 crc kubenswrapper[4975]: E0318 13:00:15.045430 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:00:29 crc kubenswrapper[4975]: I0318 13:00:29.016657 4975 scope.go:117] "RemoveContainer" containerID="2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46" Mar 18 13:00:29 crc kubenswrapper[4975]: E0318 13:00:29.017456 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:00:44 crc kubenswrapper[4975]: I0318 13:00:44.017573 4975 scope.go:117] "RemoveContainer" 
containerID="2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46" Mar 18 13:00:44 crc kubenswrapper[4975]: E0318 13:00:44.018705 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:00:48 crc kubenswrapper[4975]: I0318 13:00:48.130388 4975 scope.go:117] "RemoveContainer" containerID="c17f49e77a8b084025194c527f869359d4e5e79b9fe21442370a2f004d4e9d6f" Mar 18 13:00:48 crc kubenswrapper[4975]: I0318 13:00:48.203841 4975 scope.go:117] "RemoveContainer" containerID="9b76c3ce1af627eabe894e32aa1dee49016d590c39ebd040ec8c20cde28b1604" Mar 18 13:00:57 crc kubenswrapper[4975]: I0318 13:00:57.016701 4975 scope.go:117] "RemoveContainer" containerID="2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46" Mar 18 13:00:57 crc kubenswrapper[4975]: E0318 13:00:57.017463 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:01:00 crc kubenswrapper[4975]: I0318 13:01:00.155604 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29563981-gj4mp"] Mar 18 13:01:00 crc kubenswrapper[4975]: E0318 13:01:00.157263 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35843f4f-db77-4a4b-9a62-173a4884d774" containerName="collect-profiles" Mar 18 13:01:00 crc 
kubenswrapper[4975]: I0318 13:01:00.157305 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="35843f4f-db77-4a4b-9a62-173a4884d774" containerName="collect-profiles" Mar 18 13:01:00 crc kubenswrapper[4975]: E0318 13:01:00.157319 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c54475a-e898-4247-bcca-111f5072e571" containerName="oc" Mar 18 13:01:00 crc kubenswrapper[4975]: I0318 13:01:00.157339 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c54475a-e898-4247-bcca-111f5072e571" containerName="oc" Mar 18 13:01:00 crc kubenswrapper[4975]: I0318 13:01:00.157618 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c54475a-e898-4247-bcca-111f5072e571" containerName="oc" Mar 18 13:01:00 crc kubenswrapper[4975]: I0318 13:01:00.157655 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="35843f4f-db77-4a4b-9a62-173a4884d774" containerName="collect-profiles" Mar 18 13:01:00 crc kubenswrapper[4975]: I0318 13:01:00.158469 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29563981-gj4mp" Mar 18 13:01:00 crc kubenswrapper[4975]: I0318 13:01:00.168293 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29563981-gj4mp"] Mar 18 13:01:00 crc kubenswrapper[4975]: I0318 13:01:00.297476 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b92e883-e662-4970-ab9b-df31247d4cb7-fernet-keys\") pod \"keystone-cron-29563981-gj4mp\" (UID: \"5b92e883-e662-4970-ab9b-df31247d4cb7\") " pod="openstack/keystone-cron-29563981-gj4mp" Mar 18 13:01:00 crc kubenswrapper[4975]: I0318 13:01:00.297544 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b92e883-e662-4970-ab9b-df31247d4cb7-combined-ca-bundle\") pod \"keystone-cron-29563981-gj4mp\" (UID: \"5b92e883-e662-4970-ab9b-df31247d4cb7\") " pod="openstack/keystone-cron-29563981-gj4mp" Mar 18 13:01:00 crc kubenswrapper[4975]: I0318 13:01:00.297578 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgxv7\" (UniqueName: \"kubernetes.io/projected/5b92e883-e662-4970-ab9b-df31247d4cb7-kube-api-access-rgxv7\") pod \"keystone-cron-29563981-gj4mp\" (UID: \"5b92e883-e662-4970-ab9b-df31247d4cb7\") " pod="openstack/keystone-cron-29563981-gj4mp" Mar 18 13:01:00 crc kubenswrapper[4975]: I0318 13:01:00.297770 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b92e883-e662-4970-ab9b-df31247d4cb7-config-data\") pod \"keystone-cron-29563981-gj4mp\" (UID: \"5b92e883-e662-4970-ab9b-df31247d4cb7\") " pod="openstack/keystone-cron-29563981-gj4mp" Mar 18 13:01:00 crc kubenswrapper[4975]: I0318 13:01:00.399228 4975 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b92e883-e662-4970-ab9b-df31247d4cb7-config-data\") pod \"keystone-cron-29563981-gj4mp\" (UID: \"5b92e883-e662-4970-ab9b-df31247d4cb7\") " pod="openstack/keystone-cron-29563981-gj4mp" Mar 18 13:01:00 crc kubenswrapper[4975]: I0318 13:01:00.412817 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b92e883-e662-4970-ab9b-df31247d4cb7-fernet-keys\") pod \"keystone-cron-29563981-gj4mp\" (UID: \"5b92e883-e662-4970-ab9b-df31247d4cb7\") " pod="openstack/keystone-cron-29563981-gj4mp" Mar 18 13:01:00 crc kubenswrapper[4975]: I0318 13:01:00.412948 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b92e883-e662-4970-ab9b-df31247d4cb7-combined-ca-bundle\") pod \"keystone-cron-29563981-gj4mp\" (UID: \"5b92e883-e662-4970-ab9b-df31247d4cb7\") " pod="openstack/keystone-cron-29563981-gj4mp" Mar 18 13:01:00 crc kubenswrapper[4975]: I0318 13:01:00.412991 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgxv7\" (UniqueName: \"kubernetes.io/projected/5b92e883-e662-4970-ab9b-df31247d4cb7-kube-api-access-rgxv7\") pod \"keystone-cron-29563981-gj4mp\" (UID: \"5b92e883-e662-4970-ab9b-df31247d4cb7\") " pod="openstack/keystone-cron-29563981-gj4mp" Mar 18 13:01:00 crc kubenswrapper[4975]: I0318 13:01:00.420568 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b92e883-e662-4970-ab9b-df31247d4cb7-config-data\") pod \"keystone-cron-29563981-gj4mp\" (UID: \"5b92e883-e662-4970-ab9b-df31247d4cb7\") " pod="openstack/keystone-cron-29563981-gj4mp" Mar 18 13:01:00 crc kubenswrapper[4975]: I0318 13:01:00.423541 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/5b92e883-e662-4970-ab9b-df31247d4cb7-fernet-keys\") pod \"keystone-cron-29563981-gj4mp\" (UID: \"5b92e883-e662-4970-ab9b-df31247d4cb7\") " pod="openstack/keystone-cron-29563981-gj4mp" Mar 18 13:01:00 crc kubenswrapper[4975]: I0318 13:01:00.424697 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b92e883-e662-4970-ab9b-df31247d4cb7-combined-ca-bundle\") pod \"keystone-cron-29563981-gj4mp\" (UID: \"5b92e883-e662-4970-ab9b-df31247d4cb7\") " pod="openstack/keystone-cron-29563981-gj4mp" Mar 18 13:01:00 crc kubenswrapper[4975]: I0318 13:01:00.432398 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgxv7\" (UniqueName: \"kubernetes.io/projected/5b92e883-e662-4970-ab9b-df31247d4cb7-kube-api-access-rgxv7\") pod \"keystone-cron-29563981-gj4mp\" (UID: \"5b92e883-e662-4970-ab9b-df31247d4cb7\") " pod="openstack/keystone-cron-29563981-gj4mp" Mar 18 13:01:00 crc kubenswrapper[4975]: I0318 13:01:00.531044 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29563981-gj4mp" Mar 18 13:01:01 crc kubenswrapper[4975]: I0318 13:01:01.092981 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29563981-gj4mp"] Mar 18 13:01:01 crc kubenswrapper[4975]: I0318 13:01:01.137579 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nr24h"] Mar 18 13:01:01 crc kubenswrapper[4975]: I0318 13:01:01.154703 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nr24h"] Mar 18 13:01:01 crc kubenswrapper[4975]: I0318 13:01:01.154802 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nr24h" Mar 18 13:01:01 crc kubenswrapper[4975]: I0318 13:01:01.261048 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-745mj\" (UniqueName: \"kubernetes.io/projected/0d42fee4-b8c7-4c37-9494-cac96a9a8249-kube-api-access-745mj\") pod \"community-operators-nr24h\" (UID: \"0d42fee4-b8c7-4c37-9494-cac96a9a8249\") " pod="openshift-marketplace/community-operators-nr24h" Mar 18 13:01:01 crc kubenswrapper[4975]: I0318 13:01:01.261341 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d42fee4-b8c7-4c37-9494-cac96a9a8249-catalog-content\") pod \"community-operators-nr24h\" (UID: \"0d42fee4-b8c7-4c37-9494-cac96a9a8249\") " pod="openshift-marketplace/community-operators-nr24h" Mar 18 13:01:01 crc kubenswrapper[4975]: I0318 13:01:01.261412 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d42fee4-b8c7-4c37-9494-cac96a9a8249-utilities\") pod \"community-operators-nr24h\" (UID: \"0d42fee4-b8c7-4c37-9494-cac96a9a8249\") " pod="openshift-marketplace/community-operators-nr24h" Mar 18 13:01:01 crc kubenswrapper[4975]: I0318 13:01:01.363114 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-745mj\" (UniqueName: \"kubernetes.io/projected/0d42fee4-b8c7-4c37-9494-cac96a9a8249-kube-api-access-745mj\") pod \"community-operators-nr24h\" (UID: \"0d42fee4-b8c7-4c37-9494-cac96a9a8249\") " pod="openshift-marketplace/community-operators-nr24h" Mar 18 13:01:01 crc kubenswrapper[4975]: I0318 13:01:01.363157 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d42fee4-b8c7-4c37-9494-cac96a9a8249-catalog-content\") pod 
\"community-operators-nr24h\" (UID: \"0d42fee4-b8c7-4c37-9494-cac96a9a8249\") " pod="openshift-marketplace/community-operators-nr24h" Mar 18 13:01:01 crc kubenswrapper[4975]: I0318 13:01:01.363206 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d42fee4-b8c7-4c37-9494-cac96a9a8249-utilities\") pod \"community-operators-nr24h\" (UID: \"0d42fee4-b8c7-4c37-9494-cac96a9a8249\") " pod="openshift-marketplace/community-operators-nr24h" Mar 18 13:01:01 crc kubenswrapper[4975]: I0318 13:01:01.363689 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d42fee4-b8c7-4c37-9494-cac96a9a8249-utilities\") pod \"community-operators-nr24h\" (UID: \"0d42fee4-b8c7-4c37-9494-cac96a9a8249\") " pod="openshift-marketplace/community-operators-nr24h" Mar 18 13:01:01 crc kubenswrapper[4975]: I0318 13:01:01.363909 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d42fee4-b8c7-4c37-9494-cac96a9a8249-catalog-content\") pod \"community-operators-nr24h\" (UID: \"0d42fee4-b8c7-4c37-9494-cac96a9a8249\") " pod="openshift-marketplace/community-operators-nr24h" Mar 18 13:01:01 crc kubenswrapper[4975]: I0318 13:01:01.380787 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-745mj\" (UniqueName: \"kubernetes.io/projected/0d42fee4-b8c7-4c37-9494-cac96a9a8249-kube-api-access-745mj\") pod \"community-operators-nr24h\" (UID: \"0d42fee4-b8c7-4c37-9494-cac96a9a8249\") " pod="openshift-marketplace/community-operators-nr24h" Mar 18 13:01:01 crc kubenswrapper[4975]: I0318 13:01:01.476176 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nr24h" Mar 18 13:01:01 crc kubenswrapper[4975]: I0318 13:01:01.794059 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nr24h"] Mar 18 13:01:01 crc kubenswrapper[4975]: W0318 13:01:01.799728 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d42fee4_b8c7_4c37_9494_cac96a9a8249.slice/crio-666fc6b65acaf935fa591e701f3bc9eaa6ed13f9a3e138979d05e3156df1d2f1 WatchSource:0}: Error finding container 666fc6b65acaf935fa591e701f3bc9eaa6ed13f9a3e138979d05e3156df1d2f1: Status 404 returned error can't find the container with id 666fc6b65acaf935fa591e701f3bc9eaa6ed13f9a3e138979d05e3156df1d2f1 Mar 18 13:01:01 crc kubenswrapper[4975]: I0318 13:01:01.829165 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nr24h" event={"ID":"0d42fee4-b8c7-4c37-9494-cac96a9a8249","Type":"ContainerStarted","Data":"666fc6b65acaf935fa591e701f3bc9eaa6ed13f9a3e138979d05e3156df1d2f1"} Mar 18 13:01:01 crc kubenswrapper[4975]: I0318 13:01:01.830534 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563981-gj4mp" event={"ID":"5b92e883-e662-4970-ab9b-df31247d4cb7","Type":"ContainerStarted","Data":"915c030b8a58fc7be8053d2ff10599606f6172a30d634fad0a685c21e5978f1f"} Mar 18 13:01:01 crc kubenswrapper[4975]: I0318 13:01:01.830560 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563981-gj4mp" event={"ID":"5b92e883-e662-4970-ab9b-df31247d4cb7","Type":"ContainerStarted","Data":"f9778cc1647ea17762402808d64934351d24711a1af17c891d64e5197c16f084"} Mar 18 13:01:01 crc kubenswrapper[4975]: I0318 13:01:01.858602 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29563981-gj4mp" podStartSLOduration=1.85858621 podStartE2EDuration="1.85858621s" 
podCreationTimestamp="2026-03-18 13:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:01:01.853284755 +0000 UTC m=+3047.567685344" watchObservedRunningTime="2026-03-18 13:01:01.85858621 +0000 UTC m=+3047.572986789" Mar 18 13:01:02 crc kubenswrapper[4975]: I0318 13:01:02.848066 4975 generic.go:334] "Generic (PLEG): container finished" podID="0d42fee4-b8c7-4c37-9494-cac96a9a8249" containerID="0758ab60eaf68c6b88ca0f19653cfead2b416675525e9cf6be0a9f20fa3eba4b" exitCode=0 Mar 18 13:01:02 crc kubenswrapper[4975]: I0318 13:01:02.848182 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nr24h" event={"ID":"0d42fee4-b8c7-4c37-9494-cac96a9a8249","Type":"ContainerDied","Data":"0758ab60eaf68c6b88ca0f19653cfead2b416675525e9cf6be0a9f20fa3eba4b"} Mar 18 13:01:03 crc kubenswrapper[4975]: I0318 13:01:03.706493 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-79mx7"] Mar 18 13:01:03 crc kubenswrapper[4975]: I0318 13:01:03.714647 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79mx7" Mar 18 13:01:03 crc kubenswrapper[4975]: I0318 13:01:03.718151 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-79mx7"] Mar 18 13:01:03 crc kubenswrapper[4975]: I0318 13:01:03.859379 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nr24h" event={"ID":"0d42fee4-b8c7-4c37-9494-cac96a9a8249","Type":"ContainerStarted","Data":"35a26e242ec8c7c94e21fd915881bc6565a96a31b6979e1372201c4b968a282a"} Mar 18 13:01:03 crc kubenswrapper[4975]: I0318 13:01:03.861204 4975 generic.go:334] "Generic (PLEG): container finished" podID="5b92e883-e662-4970-ab9b-df31247d4cb7" containerID="915c030b8a58fc7be8053d2ff10599606f6172a30d634fad0a685c21e5978f1f" exitCode=0 Mar 18 13:01:03 crc kubenswrapper[4975]: I0318 13:01:03.861297 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563981-gj4mp" event={"ID":"5b92e883-e662-4970-ab9b-df31247d4cb7","Type":"ContainerDied","Data":"915c030b8a58fc7be8053d2ff10599606f6172a30d634fad0a685c21e5978f1f"} Mar 18 13:01:03 crc kubenswrapper[4975]: I0318 13:01:03.863640 4975 generic.go:334] "Generic (PLEG): container finished" podID="ec93f0f6-3753-4db2-a239-859abb202fdc" containerID="621dba3c4c975b5f7be39f6ec7b83d7f9b7809aa5b413fc05f0e423354318ddb" exitCode=2 Mar 18 13:01:03 crc kubenswrapper[4975]: I0318 13:01:03.863672 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29" event={"ID":"ec93f0f6-3753-4db2-a239-859abb202fdc","Type":"ContainerDied","Data":"621dba3c4c975b5f7be39f6ec7b83d7f9b7809aa5b413fc05f0e423354318ddb"} Mar 18 13:01:03 crc kubenswrapper[4975]: I0318 13:01:03.864596 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pscb\" (UniqueName: 
\"kubernetes.io/projected/d30a7455-009a-465c-b5e1-e053639b0cb1-kube-api-access-7pscb\") pod \"redhat-marketplace-79mx7\" (UID: \"d30a7455-009a-465c-b5e1-e053639b0cb1\") " pod="openshift-marketplace/redhat-marketplace-79mx7" Mar 18 13:01:03 crc kubenswrapper[4975]: I0318 13:01:03.864997 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d30a7455-009a-465c-b5e1-e053639b0cb1-utilities\") pod \"redhat-marketplace-79mx7\" (UID: \"d30a7455-009a-465c-b5e1-e053639b0cb1\") " pod="openshift-marketplace/redhat-marketplace-79mx7" Mar 18 13:01:03 crc kubenswrapper[4975]: I0318 13:01:03.865346 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d30a7455-009a-465c-b5e1-e053639b0cb1-catalog-content\") pod \"redhat-marketplace-79mx7\" (UID: \"d30a7455-009a-465c-b5e1-e053639b0cb1\") " pod="openshift-marketplace/redhat-marketplace-79mx7" Mar 18 13:01:03 crc kubenswrapper[4975]: I0318 13:01:03.966514 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pscb\" (UniqueName: \"kubernetes.io/projected/d30a7455-009a-465c-b5e1-e053639b0cb1-kube-api-access-7pscb\") pod \"redhat-marketplace-79mx7\" (UID: \"d30a7455-009a-465c-b5e1-e053639b0cb1\") " pod="openshift-marketplace/redhat-marketplace-79mx7" Mar 18 13:01:03 crc kubenswrapper[4975]: I0318 13:01:03.966618 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d30a7455-009a-465c-b5e1-e053639b0cb1-utilities\") pod \"redhat-marketplace-79mx7\" (UID: \"d30a7455-009a-465c-b5e1-e053639b0cb1\") " pod="openshift-marketplace/redhat-marketplace-79mx7" Mar 18 13:01:03 crc kubenswrapper[4975]: I0318 13:01:03.966703 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/d30a7455-009a-465c-b5e1-e053639b0cb1-catalog-content\") pod \"redhat-marketplace-79mx7\" (UID: \"d30a7455-009a-465c-b5e1-e053639b0cb1\") " pod="openshift-marketplace/redhat-marketplace-79mx7" Mar 18 13:01:03 crc kubenswrapper[4975]: I0318 13:01:03.967204 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d30a7455-009a-465c-b5e1-e053639b0cb1-utilities\") pod \"redhat-marketplace-79mx7\" (UID: \"d30a7455-009a-465c-b5e1-e053639b0cb1\") " pod="openshift-marketplace/redhat-marketplace-79mx7" Mar 18 13:01:03 crc kubenswrapper[4975]: I0318 13:01:03.967239 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d30a7455-009a-465c-b5e1-e053639b0cb1-catalog-content\") pod \"redhat-marketplace-79mx7\" (UID: \"d30a7455-009a-465c-b5e1-e053639b0cb1\") " pod="openshift-marketplace/redhat-marketplace-79mx7" Mar 18 13:01:03 crc kubenswrapper[4975]: I0318 13:01:03.986162 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pscb\" (UniqueName: \"kubernetes.io/projected/d30a7455-009a-465c-b5e1-e053639b0cb1-kube-api-access-7pscb\") pod \"redhat-marketplace-79mx7\" (UID: \"d30a7455-009a-465c-b5e1-e053639b0cb1\") " pod="openshift-marketplace/redhat-marketplace-79mx7" Mar 18 13:01:04 crc kubenswrapper[4975]: I0318 13:01:04.062554 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79mx7" Mar 18 13:01:04 crc kubenswrapper[4975]: I0318 13:01:04.524140 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-79mx7"] Mar 18 13:01:04 crc kubenswrapper[4975]: W0318 13:01:04.526242 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd30a7455_009a_465c_b5e1_e053639b0cb1.slice/crio-71b59132a8bbb99c6b2fcb8525d8b3baf693106c5b123d66a4e99c4f2cf98b4b WatchSource:0}: Error finding container 71b59132a8bbb99c6b2fcb8525d8b3baf693106c5b123d66a4e99c4f2cf98b4b: Status 404 returned error can't find the container with id 71b59132a8bbb99c6b2fcb8525d8b3baf693106c5b123d66a4e99c4f2cf98b4b Mar 18 13:01:04 crc kubenswrapper[4975]: I0318 13:01:04.873748 4975 generic.go:334] "Generic (PLEG): container finished" podID="d30a7455-009a-465c-b5e1-e053639b0cb1" containerID="db9cc847e72067fac5206af8f156635b3024cf109ec3756a0247a0a7fecbc295" exitCode=0 Mar 18 13:01:04 crc kubenswrapper[4975]: I0318 13:01:04.873790 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79mx7" event={"ID":"d30a7455-009a-465c-b5e1-e053639b0cb1","Type":"ContainerDied","Data":"db9cc847e72067fac5206af8f156635b3024cf109ec3756a0247a0a7fecbc295"} Mar 18 13:01:04 crc kubenswrapper[4975]: I0318 13:01:04.874109 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79mx7" event={"ID":"d30a7455-009a-465c-b5e1-e053639b0cb1","Type":"ContainerStarted","Data":"71b59132a8bbb99c6b2fcb8525d8b3baf693106c5b123d66a4e99c4f2cf98b4b"} Mar 18 13:01:04 crc kubenswrapper[4975]: I0318 13:01:04.876966 4975 generic.go:334] "Generic (PLEG): container finished" podID="0d42fee4-b8c7-4c37-9494-cac96a9a8249" containerID="35a26e242ec8c7c94e21fd915881bc6565a96a31b6979e1372201c4b968a282a" exitCode=0 Mar 18 13:01:04 crc kubenswrapper[4975]: I0318 
13:01:04.877015 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nr24h" event={"ID":"0d42fee4-b8c7-4c37-9494-cac96a9a8249","Type":"ContainerDied","Data":"35a26e242ec8c7c94e21fd915881bc6565a96a31b6979e1372201c4b968a282a"} Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.363797 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29563981-gj4mp" Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.370156 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29" Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.498274 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b92e883-e662-4970-ab9b-df31247d4cb7-fernet-keys\") pod \"5b92e883-e662-4970-ab9b-df31247d4cb7\" (UID: \"5b92e883-e662-4970-ab9b-df31247d4cb7\") " Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.498551 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgxv7\" (UniqueName: \"kubernetes.io/projected/5b92e883-e662-4970-ab9b-df31247d4cb7-kube-api-access-rgxv7\") pod \"5b92e883-e662-4970-ab9b-df31247d4cb7\" (UID: \"5b92e883-e662-4970-ab9b-df31247d4cb7\") " Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.498650 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec93f0f6-3753-4db2-a239-859abb202fdc-inventory\") pod \"ec93f0f6-3753-4db2-a239-859abb202fdc\" (UID: \"ec93f0f6-3753-4db2-a239-859abb202fdc\") " Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.498677 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ec93f0f6-3753-4db2-a239-859abb202fdc-libvirt-secret-0\") pod 
\"ec93f0f6-3753-4db2-a239-859abb202fdc\" (UID: \"ec93f0f6-3753-4db2-a239-859abb202fdc\") " Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.498756 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec93f0f6-3753-4db2-a239-859abb202fdc-libvirt-combined-ca-bundle\") pod \"ec93f0f6-3753-4db2-a239-859abb202fdc\" (UID: \"ec93f0f6-3753-4db2-a239-859abb202fdc\") " Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.498792 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b92e883-e662-4970-ab9b-df31247d4cb7-config-data\") pod \"5b92e883-e662-4970-ab9b-df31247d4cb7\" (UID: \"5b92e883-e662-4970-ab9b-df31247d4cb7\") " Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.498842 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec93f0f6-3753-4db2-a239-859abb202fdc-ssh-key-openstack-edpm-ipam\") pod \"ec93f0f6-3753-4db2-a239-859abb202fdc\" (UID: \"ec93f0f6-3753-4db2-a239-859abb202fdc\") " Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.498879 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b92e883-e662-4970-ab9b-df31247d4cb7-combined-ca-bundle\") pod \"5b92e883-e662-4970-ab9b-df31247d4cb7\" (UID: \"5b92e883-e662-4970-ab9b-df31247d4cb7\") " Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.498914 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgmhn\" (UniqueName: \"kubernetes.io/projected/ec93f0f6-3753-4db2-a239-859abb202fdc-kube-api-access-wgmhn\") pod \"ec93f0f6-3753-4db2-a239-859abb202fdc\" (UID: \"ec93f0f6-3753-4db2-a239-859abb202fdc\") " Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.505217 4975 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec93f0f6-3753-4db2-a239-859abb202fdc-kube-api-access-wgmhn" (OuterVolumeSpecName: "kube-api-access-wgmhn") pod "ec93f0f6-3753-4db2-a239-859abb202fdc" (UID: "ec93f0f6-3753-4db2-a239-859abb202fdc"). InnerVolumeSpecName "kube-api-access-wgmhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.505833 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec93f0f6-3753-4db2-a239-859abb202fdc-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "ec93f0f6-3753-4db2-a239-859abb202fdc" (UID: "ec93f0f6-3753-4db2-a239-859abb202fdc"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.511131 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b92e883-e662-4970-ab9b-df31247d4cb7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5b92e883-e662-4970-ab9b-df31247d4cb7" (UID: "5b92e883-e662-4970-ab9b-df31247d4cb7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.519163 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b92e883-e662-4970-ab9b-df31247d4cb7-kube-api-access-rgxv7" (OuterVolumeSpecName: "kube-api-access-rgxv7") pod "5b92e883-e662-4970-ab9b-df31247d4cb7" (UID: "5b92e883-e662-4970-ab9b-df31247d4cb7"). InnerVolumeSpecName "kube-api-access-rgxv7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.530224 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec93f0f6-3753-4db2-a239-859abb202fdc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ec93f0f6-3753-4db2-a239-859abb202fdc" (UID: "ec93f0f6-3753-4db2-a239-859abb202fdc"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.533593 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec93f0f6-3753-4db2-a239-859abb202fdc-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "ec93f0f6-3753-4db2-a239-859abb202fdc" (UID: "ec93f0f6-3753-4db2-a239-859abb202fdc"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.549713 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec93f0f6-3753-4db2-a239-859abb202fdc-inventory" (OuterVolumeSpecName: "inventory") pod "ec93f0f6-3753-4db2-a239-859abb202fdc" (UID: "ec93f0f6-3753-4db2-a239-859abb202fdc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.559218 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b92e883-e662-4970-ab9b-df31247d4cb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b92e883-e662-4970-ab9b-df31247d4cb7" (UID: "5b92e883-e662-4970-ab9b-df31247d4cb7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.568732 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b92e883-e662-4970-ab9b-df31247d4cb7-config-data" (OuterVolumeSpecName: "config-data") pod "5b92e883-e662-4970-ab9b-df31247d4cb7" (UID: "5b92e883-e662-4970-ab9b-df31247d4cb7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.702729 4975 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec93f0f6-3753-4db2-a239-859abb202fdc-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.704363 4975 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ec93f0f6-3753-4db2-a239-859abb202fdc-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.704497 4975 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec93f0f6-3753-4db2-a239-859abb202fdc-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.704663 4975 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b92e883-e662-4970-ab9b-df31247d4cb7-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.704740 4975 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec93f0f6-3753-4db2-a239-859abb202fdc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.704813 4975 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/5b92e883-e662-4970-ab9b-df31247d4cb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.704923 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgmhn\" (UniqueName: \"kubernetes.io/projected/ec93f0f6-3753-4db2-a239-859abb202fdc-kube-api-access-wgmhn\") on node \"crc\" DevicePath \"\"" Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.705284 4975 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b92e883-e662-4970-ab9b-df31247d4cb7-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.705448 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgxv7\" (UniqueName: \"kubernetes.io/projected/5b92e883-e662-4970-ab9b-df31247d4cb7-kube-api-access-rgxv7\") on node \"crc\" DevicePath \"\"" Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.893442 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29" event={"ID":"ec93f0f6-3753-4db2-a239-859abb202fdc","Type":"ContainerDied","Data":"6a3cabf9a00c53c688d59ad0b504687949d9cde8b6eef950ac4463297788ab3e"} Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.893807 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a3cabf9a00c53c688d59ad0b504687949d9cde8b6eef950ac4463297788ab3e" Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.893548 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vlk29" Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.895731 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563981-gj4mp" event={"ID":"5b92e883-e662-4970-ab9b-df31247d4cb7","Type":"ContainerDied","Data":"f9778cc1647ea17762402808d64934351d24711a1af17c891d64e5197c16f084"} Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.895771 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9778cc1647ea17762402808d64934351d24711a1af17c891d64e5197c16f084" Mar 18 13:01:05 crc kubenswrapper[4975]: I0318 13:01:05.895820 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29563981-gj4mp" Mar 18 13:01:06 crc kubenswrapper[4975]: I0318 13:01:06.906224 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nr24h" event={"ID":"0d42fee4-b8c7-4c37-9494-cac96a9a8249","Type":"ContainerStarted","Data":"dd671b1ac9de9db1e6ce41c80bbcff5dad2e08ebda8148e084b5574557e1572c"} Mar 18 13:01:06 crc kubenswrapper[4975]: I0318 13:01:06.908719 4975 generic.go:334] "Generic (PLEG): container finished" podID="d30a7455-009a-465c-b5e1-e053639b0cb1" containerID="197601e98af4cb84ec250fa02ce4a91176a8a6de273fccf7d8cfd3de3ae1c41b" exitCode=0 Mar 18 13:01:06 crc kubenswrapper[4975]: I0318 13:01:06.908764 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79mx7" event={"ID":"d30a7455-009a-465c-b5e1-e053639b0cb1","Type":"ContainerDied","Data":"197601e98af4cb84ec250fa02ce4a91176a8a6de273fccf7d8cfd3de3ae1c41b"} Mar 18 13:01:06 crc kubenswrapper[4975]: I0318 13:01:06.950562 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nr24h" podStartSLOduration=3.281360699 podStartE2EDuration="5.950532548s" 
podCreationTimestamp="2026-03-18 13:01:01 +0000 UTC" firstStartedPulling="2026-03-18 13:01:02.876238941 +0000 UTC m=+3048.590639540" lastFinishedPulling="2026-03-18 13:01:05.54541081 +0000 UTC m=+3051.259811389" observedRunningTime="2026-03-18 13:01:06.940373232 +0000 UTC m=+3052.654773821" watchObservedRunningTime="2026-03-18 13:01:06.950532548 +0000 UTC m=+3052.664933137" Mar 18 13:01:08 crc kubenswrapper[4975]: I0318 13:01:08.016736 4975 scope.go:117] "RemoveContainer" containerID="2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46" Mar 18 13:01:08 crc kubenswrapper[4975]: E0318 13:01:08.017287 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:01:08 crc kubenswrapper[4975]: I0318 13:01:08.928068 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79mx7" event={"ID":"d30a7455-009a-465c-b5e1-e053639b0cb1","Type":"ContainerStarted","Data":"aac3fe9a1f0beaa747fb71ab94ef9f8fd1116b977878e8947e7d4966617a7359"} Mar 18 13:01:08 crc kubenswrapper[4975]: I0318 13:01:08.954686 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-79mx7" podStartSLOduration=3.111810657 podStartE2EDuration="5.954665003s" podCreationTimestamp="2026-03-18 13:01:03 +0000 UTC" firstStartedPulling="2026-03-18 13:01:04.875258627 +0000 UTC m=+3050.589659216" lastFinishedPulling="2026-03-18 13:01:07.718112983 +0000 UTC m=+3053.432513562" observedRunningTime="2026-03-18 13:01:08.946242594 +0000 UTC m=+3054.660643183" watchObservedRunningTime="2026-03-18 13:01:08.954665003 +0000 UTC 
m=+3054.669065582" Mar 18 13:01:11 crc kubenswrapper[4975]: I0318 13:01:11.476283 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nr24h" Mar 18 13:01:11 crc kubenswrapper[4975]: I0318 13:01:11.476589 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nr24h" Mar 18 13:01:11 crc kubenswrapper[4975]: I0318 13:01:11.569029 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nr24h" Mar 18 13:01:12 crc kubenswrapper[4975]: I0318 13:01:12.027996 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nr24h" Mar 18 13:01:12 crc kubenswrapper[4975]: I0318 13:01:12.710785 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nr24h"] Mar 18 13:01:13 crc kubenswrapper[4975]: I0318 13:01:13.995388 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nr24h" podUID="0d42fee4-b8c7-4c37-9494-cac96a9a8249" containerName="registry-server" containerID="cri-o://dd671b1ac9de9db1e6ce41c80bbcff5dad2e08ebda8148e084b5574557e1572c" gracePeriod=2 Mar 18 13:01:14 crc kubenswrapper[4975]: I0318 13:01:14.062908 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-79mx7" Mar 18 13:01:14 crc kubenswrapper[4975]: I0318 13:01:14.063333 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-79mx7" Mar 18 13:01:14 crc kubenswrapper[4975]: I0318 13:01:14.124354 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-79mx7" Mar 18 13:01:14 crc kubenswrapper[4975]: I0318 13:01:14.977292 4975 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nr24h" Mar 18 13:01:15 crc kubenswrapper[4975]: I0318 13:01:15.008548 4975 generic.go:334] "Generic (PLEG): container finished" podID="0d42fee4-b8c7-4c37-9494-cac96a9a8249" containerID="dd671b1ac9de9db1e6ce41c80bbcff5dad2e08ebda8148e084b5574557e1572c" exitCode=0 Mar 18 13:01:15 crc kubenswrapper[4975]: I0318 13:01:15.008989 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nr24h" Mar 18 13:01:15 crc kubenswrapper[4975]: I0318 13:01:15.009009 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nr24h" event={"ID":"0d42fee4-b8c7-4c37-9494-cac96a9a8249","Type":"ContainerDied","Data":"dd671b1ac9de9db1e6ce41c80bbcff5dad2e08ebda8148e084b5574557e1572c"} Mar 18 13:01:15 crc kubenswrapper[4975]: I0318 13:01:15.009079 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nr24h" event={"ID":"0d42fee4-b8c7-4c37-9494-cac96a9a8249","Type":"ContainerDied","Data":"666fc6b65acaf935fa591e701f3bc9eaa6ed13f9a3e138979d05e3156df1d2f1"} Mar 18 13:01:15 crc kubenswrapper[4975]: I0318 13:01:15.009105 4975 scope.go:117] "RemoveContainer" containerID="dd671b1ac9de9db1e6ce41c80bbcff5dad2e08ebda8148e084b5574557e1572c" Mar 18 13:01:15 crc kubenswrapper[4975]: I0318 13:01:15.041522 4975 scope.go:117] "RemoveContainer" containerID="35a26e242ec8c7c94e21fd915881bc6565a96a31b6979e1372201c4b968a282a" Mar 18 13:01:15 crc kubenswrapper[4975]: I0318 13:01:15.070005 4975 scope.go:117] "RemoveContainer" containerID="0758ab60eaf68c6b88ca0f19653cfead2b416675525e9cf6be0a9f20fa3eba4b" Mar 18 13:01:15 crc kubenswrapper[4975]: I0318 13:01:15.073516 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-79mx7" Mar 18 13:01:15 crc kubenswrapper[4975]: I0318 13:01:15.115207 4975 scope.go:117] 
"RemoveContainer" containerID="dd671b1ac9de9db1e6ce41c80bbcff5dad2e08ebda8148e084b5574557e1572c" Mar 18 13:01:15 crc kubenswrapper[4975]: E0318 13:01:15.116989 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd671b1ac9de9db1e6ce41c80bbcff5dad2e08ebda8148e084b5574557e1572c\": container with ID starting with dd671b1ac9de9db1e6ce41c80bbcff5dad2e08ebda8148e084b5574557e1572c not found: ID does not exist" containerID="dd671b1ac9de9db1e6ce41c80bbcff5dad2e08ebda8148e084b5574557e1572c" Mar 18 13:01:15 crc kubenswrapper[4975]: I0318 13:01:15.117033 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd671b1ac9de9db1e6ce41c80bbcff5dad2e08ebda8148e084b5574557e1572c"} err="failed to get container status \"dd671b1ac9de9db1e6ce41c80bbcff5dad2e08ebda8148e084b5574557e1572c\": rpc error: code = NotFound desc = could not find container \"dd671b1ac9de9db1e6ce41c80bbcff5dad2e08ebda8148e084b5574557e1572c\": container with ID starting with dd671b1ac9de9db1e6ce41c80bbcff5dad2e08ebda8148e084b5574557e1572c not found: ID does not exist" Mar 18 13:01:15 crc kubenswrapper[4975]: I0318 13:01:15.117056 4975 scope.go:117] "RemoveContainer" containerID="35a26e242ec8c7c94e21fd915881bc6565a96a31b6979e1372201c4b968a282a" Mar 18 13:01:15 crc kubenswrapper[4975]: E0318 13:01:15.117504 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35a26e242ec8c7c94e21fd915881bc6565a96a31b6979e1372201c4b968a282a\": container with ID starting with 35a26e242ec8c7c94e21fd915881bc6565a96a31b6979e1372201c4b968a282a not found: ID does not exist" containerID="35a26e242ec8c7c94e21fd915881bc6565a96a31b6979e1372201c4b968a282a" Mar 18 13:01:15 crc kubenswrapper[4975]: I0318 13:01:15.117544 4975 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"35a26e242ec8c7c94e21fd915881bc6565a96a31b6979e1372201c4b968a282a"} err="failed to get container status \"35a26e242ec8c7c94e21fd915881bc6565a96a31b6979e1372201c4b968a282a\": rpc error: code = NotFound desc = could not find container \"35a26e242ec8c7c94e21fd915881bc6565a96a31b6979e1372201c4b968a282a\": container with ID starting with 35a26e242ec8c7c94e21fd915881bc6565a96a31b6979e1372201c4b968a282a not found: ID does not exist" Mar 18 13:01:15 crc kubenswrapper[4975]: I0318 13:01:15.117573 4975 scope.go:117] "RemoveContainer" containerID="0758ab60eaf68c6b88ca0f19653cfead2b416675525e9cf6be0a9f20fa3eba4b" Mar 18 13:01:15 crc kubenswrapper[4975]: E0318 13:01:15.118076 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0758ab60eaf68c6b88ca0f19653cfead2b416675525e9cf6be0a9f20fa3eba4b\": container with ID starting with 0758ab60eaf68c6b88ca0f19653cfead2b416675525e9cf6be0a9f20fa3eba4b not found: ID does not exist" containerID="0758ab60eaf68c6b88ca0f19653cfead2b416675525e9cf6be0a9f20fa3eba4b" Mar 18 13:01:15 crc kubenswrapper[4975]: I0318 13:01:15.118111 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0758ab60eaf68c6b88ca0f19653cfead2b416675525e9cf6be0a9f20fa3eba4b"} err="failed to get container status \"0758ab60eaf68c6b88ca0f19653cfead2b416675525e9cf6be0a9f20fa3eba4b\": rpc error: code = NotFound desc = could not find container \"0758ab60eaf68c6b88ca0f19653cfead2b416675525e9cf6be0a9f20fa3eba4b\": container with ID starting with 0758ab60eaf68c6b88ca0f19653cfead2b416675525e9cf6be0a9f20fa3eba4b not found: ID does not exist" Mar 18 13:01:15 crc kubenswrapper[4975]: I0318 13:01:15.152652 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-745mj\" (UniqueName: \"kubernetes.io/projected/0d42fee4-b8c7-4c37-9494-cac96a9a8249-kube-api-access-745mj\") pod 
\"0d42fee4-b8c7-4c37-9494-cac96a9a8249\" (UID: \"0d42fee4-b8c7-4c37-9494-cac96a9a8249\") " Mar 18 13:01:15 crc kubenswrapper[4975]: I0318 13:01:15.152746 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d42fee4-b8c7-4c37-9494-cac96a9a8249-utilities\") pod \"0d42fee4-b8c7-4c37-9494-cac96a9a8249\" (UID: \"0d42fee4-b8c7-4c37-9494-cac96a9a8249\") " Mar 18 13:01:15 crc kubenswrapper[4975]: I0318 13:01:15.152894 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d42fee4-b8c7-4c37-9494-cac96a9a8249-catalog-content\") pod \"0d42fee4-b8c7-4c37-9494-cac96a9a8249\" (UID: \"0d42fee4-b8c7-4c37-9494-cac96a9a8249\") " Mar 18 13:01:15 crc kubenswrapper[4975]: I0318 13:01:15.154331 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d42fee4-b8c7-4c37-9494-cac96a9a8249-utilities" (OuterVolumeSpecName: "utilities") pod "0d42fee4-b8c7-4c37-9494-cac96a9a8249" (UID: "0d42fee4-b8c7-4c37-9494-cac96a9a8249"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:01:15 crc kubenswrapper[4975]: I0318 13:01:15.159104 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d42fee4-b8c7-4c37-9494-cac96a9a8249-kube-api-access-745mj" (OuterVolumeSpecName: "kube-api-access-745mj") pod "0d42fee4-b8c7-4c37-9494-cac96a9a8249" (UID: "0d42fee4-b8c7-4c37-9494-cac96a9a8249"). InnerVolumeSpecName "kube-api-access-745mj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:01:15 crc kubenswrapper[4975]: I0318 13:01:15.202959 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d42fee4-b8c7-4c37-9494-cac96a9a8249-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d42fee4-b8c7-4c37-9494-cac96a9a8249" (UID: "0d42fee4-b8c7-4c37-9494-cac96a9a8249"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:01:15 crc kubenswrapper[4975]: I0318 13:01:15.255291 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d42fee4-b8c7-4c37-9494-cac96a9a8249-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:01:15 crc kubenswrapper[4975]: I0318 13:01:15.255322 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d42fee4-b8c7-4c37-9494-cac96a9a8249-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:01:15 crc kubenswrapper[4975]: I0318 13:01:15.255332 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-745mj\" (UniqueName: \"kubernetes.io/projected/0d42fee4-b8c7-4c37-9494-cac96a9a8249-kube-api-access-745mj\") on node \"crc\" DevicePath \"\"" Mar 18 13:01:15 crc kubenswrapper[4975]: I0318 13:01:15.348735 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nr24h"] Mar 18 13:01:15 crc kubenswrapper[4975]: I0318 13:01:15.359831 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nr24h"] Mar 18 13:01:17 crc kubenswrapper[4975]: I0318 13:01:17.050026 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d42fee4-b8c7-4c37-9494-cac96a9a8249" path="/var/lib/kubelet/pods/0d42fee4-b8c7-4c37-9494-cac96a9a8249/volumes" Mar 18 13:01:17 crc kubenswrapper[4975]: I0318 13:01:17.496147 4975 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-79mx7"] Mar 18 13:01:17 crc kubenswrapper[4975]: I0318 13:01:17.496423 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-79mx7" podUID="d30a7455-009a-465c-b5e1-e053639b0cb1" containerName="registry-server" containerID="cri-o://aac3fe9a1f0beaa747fb71ab94ef9f8fd1116b977878e8947e7d4966617a7359" gracePeriod=2 Mar 18 13:01:17 crc kubenswrapper[4975]: I0318 13:01:17.985887 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79mx7" Mar 18 13:01:18 crc kubenswrapper[4975]: I0318 13:01:18.054630 4975 generic.go:334] "Generic (PLEG): container finished" podID="d30a7455-009a-465c-b5e1-e053639b0cb1" containerID="aac3fe9a1f0beaa747fb71ab94ef9f8fd1116b977878e8947e7d4966617a7359" exitCode=0 Mar 18 13:01:18 crc kubenswrapper[4975]: I0318 13:01:18.054672 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79mx7" event={"ID":"d30a7455-009a-465c-b5e1-e053639b0cb1","Type":"ContainerDied","Data":"aac3fe9a1f0beaa747fb71ab94ef9f8fd1116b977878e8947e7d4966617a7359"} Mar 18 13:01:18 crc kubenswrapper[4975]: I0318 13:01:18.054699 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79mx7" event={"ID":"d30a7455-009a-465c-b5e1-e053639b0cb1","Type":"ContainerDied","Data":"71b59132a8bbb99c6b2fcb8525d8b3baf693106c5b123d66a4e99c4f2cf98b4b"} Mar 18 13:01:18 crc kubenswrapper[4975]: I0318 13:01:18.054715 4975 scope.go:117] "RemoveContainer" containerID="aac3fe9a1f0beaa747fb71ab94ef9f8fd1116b977878e8947e7d4966617a7359" Mar 18 13:01:18 crc kubenswrapper[4975]: I0318 13:01:18.054843 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79mx7" Mar 18 13:01:18 crc kubenswrapper[4975]: I0318 13:01:18.088836 4975 scope.go:117] "RemoveContainer" containerID="197601e98af4cb84ec250fa02ce4a91176a8a6de273fccf7d8cfd3de3ae1c41b" Mar 18 13:01:18 crc kubenswrapper[4975]: I0318 13:01:18.105545 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d30a7455-009a-465c-b5e1-e053639b0cb1-utilities\") pod \"d30a7455-009a-465c-b5e1-e053639b0cb1\" (UID: \"d30a7455-009a-465c-b5e1-e053639b0cb1\") " Mar 18 13:01:18 crc kubenswrapper[4975]: I0318 13:01:18.105642 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d30a7455-009a-465c-b5e1-e053639b0cb1-catalog-content\") pod \"d30a7455-009a-465c-b5e1-e053639b0cb1\" (UID: \"d30a7455-009a-465c-b5e1-e053639b0cb1\") " Mar 18 13:01:18 crc kubenswrapper[4975]: I0318 13:01:18.105740 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pscb\" (UniqueName: \"kubernetes.io/projected/d30a7455-009a-465c-b5e1-e053639b0cb1-kube-api-access-7pscb\") pod \"d30a7455-009a-465c-b5e1-e053639b0cb1\" (UID: \"d30a7455-009a-465c-b5e1-e053639b0cb1\") " Mar 18 13:01:18 crc kubenswrapper[4975]: I0318 13:01:18.108197 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d30a7455-009a-465c-b5e1-e053639b0cb1-utilities" (OuterVolumeSpecName: "utilities") pod "d30a7455-009a-465c-b5e1-e053639b0cb1" (UID: "d30a7455-009a-465c-b5e1-e053639b0cb1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:01:18 crc kubenswrapper[4975]: I0318 13:01:18.112032 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d30a7455-009a-465c-b5e1-e053639b0cb1-kube-api-access-7pscb" (OuterVolumeSpecName: "kube-api-access-7pscb") pod "d30a7455-009a-465c-b5e1-e053639b0cb1" (UID: "d30a7455-009a-465c-b5e1-e053639b0cb1"). InnerVolumeSpecName "kube-api-access-7pscb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:01:18 crc kubenswrapper[4975]: I0318 13:01:18.114737 4975 scope.go:117] "RemoveContainer" containerID="db9cc847e72067fac5206af8f156635b3024cf109ec3756a0247a0a7fecbc295" Mar 18 13:01:18 crc kubenswrapper[4975]: I0318 13:01:18.134304 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d30a7455-009a-465c-b5e1-e053639b0cb1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d30a7455-009a-465c-b5e1-e053639b0cb1" (UID: "d30a7455-009a-465c-b5e1-e053639b0cb1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:01:18 crc kubenswrapper[4975]: I0318 13:01:18.197031 4975 scope.go:117] "RemoveContainer" containerID="aac3fe9a1f0beaa747fb71ab94ef9f8fd1116b977878e8947e7d4966617a7359" Mar 18 13:01:18 crc kubenswrapper[4975]: E0318 13:01:18.197518 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aac3fe9a1f0beaa747fb71ab94ef9f8fd1116b977878e8947e7d4966617a7359\": container with ID starting with aac3fe9a1f0beaa747fb71ab94ef9f8fd1116b977878e8947e7d4966617a7359 not found: ID does not exist" containerID="aac3fe9a1f0beaa747fb71ab94ef9f8fd1116b977878e8947e7d4966617a7359" Mar 18 13:01:18 crc kubenswrapper[4975]: I0318 13:01:18.197549 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aac3fe9a1f0beaa747fb71ab94ef9f8fd1116b977878e8947e7d4966617a7359"} err="failed to get container status \"aac3fe9a1f0beaa747fb71ab94ef9f8fd1116b977878e8947e7d4966617a7359\": rpc error: code = NotFound desc = could not find container \"aac3fe9a1f0beaa747fb71ab94ef9f8fd1116b977878e8947e7d4966617a7359\": container with ID starting with aac3fe9a1f0beaa747fb71ab94ef9f8fd1116b977878e8947e7d4966617a7359 not found: ID does not exist" Mar 18 13:01:18 crc kubenswrapper[4975]: I0318 13:01:18.197570 4975 scope.go:117] "RemoveContainer" containerID="197601e98af4cb84ec250fa02ce4a91176a8a6de273fccf7d8cfd3de3ae1c41b" Mar 18 13:01:18 crc kubenswrapper[4975]: E0318 13:01:18.197895 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"197601e98af4cb84ec250fa02ce4a91176a8a6de273fccf7d8cfd3de3ae1c41b\": container with ID starting with 197601e98af4cb84ec250fa02ce4a91176a8a6de273fccf7d8cfd3de3ae1c41b not found: ID does not exist" containerID="197601e98af4cb84ec250fa02ce4a91176a8a6de273fccf7d8cfd3de3ae1c41b" Mar 18 13:01:18 crc kubenswrapper[4975]: I0318 13:01:18.197943 
4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"197601e98af4cb84ec250fa02ce4a91176a8a6de273fccf7d8cfd3de3ae1c41b"} err="failed to get container status \"197601e98af4cb84ec250fa02ce4a91176a8a6de273fccf7d8cfd3de3ae1c41b\": rpc error: code = NotFound desc = could not find container \"197601e98af4cb84ec250fa02ce4a91176a8a6de273fccf7d8cfd3de3ae1c41b\": container with ID starting with 197601e98af4cb84ec250fa02ce4a91176a8a6de273fccf7d8cfd3de3ae1c41b not found: ID does not exist" Mar 18 13:01:18 crc kubenswrapper[4975]: I0318 13:01:18.197969 4975 scope.go:117] "RemoveContainer" containerID="db9cc847e72067fac5206af8f156635b3024cf109ec3756a0247a0a7fecbc295" Mar 18 13:01:18 crc kubenswrapper[4975]: E0318 13:01:18.198276 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db9cc847e72067fac5206af8f156635b3024cf109ec3756a0247a0a7fecbc295\": container with ID starting with db9cc847e72067fac5206af8f156635b3024cf109ec3756a0247a0a7fecbc295 not found: ID does not exist" containerID="db9cc847e72067fac5206af8f156635b3024cf109ec3756a0247a0a7fecbc295" Mar 18 13:01:18 crc kubenswrapper[4975]: I0318 13:01:18.198304 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db9cc847e72067fac5206af8f156635b3024cf109ec3756a0247a0a7fecbc295"} err="failed to get container status \"db9cc847e72067fac5206af8f156635b3024cf109ec3756a0247a0a7fecbc295\": rpc error: code = NotFound desc = could not find container \"db9cc847e72067fac5206af8f156635b3024cf109ec3756a0247a0a7fecbc295\": container with ID starting with db9cc847e72067fac5206af8f156635b3024cf109ec3756a0247a0a7fecbc295 not found: ID does not exist" Mar 18 13:01:18 crc kubenswrapper[4975]: I0318 13:01:18.208558 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d30a7455-009a-465c-b5e1-e053639b0cb1-utilities\") on node 
\"crc\" DevicePath \"\"" Mar 18 13:01:18 crc kubenswrapper[4975]: I0318 13:01:18.208609 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d30a7455-009a-465c-b5e1-e053639b0cb1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:01:18 crc kubenswrapper[4975]: I0318 13:01:18.208623 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pscb\" (UniqueName: \"kubernetes.io/projected/d30a7455-009a-465c-b5e1-e053639b0cb1-kube-api-access-7pscb\") on node \"crc\" DevicePath \"\"" Mar 18 13:01:18 crc kubenswrapper[4975]: I0318 13:01:18.410689 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-79mx7"] Mar 18 13:01:18 crc kubenswrapper[4975]: I0318 13:01:18.422085 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-79mx7"] Mar 18 13:01:19 crc kubenswrapper[4975]: I0318 13:01:19.032636 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d30a7455-009a-465c-b5e1-e053639b0cb1" path="/var/lib/kubelet/pods/d30a7455-009a-465c-b5e1-e053639b0cb1/volumes" Mar 18 13:01:21 crc kubenswrapper[4975]: I0318 13:01:21.017243 4975 scope.go:117] "RemoveContainer" containerID="2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46" Mar 18 13:01:21 crc kubenswrapper[4975]: E0318 13:01:21.017752 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:01:33 crc kubenswrapper[4975]: I0318 13:01:33.017035 4975 scope.go:117] "RemoveContainer" 
containerID="2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46" Mar 18 13:01:33 crc kubenswrapper[4975]: E0318 13:01:33.017726 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:01:48 crc kubenswrapper[4975]: I0318 13:01:48.018021 4975 scope.go:117] "RemoveContainer" containerID="2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46" Mar 18 13:01:48 crc kubenswrapper[4975]: E0318 13:01:48.019283 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:01:55 crc kubenswrapper[4975]: I0318 13:01:55.318068 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t292k"] Mar 18 13:01:55 crc kubenswrapper[4975]: E0318 13:01:55.318812 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d42fee4-b8c7-4c37-9494-cac96a9a8249" containerName="extract-utilities" Mar 18 13:01:55 crc kubenswrapper[4975]: I0318 13:01:55.318831 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d42fee4-b8c7-4c37-9494-cac96a9a8249" containerName="extract-utilities" Mar 18 13:01:55 crc kubenswrapper[4975]: E0318 13:01:55.318880 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30a7455-009a-465c-b5e1-e053639b0cb1" 
containerName="extract-utilities" Mar 18 13:01:55 crc kubenswrapper[4975]: I0318 13:01:55.318889 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30a7455-009a-465c-b5e1-e053639b0cb1" containerName="extract-utilities" Mar 18 13:01:55 crc kubenswrapper[4975]: E0318 13:01:55.318903 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d42fee4-b8c7-4c37-9494-cac96a9a8249" containerName="registry-server" Mar 18 13:01:55 crc kubenswrapper[4975]: I0318 13:01:55.318910 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d42fee4-b8c7-4c37-9494-cac96a9a8249" containerName="registry-server" Mar 18 13:01:55 crc kubenswrapper[4975]: E0318 13:01:55.318926 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b92e883-e662-4970-ab9b-df31247d4cb7" containerName="keystone-cron" Mar 18 13:01:55 crc kubenswrapper[4975]: I0318 13:01:55.318933 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b92e883-e662-4970-ab9b-df31247d4cb7" containerName="keystone-cron" Mar 18 13:01:55 crc kubenswrapper[4975]: E0318 13:01:55.318959 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30a7455-009a-465c-b5e1-e053639b0cb1" containerName="registry-server" Mar 18 13:01:55 crc kubenswrapper[4975]: I0318 13:01:55.318968 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30a7455-009a-465c-b5e1-e053639b0cb1" containerName="registry-server" Mar 18 13:01:55 crc kubenswrapper[4975]: E0318 13:01:55.318983 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30a7455-009a-465c-b5e1-e053639b0cb1" containerName="extract-content" Mar 18 13:01:55 crc kubenswrapper[4975]: I0318 13:01:55.318991 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30a7455-009a-465c-b5e1-e053639b0cb1" containerName="extract-content" Mar 18 13:01:55 crc kubenswrapper[4975]: E0318 13:01:55.319005 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d42fee4-b8c7-4c37-9494-cac96a9a8249" 
containerName="extract-content" Mar 18 13:01:55 crc kubenswrapper[4975]: I0318 13:01:55.319013 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d42fee4-b8c7-4c37-9494-cac96a9a8249" containerName="extract-content" Mar 18 13:01:55 crc kubenswrapper[4975]: E0318 13:01:55.319023 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec93f0f6-3753-4db2-a239-859abb202fdc" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 13:01:55 crc kubenswrapper[4975]: I0318 13:01:55.319031 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec93f0f6-3753-4db2-a239-859abb202fdc" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 13:01:55 crc kubenswrapper[4975]: I0318 13:01:55.319252 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d42fee4-b8c7-4c37-9494-cac96a9a8249" containerName="registry-server" Mar 18 13:01:55 crc kubenswrapper[4975]: I0318 13:01:55.319276 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec93f0f6-3753-4db2-a239-859abb202fdc" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 13:01:55 crc kubenswrapper[4975]: I0318 13:01:55.319293 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="d30a7455-009a-465c-b5e1-e053639b0cb1" containerName="registry-server" Mar 18 13:01:55 crc kubenswrapper[4975]: I0318 13:01:55.319308 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b92e883-e662-4970-ab9b-df31247d4cb7" containerName="keystone-cron" Mar 18 13:01:55 crc kubenswrapper[4975]: I0318 13:01:55.320969 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t292k" Mar 18 13:01:55 crc kubenswrapper[4975]: I0318 13:01:55.327450 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t292k"] Mar 18 13:01:55 crc kubenswrapper[4975]: I0318 13:01:55.340421 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e591b279-0405-4689-80ce-5a6232f00235-utilities\") pod \"certified-operators-t292k\" (UID: \"e591b279-0405-4689-80ce-5a6232f00235\") " pod="openshift-marketplace/certified-operators-t292k" Mar 18 13:01:55 crc kubenswrapper[4975]: I0318 13:01:55.340481 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzwct\" (UniqueName: \"kubernetes.io/projected/e591b279-0405-4689-80ce-5a6232f00235-kube-api-access-mzwct\") pod \"certified-operators-t292k\" (UID: \"e591b279-0405-4689-80ce-5a6232f00235\") " pod="openshift-marketplace/certified-operators-t292k" Mar 18 13:01:55 crc kubenswrapper[4975]: I0318 13:01:55.340627 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e591b279-0405-4689-80ce-5a6232f00235-catalog-content\") pod \"certified-operators-t292k\" (UID: \"e591b279-0405-4689-80ce-5a6232f00235\") " pod="openshift-marketplace/certified-operators-t292k" Mar 18 13:01:55 crc kubenswrapper[4975]: I0318 13:01:55.441484 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e591b279-0405-4689-80ce-5a6232f00235-utilities\") pod \"certified-operators-t292k\" (UID: \"e591b279-0405-4689-80ce-5a6232f00235\") " pod="openshift-marketplace/certified-operators-t292k" Mar 18 13:01:55 crc kubenswrapper[4975]: I0318 13:01:55.441729 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mzwct\" (UniqueName: \"kubernetes.io/projected/e591b279-0405-4689-80ce-5a6232f00235-kube-api-access-mzwct\") pod \"certified-operators-t292k\" (UID: \"e591b279-0405-4689-80ce-5a6232f00235\") " pod="openshift-marketplace/certified-operators-t292k" Mar 18 13:01:55 crc kubenswrapper[4975]: I0318 13:01:55.441912 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e591b279-0405-4689-80ce-5a6232f00235-catalog-content\") pod \"certified-operators-t292k\" (UID: \"e591b279-0405-4689-80ce-5a6232f00235\") " pod="openshift-marketplace/certified-operators-t292k" Mar 18 13:01:55 crc kubenswrapper[4975]: I0318 13:01:55.442533 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e591b279-0405-4689-80ce-5a6232f00235-catalog-content\") pod \"certified-operators-t292k\" (UID: \"e591b279-0405-4689-80ce-5a6232f00235\") " pod="openshift-marketplace/certified-operators-t292k" Mar 18 13:01:55 crc kubenswrapper[4975]: I0318 13:01:55.442564 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e591b279-0405-4689-80ce-5a6232f00235-utilities\") pod \"certified-operators-t292k\" (UID: \"e591b279-0405-4689-80ce-5a6232f00235\") " pod="openshift-marketplace/certified-operators-t292k" Mar 18 13:01:55 crc kubenswrapper[4975]: I0318 13:01:55.461205 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzwct\" (UniqueName: \"kubernetes.io/projected/e591b279-0405-4689-80ce-5a6232f00235-kube-api-access-mzwct\") pod \"certified-operators-t292k\" (UID: \"e591b279-0405-4689-80ce-5a6232f00235\") " pod="openshift-marketplace/certified-operators-t292k" Mar 18 13:01:55 crc kubenswrapper[4975]: I0318 13:01:55.640615 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t292k" Mar 18 13:01:56 crc kubenswrapper[4975]: I0318 13:01:56.147189 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t292k"] Mar 18 13:01:56 crc kubenswrapper[4975]: I0318 13:01:56.416128 4975 generic.go:334] "Generic (PLEG): container finished" podID="e591b279-0405-4689-80ce-5a6232f00235" containerID="4f5bc383d7a16c8d2209b44cee2f5c53bc559fbf926c59d8b2881c665e925928" exitCode=0 Mar 18 13:01:56 crc kubenswrapper[4975]: I0318 13:01:56.416173 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t292k" event={"ID":"e591b279-0405-4689-80ce-5a6232f00235","Type":"ContainerDied","Data":"4f5bc383d7a16c8d2209b44cee2f5c53bc559fbf926c59d8b2881c665e925928"} Mar 18 13:01:56 crc kubenswrapper[4975]: I0318 13:01:56.416211 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t292k" event={"ID":"e591b279-0405-4689-80ce-5a6232f00235","Type":"ContainerStarted","Data":"e4f3719016d1c7e9ffbe6294bda70176e9987605cfbef1d6d3fed12e6983621c"} Mar 18 13:01:58 crc kubenswrapper[4975]: I0318 13:01:58.434671 4975 generic.go:334] "Generic (PLEG): container finished" podID="e591b279-0405-4689-80ce-5a6232f00235" containerID="3a31738742e44c27f8b1ae0b67c3bd94e33e352387f904338e2cfe28b26b4f9d" exitCode=0 Mar 18 13:01:58 crc kubenswrapper[4975]: I0318 13:01:58.434922 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t292k" event={"ID":"e591b279-0405-4689-80ce-5a6232f00235","Type":"ContainerDied","Data":"3a31738742e44c27f8b1ae0b67c3bd94e33e352387f904338e2cfe28b26b4f9d"} Mar 18 13:01:58 crc kubenswrapper[4975]: I0318 13:01:58.438228 4975 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:01:59 crc kubenswrapper[4975]: I0318 13:01:59.016958 4975 scope.go:117] "RemoveContainer" 
containerID="2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46" Mar 18 13:01:59 crc kubenswrapper[4975]: E0318 13:01:59.017451 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:01:59 crc kubenswrapper[4975]: I0318 13:01:59.447182 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t292k" event={"ID":"e591b279-0405-4689-80ce-5a6232f00235","Type":"ContainerStarted","Data":"b1aab0ee856275768938c423064475980cf22c4193c890ef42307399198c26e4"} Mar 18 13:01:59 crc kubenswrapper[4975]: I0318 13:01:59.474288 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t292k" podStartSLOduration=1.914154674 podStartE2EDuration="4.474268334s" podCreationTimestamp="2026-03-18 13:01:55 +0000 UTC" firstStartedPulling="2026-03-18 13:01:56.418501762 +0000 UTC m=+3102.132902341" lastFinishedPulling="2026-03-18 13:01:58.978615422 +0000 UTC m=+3104.693016001" observedRunningTime="2026-03-18 13:01:59.465703541 +0000 UTC m=+3105.180104120" watchObservedRunningTime="2026-03-18 13:01:59.474268334 +0000 UTC m=+3105.188668913" Mar 18 13:02:00 crc kubenswrapper[4975]: I0318 13:02:00.146773 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563982-k9f8w"] Mar 18 13:02:00 crc kubenswrapper[4975]: I0318 13:02:00.148124 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563982-k9f8w" Mar 18 13:02:00 crc kubenswrapper[4975]: I0318 13:02:00.152646 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:02:00 crc kubenswrapper[4975]: I0318 13:02:00.152834 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 13:02:00 crc kubenswrapper[4975]: I0318 13:02:00.152901 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:02:00 crc kubenswrapper[4975]: I0318 13:02:00.162668 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563982-k9f8w"] Mar 18 13:02:00 crc kubenswrapper[4975]: I0318 13:02:00.243535 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9prs\" (UniqueName: \"kubernetes.io/projected/e3c8c176-21ac-4505-b1f6-31de8ddb3989-kube-api-access-q9prs\") pod \"auto-csr-approver-29563982-k9f8w\" (UID: \"e3c8c176-21ac-4505-b1f6-31de8ddb3989\") " pod="openshift-infra/auto-csr-approver-29563982-k9f8w" Mar 18 13:02:00 crc kubenswrapper[4975]: I0318 13:02:00.344903 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9prs\" (UniqueName: \"kubernetes.io/projected/e3c8c176-21ac-4505-b1f6-31de8ddb3989-kube-api-access-q9prs\") pod \"auto-csr-approver-29563982-k9f8w\" (UID: \"e3c8c176-21ac-4505-b1f6-31de8ddb3989\") " pod="openshift-infra/auto-csr-approver-29563982-k9f8w" Mar 18 13:02:00 crc kubenswrapper[4975]: I0318 13:02:00.370171 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9prs\" (UniqueName: \"kubernetes.io/projected/e3c8c176-21ac-4505-b1f6-31de8ddb3989-kube-api-access-q9prs\") pod \"auto-csr-approver-29563982-k9f8w\" (UID: \"e3c8c176-21ac-4505-b1f6-31de8ddb3989\") " 
pod="openshift-infra/auto-csr-approver-29563982-k9f8w" Mar 18 13:02:00 crc kubenswrapper[4975]: I0318 13:02:00.465621 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563982-k9f8w" Mar 18 13:02:00 crc kubenswrapper[4975]: I0318 13:02:00.954350 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563982-k9f8w"] Mar 18 13:02:00 crc kubenswrapper[4975]: W0318 13:02:00.964498 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3c8c176_21ac_4505_b1f6_31de8ddb3989.slice/crio-461e12c0c1e2afd3822cea97b5337fd015a5e46a8eb21f9d04f3e9e30bd4f1a8 WatchSource:0}: Error finding container 461e12c0c1e2afd3822cea97b5337fd015a5e46a8eb21f9d04f3e9e30bd4f1a8: Status 404 returned error can't find the container with id 461e12c0c1e2afd3822cea97b5337fd015a5e46a8eb21f9d04f3e9e30bd4f1a8 Mar 18 13:02:01 crc kubenswrapper[4975]: I0318 13:02:01.467629 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563982-k9f8w" event={"ID":"e3c8c176-21ac-4505-b1f6-31de8ddb3989","Type":"ContainerStarted","Data":"461e12c0c1e2afd3822cea97b5337fd015a5e46a8eb21f9d04f3e9e30bd4f1a8"} Mar 18 13:02:02 crc kubenswrapper[4975]: I0318 13:02:02.478719 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563982-k9f8w" event={"ID":"e3c8c176-21ac-4505-b1f6-31de8ddb3989","Type":"ContainerStarted","Data":"6b430f6f47c2cf504908eef61dd90288c1277f765b5341ade428fb42dc9b19f3"} Mar 18 13:02:02 crc kubenswrapper[4975]: I0318 13:02:02.502698 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563982-k9f8w" podStartSLOduration=1.290818872 podStartE2EDuration="2.502671591s" podCreationTimestamp="2026-03-18 13:02:00 +0000 UTC" firstStartedPulling="2026-03-18 13:02:00.967509611 +0000 UTC 
m=+3106.681910180" lastFinishedPulling="2026-03-18 13:02:02.17936228 +0000 UTC m=+3107.893762899" observedRunningTime="2026-03-18 13:02:02.500171462 +0000 UTC m=+3108.214572041" watchObservedRunningTime="2026-03-18 13:02:02.502671591 +0000 UTC m=+3108.217072200" Mar 18 13:02:03 crc kubenswrapper[4975]: I0318 13:02:03.499100 4975 generic.go:334] "Generic (PLEG): container finished" podID="e3c8c176-21ac-4505-b1f6-31de8ddb3989" containerID="6b430f6f47c2cf504908eef61dd90288c1277f765b5341ade428fb42dc9b19f3" exitCode=0 Mar 18 13:02:03 crc kubenswrapper[4975]: I0318 13:02:03.499205 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563982-k9f8w" event={"ID":"e3c8c176-21ac-4505-b1f6-31de8ddb3989","Type":"ContainerDied","Data":"6b430f6f47c2cf504908eef61dd90288c1277f765b5341ade428fb42dc9b19f3"} Mar 18 13:02:04 crc kubenswrapper[4975]: I0318 13:02:04.910693 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563982-k9f8w" Mar 18 13:02:04 crc kubenswrapper[4975]: I0318 13:02:04.938236 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9prs\" (UniqueName: \"kubernetes.io/projected/e3c8c176-21ac-4505-b1f6-31de8ddb3989-kube-api-access-q9prs\") pod \"e3c8c176-21ac-4505-b1f6-31de8ddb3989\" (UID: \"e3c8c176-21ac-4505-b1f6-31de8ddb3989\") " Mar 18 13:02:04 crc kubenswrapper[4975]: I0318 13:02:04.944456 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3c8c176-21ac-4505-b1f6-31de8ddb3989-kube-api-access-q9prs" (OuterVolumeSpecName: "kube-api-access-q9prs") pod "e3c8c176-21ac-4505-b1f6-31de8ddb3989" (UID: "e3c8c176-21ac-4505-b1f6-31de8ddb3989"). InnerVolumeSpecName "kube-api-access-q9prs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:02:05 crc kubenswrapper[4975]: I0318 13:02:05.041099 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9prs\" (UniqueName: \"kubernetes.io/projected/e3c8c176-21ac-4505-b1f6-31de8ddb3989-kube-api-access-q9prs\") on node \"crc\" DevicePath \"\"" Mar 18 13:02:05 crc kubenswrapper[4975]: I0318 13:02:05.524843 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563982-k9f8w" event={"ID":"e3c8c176-21ac-4505-b1f6-31de8ddb3989","Type":"ContainerDied","Data":"461e12c0c1e2afd3822cea97b5337fd015a5e46a8eb21f9d04f3e9e30bd4f1a8"} Mar 18 13:02:05 crc kubenswrapper[4975]: I0318 13:02:05.525159 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="461e12c0c1e2afd3822cea97b5337fd015a5e46a8eb21f9d04f3e9e30bd4f1a8" Mar 18 13:02:05 crc kubenswrapper[4975]: I0318 13:02:05.524898 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563982-k9f8w" Mar 18 13:02:05 crc kubenswrapper[4975]: I0318 13:02:05.568112 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563976-2dbcd"] Mar 18 13:02:05 crc kubenswrapper[4975]: I0318 13:02:05.575408 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563976-2dbcd"] Mar 18 13:02:05 crc kubenswrapper[4975]: I0318 13:02:05.641530 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t292k" Mar 18 13:02:05 crc kubenswrapper[4975]: I0318 13:02:05.641563 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t292k" Mar 18 13:02:05 crc kubenswrapper[4975]: I0318 13:02:05.689912 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t292k" Mar 
18 13:02:06 crc kubenswrapper[4975]: I0318 13:02:06.624183 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t292k" Mar 18 13:02:06 crc kubenswrapper[4975]: I0318 13:02:06.679447 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t292k"] Mar 18 13:02:07 crc kubenswrapper[4975]: I0318 13:02:07.026214 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccb66d3e-3be8-4208-8e0b-0fbc50cf26ea" path="/var/lib/kubelet/pods/ccb66d3e-3be8-4208-8e0b-0fbc50cf26ea/volumes" Mar 18 13:02:08 crc kubenswrapper[4975]: I0318 13:02:08.555694 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t292k" podUID="e591b279-0405-4689-80ce-5a6232f00235" containerName="registry-server" containerID="cri-o://b1aab0ee856275768938c423064475980cf22c4193c890ef42307399198c26e4" gracePeriod=2 Mar 18 13:02:09 crc kubenswrapper[4975]: I0318 13:02:09.009746 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t292k" Mar 18 13:02:09 crc kubenswrapper[4975]: I0318 13:02:09.129555 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e591b279-0405-4689-80ce-5a6232f00235-utilities\") pod \"e591b279-0405-4689-80ce-5a6232f00235\" (UID: \"e591b279-0405-4689-80ce-5a6232f00235\") " Mar 18 13:02:09 crc kubenswrapper[4975]: I0318 13:02:09.129620 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e591b279-0405-4689-80ce-5a6232f00235-catalog-content\") pod \"e591b279-0405-4689-80ce-5a6232f00235\" (UID: \"e591b279-0405-4689-80ce-5a6232f00235\") " Mar 18 13:02:09 crc kubenswrapper[4975]: I0318 13:02:09.129740 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzwct\" (UniqueName: \"kubernetes.io/projected/e591b279-0405-4689-80ce-5a6232f00235-kube-api-access-mzwct\") pod \"e591b279-0405-4689-80ce-5a6232f00235\" (UID: \"e591b279-0405-4689-80ce-5a6232f00235\") " Mar 18 13:02:09 crc kubenswrapper[4975]: I0318 13:02:09.130796 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e591b279-0405-4689-80ce-5a6232f00235-utilities" (OuterVolumeSpecName: "utilities") pod "e591b279-0405-4689-80ce-5a6232f00235" (UID: "e591b279-0405-4689-80ce-5a6232f00235"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:02:09 crc kubenswrapper[4975]: I0318 13:02:09.137028 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e591b279-0405-4689-80ce-5a6232f00235-kube-api-access-mzwct" (OuterVolumeSpecName: "kube-api-access-mzwct") pod "e591b279-0405-4689-80ce-5a6232f00235" (UID: "e591b279-0405-4689-80ce-5a6232f00235"). InnerVolumeSpecName "kube-api-access-mzwct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:02:09 crc kubenswrapper[4975]: I0318 13:02:09.180267 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e591b279-0405-4689-80ce-5a6232f00235-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e591b279-0405-4689-80ce-5a6232f00235" (UID: "e591b279-0405-4689-80ce-5a6232f00235"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:02:09 crc kubenswrapper[4975]: I0318 13:02:09.232428 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e591b279-0405-4689-80ce-5a6232f00235-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:02:09 crc kubenswrapper[4975]: I0318 13:02:09.232468 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e591b279-0405-4689-80ce-5a6232f00235-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:02:09 crc kubenswrapper[4975]: I0318 13:02:09.232478 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzwct\" (UniqueName: \"kubernetes.io/projected/e591b279-0405-4689-80ce-5a6232f00235-kube-api-access-mzwct\") on node \"crc\" DevicePath \"\"" Mar 18 13:02:09 crc kubenswrapper[4975]: I0318 13:02:09.567069 4975 generic.go:334] "Generic (PLEG): container finished" podID="e591b279-0405-4689-80ce-5a6232f00235" containerID="b1aab0ee856275768938c423064475980cf22c4193c890ef42307399198c26e4" exitCode=0 Mar 18 13:02:09 crc kubenswrapper[4975]: I0318 13:02:09.567127 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t292k" event={"ID":"e591b279-0405-4689-80ce-5a6232f00235","Type":"ContainerDied","Data":"b1aab0ee856275768938c423064475980cf22c4193c890ef42307399198c26e4"} Mar 18 13:02:09 crc kubenswrapper[4975]: I0318 13:02:09.567170 4975 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t292k" Mar 18 13:02:09 crc kubenswrapper[4975]: I0318 13:02:09.567206 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t292k" event={"ID":"e591b279-0405-4689-80ce-5a6232f00235","Type":"ContainerDied","Data":"e4f3719016d1c7e9ffbe6294bda70176e9987605cfbef1d6d3fed12e6983621c"} Mar 18 13:02:09 crc kubenswrapper[4975]: I0318 13:02:09.567239 4975 scope.go:117] "RemoveContainer" containerID="b1aab0ee856275768938c423064475980cf22c4193c890ef42307399198c26e4" Mar 18 13:02:09 crc kubenswrapper[4975]: I0318 13:02:09.597789 4975 scope.go:117] "RemoveContainer" containerID="3a31738742e44c27f8b1ae0b67c3bd94e33e352387f904338e2cfe28b26b4f9d" Mar 18 13:02:09 crc kubenswrapper[4975]: I0318 13:02:09.630801 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t292k"] Mar 18 13:02:09 crc kubenswrapper[4975]: I0318 13:02:09.632730 4975 scope.go:117] "RemoveContainer" containerID="4f5bc383d7a16c8d2209b44cee2f5c53bc559fbf926c59d8b2881c665e925928" Mar 18 13:02:09 crc kubenswrapper[4975]: I0318 13:02:09.642829 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t292k"] Mar 18 13:02:09 crc kubenswrapper[4975]: I0318 13:02:09.695249 4975 scope.go:117] "RemoveContainer" containerID="b1aab0ee856275768938c423064475980cf22c4193c890ef42307399198c26e4" Mar 18 13:02:09 crc kubenswrapper[4975]: E0318 13:02:09.696563 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1aab0ee856275768938c423064475980cf22c4193c890ef42307399198c26e4\": container with ID starting with b1aab0ee856275768938c423064475980cf22c4193c890ef42307399198c26e4 not found: ID does not exist" containerID="b1aab0ee856275768938c423064475980cf22c4193c890ef42307399198c26e4" Mar 18 13:02:09 crc kubenswrapper[4975]: I0318 13:02:09.696636 
4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1aab0ee856275768938c423064475980cf22c4193c890ef42307399198c26e4"} err="failed to get container status \"b1aab0ee856275768938c423064475980cf22c4193c890ef42307399198c26e4\": rpc error: code = NotFound desc = could not find container \"b1aab0ee856275768938c423064475980cf22c4193c890ef42307399198c26e4\": container with ID starting with b1aab0ee856275768938c423064475980cf22c4193c890ef42307399198c26e4 not found: ID does not exist" Mar 18 13:02:09 crc kubenswrapper[4975]: I0318 13:02:09.696682 4975 scope.go:117] "RemoveContainer" containerID="3a31738742e44c27f8b1ae0b67c3bd94e33e352387f904338e2cfe28b26b4f9d" Mar 18 13:02:09 crc kubenswrapper[4975]: E0318 13:02:09.697230 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a31738742e44c27f8b1ae0b67c3bd94e33e352387f904338e2cfe28b26b4f9d\": container with ID starting with 3a31738742e44c27f8b1ae0b67c3bd94e33e352387f904338e2cfe28b26b4f9d not found: ID does not exist" containerID="3a31738742e44c27f8b1ae0b67c3bd94e33e352387f904338e2cfe28b26b4f9d" Mar 18 13:02:09 crc kubenswrapper[4975]: I0318 13:02:09.697285 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a31738742e44c27f8b1ae0b67c3bd94e33e352387f904338e2cfe28b26b4f9d"} err="failed to get container status \"3a31738742e44c27f8b1ae0b67c3bd94e33e352387f904338e2cfe28b26b4f9d\": rpc error: code = NotFound desc = could not find container \"3a31738742e44c27f8b1ae0b67c3bd94e33e352387f904338e2cfe28b26b4f9d\": container with ID starting with 3a31738742e44c27f8b1ae0b67c3bd94e33e352387f904338e2cfe28b26b4f9d not found: ID does not exist" Mar 18 13:02:09 crc kubenswrapper[4975]: I0318 13:02:09.697319 4975 scope.go:117] "RemoveContainer" containerID="4f5bc383d7a16c8d2209b44cee2f5c53bc559fbf926c59d8b2881c665e925928" Mar 18 13:02:09 crc kubenswrapper[4975]: E0318 
13:02:09.697812 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f5bc383d7a16c8d2209b44cee2f5c53bc559fbf926c59d8b2881c665e925928\": container with ID starting with 4f5bc383d7a16c8d2209b44cee2f5c53bc559fbf926c59d8b2881c665e925928 not found: ID does not exist" containerID="4f5bc383d7a16c8d2209b44cee2f5c53bc559fbf926c59d8b2881c665e925928" Mar 18 13:02:09 crc kubenswrapper[4975]: I0318 13:02:09.697855 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f5bc383d7a16c8d2209b44cee2f5c53bc559fbf926c59d8b2881c665e925928"} err="failed to get container status \"4f5bc383d7a16c8d2209b44cee2f5c53bc559fbf926c59d8b2881c665e925928\": rpc error: code = NotFound desc = could not find container \"4f5bc383d7a16c8d2209b44cee2f5c53bc559fbf926c59d8b2881c665e925928\": container with ID starting with 4f5bc383d7a16c8d2209b44cee2f5c53bc559fbf926c59d8b2881c665e925928 not found: ID does not exist" Mar 18 13:02:11 crc kubenswrapper[4975]: I0318 13:02:11.033371 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e591b279-0405-4689-80ce-5a6232f00235" path="/var/lib/kubelet/pods/e591b279-0405-4689-80ce-5a6232f00235/volumes" Mar 18 13:02:14 crc kubenswrapper[4975]: I0318 13:02:14.017176 4975 scope.go:117] "RemoveContainer" containerID="2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46" Mar 18 13:02:14 crc kubenswrapper[4975]: E0318 13:02:14.017937 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.032806 
4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs"] Mar 18 13:02:22 crc kubenswrapper[4975]: E0318 13:02:22.033756 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e591b279-0405-4689-80ce-5a6232f00235" containerName="registry-server" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.033772 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="e591b279-0405-4689-80ce-5a6232f00235" containerName="registry-server" Mar 18 13:02:22 crc kubenswrapper[4975]: E0318 13:02:22.033788 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e591b279-0405-4689-80ce-5a6232f00235" containerName="extract-content" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.033795 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="e591b279-0405-4689-80ce-5a6232f00235" containerName="extract-content" Mar 18 13:02:22 crc kubenswrapper[4975]: E0318 13:02:22.033814 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c8c176-21ac-4505-b1f6-31de8ddb3989" containerName="oc" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.033822 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c8c176-21ac-4505-b1f6-31de8ddb3989" containerName="oc" Mar 18 13:02:22 crc kubenswrapper[4975]: E0318 13:02:22.033840 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e591b279-0405-4689-80ce-5a6232f00235" containerName="extract-utilities" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.033849 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="e591b279-0405-4689-80ce-5a6232f00235" containerName="extract-utilities" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.034643 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c8c176-21ac-4505-b1f6-31de8ddb3989" containerName="oc" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.034724 4975 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e591b279-0405-4689-80ce-5a6232f00235" containerName="registry-server" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.035722 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.037559 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.037603 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.038443 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.038478 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.038755 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-77rz6" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.047955 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs"] Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.083114 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5353d60-ff65-4a67-a566-00ef9a757cfb-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52whs\" (UID: \"c5353d60-ff65-4a67-a566-00ef9a757cfb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.083195 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/c5353d60-ff65-4a67-a566-00ef9a757cfb-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52whs\" (UID: \"c5353d60-ff65-4a67-a566-00ef9a757cfb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.083314 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c5353d60-ff65-4a67-a566-00ef9a757cfb-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52whs\" (UID: \"c5353d60-ff65-4a67-a566-00ef9a757cfb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.083336 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxcjr\" (UniqueName: \"kubernetes.io/projected/c5353d60-ff65-4a67-a566-00ef9a757cfb-kube-api-access-vxcjr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52whs\" (UID: \"c5353d60-ff65-4a67-a566-00ef9a757cfb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.083361 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5353d60-ff65-4a67-a566-00ef9a757cfb-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52whs\" (UID: \"c5353d60-ff65-4a67-a566-00ef9a757cfb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.185026 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5353d60-ff65-4a67-a566-00ef9a757cfb-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52whs\" (UID: 
\"c5353d60-ff65-4a67-a566-00ef9a757cfb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.185256 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5353d60-ff65-4a67-a566-00ef9a757cfb-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52whs\" (UID: \"c5353d60-ff65-4a67-a566-00ef9a757cfb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.185423 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c5353d60-ff65-4a67-a566-00ef9a757cfb-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52whs\" (UID: \"c5353d60-ff65-4a67-a566-00ef9a757cfb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.185460 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxcjr\" (UniqueName: \"kubernetes.io/projected/c5353d60-ff65-4a67-a566-00ef9a757cfb-kube-api-access-vxcjr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52whs\" (UID: \"c5353d60-ff65-4a67-a566-00ef9a757cfb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.185524 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5353d60-ff65-4a67-a566-00ef9a757cfb-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52whs\" (UID: \"c5353d60-ff65-4a67-a566-00ef9a757cfb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.197789 4975 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5353d60-ff65-4a67-a566-00ef9a757cfb-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52whs\" (UID: \"c5353d60-ff65-4a67-a566-00ef9a757cfb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.197899 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5353d60-ff65-4a67-a566-00ef9a757cfb-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52whs\" (UID: \"c5353d60-ff65-4a67-a566-00ef9a757cfb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.197969 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c5353d60-ff65-4a67-a566-00ef9a757cfb-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52whs\" (UID: \"c5353d60-ff65-4a67-a566-00ef9a757cfb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.198221 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5353d60-ff65-4a67-a566-00ef9a757cfb-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52whs\" (UID: \"c5353d60-ff65-4a67-a566-00ef9a757cfb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.202293 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxcjr\" (UniqueName: \"kubernetes.io/projected/c5353d60-ff65-4a67-a566-00ef9a757cfb-kube-api-access-vxcjr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52whs\" (UID: \"c5353d60-ff65-4a67-a566-00ef9a757cfb\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.369908 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs" Mar 18 13:02:22 crc kubenswrapper[4975]: I0318 13:02:22.902829 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs"] Mar 18 13:02:23 crc kubenswrapper[4975]: I0318 13:02:23.720762 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs" event={"ID":"c5353d60-ff65-4a67-a566-00ef9a757cfb","Type":"ContainerStarted","Data":"fb1abfc582c9ae45ae19201d607e4d9e2a41fba200bbaaa9be3501273a7ddf09"} Mar 18 13:02:23 crc kubenswrapper[4975]: I0318 13:02:23.721137 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs" event={"ID":"c5353d60-ff65-4a67-a566-00ef9a757cfb","Type":"ContainerStarted","Data":"6e398807adfd475317cd23a7c0cf28bf3e7376adce3bb099f5e0b2e045dd7d96"} Mar 18 13:02:23 crc kubenswrapper[4975]: I0318 13:02:23.738951 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs" podStartSLOduration=1.233331143 podStartE2EDuration="1.738933557s" podCreationTimestamp="2026-03-18 13:02:22 +0000 UTC" firstStartedPulling="2026-03-18 13:02:22.910386202 +0000 UTC m=+3128.624786781" lastFinishedPulling="2026-03-18 13:02:23.415988616 +0000 UTC m=+3129.130389195" observedRunningTime="2026-03-18 13:02:23.73427738 +0000 UTC m=+3129.448678009" watchObservedRunningTime="2026-03-18 13:02:23.738933557 +0000 UTC m=+3129.453334136" Mar 18 13:02:27 crc kubenswrapper[4975]: I0318 13:02:27.016770 4975 scope.go:117] "RemoveContainer" containerID="2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46" Mar 18 13:02:27 crc kubenswrapper[4975]: E0318 
13:02:27.017360 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:02:41 crc kubenswrapper[4975]: I0318 13:02:41.066765 4975 scope.go:117] "RemoveContainer" containerID="2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46" Mar 18 13:02:41 crc kubenswrapper[4975]: E0318 13:02:41.067848 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:02:48 crc kubenswrapper[4975]: I0318 13:02:48.338482 4975 scope.go:117] "RemoveContainer" containerID="6105825e8f538b7cd6cb7329ba3b6bf68588699dd26992d82765640f2071d46f" Mar 18 13:02:56 crc kubenswrapper[4975]: I0318 13:02:56.017857 4975 scope.go:117] "RemoveContainer" containerID="2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46" Mar 18 13:02:56 crc kubenswrapper[4975]: E0318 13:02:56.019352 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:03:09 crc 
kubenswrapper[4975]: I0318 13:03:09.016508 4975 scope.go:117] "RemoveContainer" containerID="2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46" Mar 18 13:03:09 crc kubenswrapper[4975]: E0318 13:03:09.017305 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:03:11 crc kubenswrapper[4975]: I0318 13:03:11.442340 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wzcx2"] Mar 18 13:03:11 crc kubenswrapper[4975]: I0318 13:03:11.445845 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wzcx2" Mar 18 13:03:11 crc kubenswrapper[4975]: I0318 13:03:11.462300 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wzcx2"] Mar 18 13:03:11 crc kubenswrapper[4975]: I0318 13:03:11.543316 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb55997d-1ca3-4e66-a980-26ca7840ee2d-utilities\") pod \"redhat-operators-wzcx2\" (UID: \"fb55997d-1ca3-4e66-a980-26ca7840ee2d\") " pod="openshift-marketplace/redhat-operators-wzcx2" Mar 18 13:03:11 crc kubenswrapper[4975]: I0318 13:03:11.543438 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb55997d-1ca3-4e66-a980-26ca7840ee2d-catalog-content\") pod \"redhat-operators-wzcx2\" (UID: \"fb55997d-1ca3-4e66-a980-26ca7840ee2d\") " pod="openshift-marketplace/redhat-operators-wzcx2" Mar 18 
13:03:11 crc kubenswrapper[4975]: I0318 13:03:11.543676 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdqw8\" (UniqueName: \"kubernetes.io/projected/fb55997d-1ca3-4e66-a980-26ca7840ee2d-kube-api-access-pdqw8\") pod \"redhat-operators-wzcx2\" (UID: \"fb55997d-1ca3-4e66-a980-26ca7840ee2d\") " pod="openshift-marketplace/redhat-operators-wzcx2" Mar 18 13:03:11 crc kubenswrapper[4975]: I0318 13:03:11.645356 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb55997d-1ca3-4e66-a980-26ca7840ee2d-utilities\") pod \"redhat-operators-wzcx2\" (UID: \"fb55997d-1ca3-4e66-a980-26ca7840ee2d\") " pod="openshift-marketplace/redhat-operators-wzcx2" Mar 18 13:03:11 crc kubenswrapper[4975]: I0318 13:03:11.645656 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb55997d-1ca3-4e66-a980-26ca7840ee2d-catalog-content\") pod \"redhat-operators-wzcx2\" (UID: \"fb55997d-1ca3-4e66-a980-26ca7840ee2d\") " pod="openshift-marketplace/redhat-operators-wzcx2" Mar 18 13:03:11 crc kubenswrapper[4975]: I0318 13:03:11.645854 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdqw8\" (UniqueName: \"kubernetes.io/projected/fb55997d-1ca3-4e66-a980-26ca7840ee2d-kube-api-access-pdqw8\") pod \"redhat-operators-wzcx2\" (UID: \"fb55997d-1ca3-4e66-a980-26ca7840ee2d\") " pod="openshift-marketplace/redhat-operators-wzcx2" Mar 18 13:03:11 crc kubenswrapper[4975]: I0318 13:03:11.646077 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb55997d-1ca3-4e66-a980-26ca7840ee2d-utilities\") pod \"redhat-operators-wzcx2\" (UID: \"fb55997d-1ca3-4e66-a980-26ca7840ee2d\") " pod="openshift-marketplace/redhat-operators-wzcx2" Mar 18 13:03:11 crc kubenswrapper[4975]: 
I0318 13:03:11.646086 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb55997d-1ca3-4e66-a980-26ca7840ee2d-catalog-content\") pod \"redhat-operators-wzcx2\" (UID: \"fb55997d-1ca3-4e66-a980-26ca7840ee2d\") " pod="openshift-marketplace/redhat-operators-wzcx2" Mar 18 13:03:11 crc kubenswrapper[4975]: I0318 13:03:11.665858 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdqw8\" (UniqueName: \"kubernetes.io/projected/fb55997d-1ca3-4e66-a980-26ca7840ee2d-kube-api-access-pdqw8\") pod \"redhat-operators-wzcx2\" (UID: \"fb55997d-1ca3-4e66-a980-26ca7840ee2d\") " pod="openshift-marketplace/redhat-operators-wzcx2" Mar 18 13:03:11 crc kubenswrapper[4975]: I0318 13:03:11.771052 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wzcx2" Mar 18 13:03:12 crc kubenswrapper[4975]: I0318 13:03:12.226768 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wzcx2"] Mar 18 13:03:12 crc kubenswrapper[4975]: I0318 13:03:12.254833 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wzcx2" event={"ID":"fb55997d-1ca3-4e66-a980-26ca7840ee2d","Type":"ContainerStarted","Data":"ca8aaa15ee40611afe6889696c3579a819ea6c81f11dc379dd829035b7eaff15"} Mar 18 13:03:13 crc kubenswrapper[4975]: I0318 13:03:13.268088 4975 generic.go:334] "Generic (PLEG): container finished" podID="fb55997d-1ca3-4e66-a980-26ca7840ee2d" containerID="b938e8eec273364a0d9c6c1e5a419c2b6e4fea5e833632dd9cd62d782bbc946f" exitCode=0 Mar 18 13:03:13 crc kubenswrapper[4975]: I0318 13:03:13.268145 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wzcx2" event={"ID":"fb55997d-1ca3-4e66-a980-26ca7840ee2d","Type":"ContainerDied","Data":"b938e8eec273364a0d9c6c1e5a419c2b6e4fea5e833632dd9cd62d782bbc946f"} Mar 18 
13:03:15 crc kubenswrapper[4975]: I0318 13:03:15.311131 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wzcx2" event={"ID":"fb55997d-1ca3-4e66-a980-26ca7840ee2d","Type":"ContainerStarted","Data":"20d3b2f268d170ff0f16219f2a54c3db56425f46c126680a6fda892109343642"} Mar 18 13:03:17 crc kubenswrapper[4975]: I0318 13:03:17.332692 4975 generic.go:334] "Generic (PLEG): container finished" podID="fb55997d-1ca3-4e66-a980-26ca7840ee2d" containerID="20d3b2f268d170ff0f16219f2a54c3db56425f46c126680a6fda892109343642" exitCode=0 Mar 18 13:03:17 crc kubenswrapper[4975]: I0318 13:03:17.332776 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wzcx2" event={"ID":"fb55997d-1ca3-4e66-a980-26ca7840ee2d","Type":"ContainerDied","Data":"20d3b2f268d170ff0f16219f2a54c3db56425f46c126680a6fda892109343642"} Mar 18 13:03:18 crc kubenswrapper[4975]: I0318 13:03:18.347134 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wzcx2" event={"ID":"fb55997d-1ca3-4e66-a980-26ca7840ee2d","Type":"ContainerStarted","Data":"37c26cb0fadc3b82709043e02cdd18b7e16b091c1987ebcf595429fcc81ef7a3"} Mar 18 13:03:18 crc kubenswrapper[4975]: I0318 13:03:18.376153 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wzcx2" podStartSLOduration=2.924950289 podStartE2EDuration="7.376106965s" podCreationTimestamp="2026-03-18 13:03:11 +0000 UTC" firstStartedPulling="2026-03-18 13:03:13.26999424 +0000 UTC m=+3178.984394819" lastFinishedPulling="2026-03-18 13:03:17.721150906 +0000 UTC m=+3183.435551495" observedRunningTime="2026-03-18 13:03:18.370488742 +0000 UTC m=+3184.084889321" watchObservedRunningTime="2026-03-18 13:03:18.376106965 +0000 UTC m=+3184.090507544" Mar 18 13:03:20 crc kubenswrapper[4975]: I0318 13:03:20.016336 4975 scope.go:117] "RemoveContainer" 
containerID="2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46" Mar 18 13:03:20 crc kubenswrapper[4975]: E0318 13:03:20.017133 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:03:21 crc kubenswrapper[4975]: I0318 13:03:21.771706 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wzcx2" Mar 18 13:03:21 crc kubenswrapper[4975]: I0318 13:03:21.771774 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wzcx2" Mar 18 13:03:22 crc kubenswrapper[4975]: I0318 13:03:22.824468 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wzcx2" podUID="fb55997d-1ca3-4e66-a980-26ca7840ee2d" containerName="registry-server" probeResult="failure" output=< Mar 18 13:03:22 crc kubenswrapper[4975]: timeout: failed to connect service ":50051" within 1s Mar 18 13:03:22 crc kubenswrapper[4975]: > Mar 18 13:03:31 crc kubenswrapper[4975]: I0318 13:03:31.835220 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wzcx2" Mar 18 13:03:31 crc kubenswrapper[4975]: I0318 13:03:31.882773 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wzcx2" Mar 18 13:03:32 crc kubenswrapper[4975]: I0318 13:03:32.071450 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wzcx2"] Mar 18 13:03:33 crc kubenswrapper[4975]: I0318 13:03:33.491844 4975 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wzcx2" podUID="fb55997d-1ca3-4e66-a980-26ca7840ee2d" containerName="registry-server" containerID="cri-o://37c26cb0fadc3b82709043e02cdd18b7e16b091c1987ebcf595429fcc81ef7a3" gracePeriod=2 Mar 18 13:03:34 crc kubenswrapper[4975]: I0318 13:03:34.508975 4975 generic.go:334] "Generic (PLEG): container finished" podID="fb55997d-1ca3-4e66-a980-26ca7840ee2d" containerID="37c26cb0fadc3b82709043e02cdd18b7e16b091c1987ebcf595429fcc81ef7a3" exitCode=0 Mar 18 13:03:34 crc kubenswrapper[4975]: I0318 13:03:34.509101 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wzcx2" event={"ID":"fb55997d-1ca3-4e66-a980-26ca7840ee2d","Type":"ContainerDied","Data":"37c26cb0fadc3b82709043e02cdd18b7e16b091c1987ebcf595429fcc81ef7a3"} Mar 18 13:03:34 crc kubenswrapper[4975]: I0318 13:03:34.509466 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wzcx2" event={"ID":"fb55997d-1ca3-4e66-a980-26ca7840ee2d","Type":"ContainerDied","Data":"ca8aaa15ee40611afe6889696c3579a819ea6c81f11dc379dd829035b7eaff15"} Mar 18 13:03:34 crc kubenswrapper[4975]: I0318 13:03:34.509526 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca8aaa15ee40611afe6889696c3579a819ea6c81f11dc379dd829035b7eaff15" Mar 18 13:03:34 crc kubenswrapper[4975]: I0318 13:03:34.545466 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wzcx2" Mar 18 13:03:34 crc kubenswrapper[4975]: I0318 13:03:34.710249 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdqw8\" (UniqueName: \"kubernetes.io/projected/fb55997d-1ca3-4e66-a980-26ca7840ee2d-kube-api-access-pdqw8\") pod \"fb55997d-1ca3-4e66-a980-26ca7840ee2d\" (UID: \"fb55997d-1ca3-4e66-a980-26ca7840ee2d\") " Mar 18 13:03:34 crc kubenswrapper[4975]: I0318 13:03:34.710468 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb55997d-1ca3-4e66-a980-26ca7840ee2d-catalog-content\") pod \"fb55997d-1ca3-4e66-a980-26ca7840ee2d\" (UID: \"fb55997d-1ca3-4e66-a980-26ca7840ee2d\") " Mar 18 13:03:34 crc kubenswrapper[4975]: I0318 13:03:34.710819 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb55997d-1ca3-4e66-a980-26ca7840ee2d-utilities\") pod \"fb55997d-1ca3-4e66-a980-26ca7840ee2d\" (UID: \"fb55997d-1ca3-4e66-a980-26ca7840ee2d\") " Mar 18 13:03:34 crc kubenswrapper[4975]: I0318 13:03:34.712023 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb55997d-1ca3-4e66-a980-26ca7840ee2d-utilities" (OuterVolumeSpecName: "utilities") pod "fb55997d-1ca3-4e66-a980-26ca7840ee2d" (UID: "fb55997d-1ca3-4e66-a980-26ca7840ee2d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:03:34 crc kubenswrapper[4975]: I0318 13:03:34.724675 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb55997d-1ca3-4e66-a980-26ca7840ee2d-kube-api-access-pdqw8" (OuterVolumeSpecName: "kube-api-access-pdqw8") pod "fb55997d-1ca3-4e66-a980-26ca7840ee2d" (UID: "fb55997d-1ca3-4e66-a980-26ca7840ee2d"). InnerVolumeSpecName "kube-api-access-pdqw8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:03:34 crc kubenswrapper[4975]: I0318 13:03:34.813589 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdqw8\" (UniqueName: \"kubernetes.io/projected/fb55997d-1ca3-4e66-a980-26ca7840ee2d-kube-api-access-pdqw8\") on node \"crc\" DevicePath \"\"" Mar 18 13:03:34 crc kubenswrapper[4975]: I0318 13:03:34.813629 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb55997d-1ca3-4e66-a980-26ca7840ee2d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:03:34 crc kubenswrapper[4975]: I0318 13:03:34.857679 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb55997d-1ca3-4e66-a980-26ca7840ee2d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb55997d-1ca3-4e66-a980-26ca7840ee2d" (UID: "fb55997d-1ca3-4e66-a980-26ca7840ee2d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:03:34 crc kubenswrapper[4975]: I0318 13:03:34.916204 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb55997d-1ca3-4e66-a980-26ca7840ee2d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:03:35 crc kubenswrapper[4975]: I0318 13:03:35.024446 4975 scope.go:117] "RemoveContainer" containerID="2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46" Mar 18 13:03:35 crc kubenswrapper[4975]: E0318 13:03:35.024824 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:03:35 
crc kubenswrapper[4975]: I0318 13:03:35.521109 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wzcx2" Mar 18 13:03:35 crc kubenswrapper[4975]: I0318 13:03:35.556997 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wzcx2"] Mar 18 13:03:35 crc kubenswrapper[4975]: I0318 13:03:35.587355 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wzcx2"] Mar 18 13:03:37 crc kubenswrapper[4975]: I0318 13:03:37.026352 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb55997d-1ca3-4e66-a980-26ca7840ee2d" path="/var/lib/kubelet/pods/fb55997d-1ca3-4e66-a980-26ca7840ee2d/volumes" Mar 18 13:03:50 crc kubenswrapper[4975]: I0318 13:03:50.016691 4975 scope.go:117] "RemoveContainer" containerID="2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46" Mar 18 13:03:50 crc kubenswrapper[4975]: E0318 13:03:50.017701 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:04:00 crc kubenswrapper[4975]: I0318 13:04:00.179483 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563984-j4rgx"] Mar 18 13:04:00 crc kubenswrapper[4975]: E0318 13:04:00.180585 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb55997d-1ca3-4e66-a980-26ca7840ee2d" containerName="extract-utilities" Mar 18 13:04:00 crc kubenswrapper[4975]: I0318 13:04:00.180603 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb55997d-1ca3-4e66-a980-26ca7840ee2d" 
containerName="extract-utilities" Mar 18 13:04:00 crc kubenswrapper[4975]: E0318 13:04:00.180636 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb55997d-1ca3-4e66-a980-26ca7840ee2d" containerName="registry-server" Mar 18 13:04:00 crc kubenswrapper[4975]: I0318 13:04:00.180644 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb55997d-1ca3-4e66-a980-26ca7840ee2d" containerName="registry-server" Mar 18 13:04:00 crc kubenswrapper[4975]: E0318 13:04:00.180662 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb55997d-1ca3-4e66-a980-26ca7840ee2d" containerName="extract-content" Mar 18 13:04:00 crc kubenswrapper[4975]: I0318 13:04:00.180669 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb55997d-1ca3-4e66-a980-26ca7840ee2d" containerName="extract-content" Mar 18 13:04:00 crc kubenswrapper[4975]: I0318 13:04:00.180925 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb55997d-1ca3-4e66-a980-26ca7840ee2d" containerName="registry-server" Mar 18 13:04:00 crc kubenswrapper[4975]: I0318 13:04:00.181676 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563984-j4rgx" Mar 18 13:04:00 crc kubenswrapper[4975]: I0318 13:04:00.184792 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 13:04:00 crc kubenswrapper[4975]: I0318 13:04:00.186108 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:04:00 crc kubenswrapper[4975]: I0318 13:04:00.188221 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:04:00 crc kubenswrapper[4975]: I0318 13:04:00.191651 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563984-j4rgx"] Mar 18 13:04:00 crc kubenswrapper[4975]: I0318 13:04:00.245144 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndsbj\" (UniqueName: \"kubernetes.io/projected/44fdccba-8590-4cbf-9128-93e1738a6909-kube-api-access-ndsbj\") pod \"auto-csr-approver-29563984-j4rgx\" (UID: \"44fdccba-8590-4cbf-9128-93e1738a6909\") " pod="openshift-infra/auto-csr-approver-29563984-j4rgx" Mar 18 13:04:00 crc kubenswrapper[4975]: I0318 13:04:00.346805 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndsbj\" (UniqueName: \"kubernetes.io/projected/44fdccba-8590-4cbf-9128-93e1738a6909-kube-api-access-ndsbj\") pod \"auto-csr-approver-29563984-j4rgx\" (UID: \"44fdccba-8590-4cbf-9128-93e1738a6909\") " pod="openshift-infra/auto-csr-approver-29563984-j4rgx" Mar 18 13:04:00 crc kubenswrapper[4975]: I0318 13:04:00.367100 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndsbj\" (UniqueName: \"kubernetes.io/projected/44fdccba-8590-4cbf-9128-93e1738a6909-kube-api-access-ndsbj\") pod \"auto-csr-approver-29563984-j4rgx\" (UID: \"44fdccba-8590-4cbf-9128-93e1738a6909\") " 
pod="openshift-infra/auto-csr-approver-29563984-j4rgx" Mar 18 13:04:00 crc kubenswrapper[4975]: I0318 13:04:00.506797 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563984-j4rgx" Mar 18 13:04:00 crc kubenswrapper[4975]: I0318 13:04:00.993166 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563984-j4rgx"] Mar 18 13:04:00 crc kubenswrapper[4975]: W0318 13:04:00.998605 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44fdccba_8590_4cbf_9128_93e1738a6909.slice/crio-97c28668896bf532086e3019a3a47832955ed8d4c3212b6e63c1a33431469a19 WatchSource:0}: Error finding container 97c28668896bf532086e3019a3a47832955ed8d4c3212b6e63c1a33431469a19: Status 404 returned error can't find the container with id 97c28668896bf532086e3019a3a47832955ed8d4c3212b6e63c1a33431469a19 Mar 18 13:04:01 crc kubenswrapper[4975]: I0318 13:04:01.785454 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563984-j4rgx" event={"ID":"44fdccba-8590-4cbf-9128-93e1738a6909","Type":"ContainerStarted","Data":"97c28668896bf532086e3019a3a47832955ed8d4c3212b6e63c1a33431469a19"} Mar 18 13:04:02 crc kubenswrapper[4975]: I0318 13:04:02.016767 4975 scope.go:117] "RemoveContainer" containerID="2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46" Mar 18 13:04:02 crc kubenswrapper[4975]: E0318 13:04:02.016999 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:04:02 crc 
kubenswrapper[4975]: I0318 13:04:02.796135 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563984-j4rgx" event={"ID":"44fdccba-8590-4cbf-9128-93e1738a6909","Type":"ContainerStarted","Data":"0ddb6273c6a772f4bfc1305848f7e126880998306439a99942521c5baa3013b4"} Mar 18 13:04:02 crc kubenswrapper[4975]: I0318 13:04:02.815312 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563984-j4rgx" podStartSLOduration=1.644567174 podStartE2EDuration="2.815293722s" podCreationTimestamp="2026-03-18 13:04:00 +0000 UTC" firstStartedPulling="2026-03-18 13:04:01.001813957 +0000 UTC m=+3226.716214586" lastFinishedPulling="2026-03-18 13:04:02.172540555 +0000 UTC m=+3227.886941134" observedRunningTime="2026-03-18 13:04:02.807841599 +0000 UTC m=+3228.522242178" watchObservedRunningTime="2026-03-18 13:04:02.815293722 +0000 UTC m=+3228.529694301" Mar 18 13:04:03 crc kubenswrapper[4975]: I0318 13:04:03.806524 4975 generic.go:334] "Generic (PLEG): container finished" podID="44fdccba-8590-4cbf-9128-93e1738a6909" containerID="0ddb6273c6a772f4bfc1305848f7e126880998306439a99942521c5baa3013b4" exitCode=0 Mar 18 13:04:03 crc kubenswrapper[4975]: I0318 13:04:03.806574 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563984-j4rgx" event={"ID":"44fdccba-8590-4cbf-9128-93e1738a6909","Type":"ContainerDied","Data":"0ddb6273c6a772f4bfc1305848f7e126880998306439a99942521c5baa3013b4"} Mar 18 13:04:05 crc kubenswrapper[4975]: I0318 13:04:05.167687 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563984-j4rgx" Mar 18 13:04:05 crc kubenswrapper[4975]: I0318 13:04:05.235380 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndsbj\" (UniqueName: \"kubernetes.io/projected/44fdccba-8590-4cbf-9128-93e1738a6909-kube-api-access-ndsbj\") pod \"44fdccba-8590-4cbf-9128-93e1738a6909\" (UID: \"44fdccba-8590-4cbf-9128-93e1738a6909\") " Mar 18 13:04:05 crc kubenswrapper[4975]: I0318 13:04:05.241667 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44fdccba-8590-4cbf-9128-93e1738a6909-kube-api-access-ndsbj" (OuterVolumeSpecName: "kube-api-access-ndsbj") pod "44fdccba-8590-4cbf-9128-93e1738a6909" (UID: "44fdccba-8590-4cbf-9128-93e1738a6909"). InnerVolumeSpecName "kube-api-access-ndsbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:05 crc kubenswrapper[4975]: I0318 13:04:05.337672 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndsbj\" (UniqueName: \"kubernetes.io/projected/44fdccba-8590-4cbf-9128-93e1738a6909-kube-api-access-ndsbj\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:05 crc kubenswrapper[4975]: I0318 13:04:05.832593 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563984-j4rgx" event={"ID":"44fdccba-8590-4cbf-9128-93e1738a6909","Type":"ContainerDied","Data":"97c28668896bf532086e3019a3a47832955ed8d4c3212b6e63c1a33431469a19"} Mar 18 13:04:05 crc kubenswrapper[4975]: I0318 13:04:05.833059 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97c28668896bf532086e3019a3a47832955ed8d4c3212b6e63c1a33431469a19" Mar 18 13:04:05 crc kubenswrapper[4975]: I0318 13:04:05.832691 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563984-j4rgx" Mar 18 13:04:05 crc kubenswrapper[4975]: I0318 13:04:05.907092 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563978-5g5qb"] Mar 18 13:04:05 crc kubenswrapper[4975]: I0318 13:04:05.918329 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563978-5g5qb"] Mar 18 13:04:07 crc kubenswrapper[4975]: I0318 13:04:07.028375 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e9aebc1-ef2c-497e-97e1-793bdbacd425" path="/var/lib/kubelet/pods/5e9aebc1-ef2c-497e-97e1-793bdbacd425/volumes" Mar 18 13:04:07 crc kubenswrapper[4975]: I0318 13:04:07.854894 4975 generic.go:334] "Generic (PLEG): container finished" podID="c5353d60-ff65-4a67-a566-00ef9a757cfb" containerID="fb1abfc582c9ae45ae19201d607e4d9e2a41fba200bbaaa9be3501273a7ddf09" exitCode=2 Mar 18 13:04:07 crc kubenswrapper[4975]: I0318 13:04:07.854956 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs" event={"ID":"c5353d60-ff65-4a67-a566-00ef9a757cfb","Type":"ContainerDied","Data":"fb1abfc582c9ae45ae19201d607e4d9e2a41fba200bbaaa9be3501273a7ddf09"} Mar 18 13:04:09 crc kubenswrapper[4975]: I0318 13:04:09.273058 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs" Mar 18 13:04:09 crc kubenswrapper[4975]: I0318 13:04:09.323736 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5353d60-ff65-4a67-a566-00ef9a757cfb-ssh-key-openstack-edpm-ipam\") pod \"c5353d60-ff65-4a67-a566-00ef9a757cfb\" (UID: \"c5353d60-ff65-4a67-a566-00ef9a757cfb\") " Mar 18 13:04:09 crc kubenswrapper[4975]: I0318 13:04:09.323887 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxcjr\" (UniqueName: \"kubernetes.io/projected/c5353d60-ff65-4a67-a566-00ef9a757cfb-kube-api-access-vxcjr\") pod \"c5353d60-ff65-4a67-a566-00ef9a757cfb\" (UID: \"c5353d60-ff65-4a67-a566-00ef9a757cfb\") " Mar 18 13:04:09 crc kubenswrapper[4975]: I0318 13:04:09.323926 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c5353d60-ff65-4a67-a566-00ef9a757cfb-libvirt-secret-0\") pod \"c5353d60-ff65-4a67-a566-00ef9a757cfb\" (UID: \"c5353d60-ff65-4a67-a566-00ef9a757cfb\") " Mar 18 13:04:09 crc kubenswrapper[4975]: I0318 13:04:09.324071 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5353d60-ff65-4a67-a566-00ef9a757cfb-inventory\") pod \"c5353d60-ff65-4a67-a566-00ef9a757cfb\" (UID: \"c5353d60-ff65-4a67-a566-00ef9a757cfb\") " Mar 18 13:04:09 crc kubenswrapper[4975]: I0318 13:04:09.324124 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5353d60-ff65-4a67-a566-00ef9a757cfb-libvirt-combined-ca-bundle\") pod \"c5353d60-ff65-4a67-a566-00ef9a757cfb\" (UID: \"c5353d60-ff65-4a67-a566-00ef9a757cfb\") " Mar 18 13:04:09 crc kubenswrapper[4975]: I0318 13:04:09.330533 4975 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5353d60-ff65-4a67-a566-00ef9a757cfb-kube-api-access-vxcjr" (OuterVolumeSpecName: "kube-api-access-vxcjr") pod "c5353d60-ff65-4a67-a566-00ef9a757cfb" (UID: "c5353d60-ff65-4a67-a566-00ef9a757cfb"). InnerVolumeSpecName "kube-api-access-vxcjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:04:09 crc kubenswrapper[4975]: I0318 13:04:09.331111 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5353d60-ff65-4a67-a566-00ef9a757cfb-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "c5353d60-ff65-4a67-a566-00ef9a757cfb" (UID: "c5353d60-ff65-4a67-a566-00ef9a757cfb"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:09 crc kubenswrapper[4975]: I0318 13:04:09.362581 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5353d60-ff65-4a67-a566-00ef9a757cfb-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "c5353d60-ff65-4a67-a566-00ef9a757cfb" (UID: "c5353d60-ff65-4a67-a566-00ef9a757cfb"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:09 crc kubenswrapper[4975]: I0318 13:04:09.362997 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5353d60-ff65-4a67-a566-00ef9a757cfb-inventory" (OuterVolumeSpecName: "inventory") pod "c5353d60-ff65-4a67-a566-00ef9a757cfb" (UID: "c5353d60-ff65-4a67-a566-00ef9a757cfb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:09 crc kubenswrapper[4975]: I0318 13:04:09.367497 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5353d60-ff65-4a67-a566-00ef9a757cfb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c5353d60-ff65-4a67-a566-00ef9a757cfb" (UID: "c5353d60-ff65-4a67-a566-00ef9a757cfb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:04:09 crc kubenswrapper[4975]: I0318 13:04:09.425821 4975 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5353d60-ff65-4a67-a566-00ef9a757cfb-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:09 crc kubenswrapper[4975]: I0318 13:04:09.425855 4975 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5353d60-ff65-4a67-a566-00ef9a757cfb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:09 crc kubenswrapper[4975]: I0318 13:04:09.425881 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxcjr\" (UniqueName: \"kubernetes.io/projected/c5353d60-ff65-4a67-a566-00ef9a757cfb-kube-api-access-vxcjr\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:09 crc kubenswrapper[4975]: I0318 13:04:09.425894 4975 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c5353d60-ff65-4a67-a566-00ef9a757cfb-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:09 crc kubenswrapper[4975]: I0318 13:04:09.425906 4975 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5353d60-ff65-4a67-a566-00ef9a757cfb-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 13:04:09 crc kubenswrapper[4975]: I0318 13:04:09.879666 4975 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs" event={"ID":"c5353d60-ff65-4a67-a566-00ef9a757cfb","Type":"ContainerDied","Data":"6e398807adfd475317cd23a7c0cf28bf3e7376adce3bb099f5e0b2e045dd7d96"} Mar 18 13:04:09 crc kubenswrapper[4975]: I0318 13:04:09.879722 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e398807adfd475317cd23a7c0cf28bf3e7376adce3bb099f5e0b2e045dd7d96" Mar 18 13:04:09 crc kubenswrapper[4975]: I0318 13:04:09.879732 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52whs" Mar 18 13:04:17 crc kubenswrapper[4975]: I0318 13:04:17.016807 4975 scope.go:117] "RemoveContainer" containerID="2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46" Mar 18 13:04:17 crc kubenswrapper[4975]: E0318 13:04:17.017855 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:04:32 crc kubenswrapper[4975]: I0318 13:04:32.016911 4975 scope.go:117] "RemoveContainer" containerID="2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46" Mar 18 13:04:33 crc kubenswrapper[4975]: I0318 13:04:33.192398 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerStarted","Data":"944195ba0ad1578298e3c6fc0e21df9b364042964b2ee984b02bc981fb450f3a"} Mar 18 13:04:48 crc kubenswrapper[4975]: I0318 13:04:48.469712 4975 scope.go:117] "RemoveContainer" 
containerID="bdb336c5a5a94a634be016759623a8a39151e69be61cf4e069d329bcc93d7fc9" Mar 18 13:06:00 crc kubenswrapper[4975]: I0318 13:06:00.149565 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563986-279c5"] Mar 18 13:06:00 crc kubenswrapper[4975]: E0318 13:06:00.150554 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5353d60-ff65-4a67-a566-00ef9a757cfb" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 13:06:00 crc kubenswrapper[4975]: I0318 13:06:00.150568 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5353d60-ff65-4a67-a566-00ef9a757cfb" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 13:06:00 crc kubenswrapper[4975]: E0318 13:06:00.150590 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44fdccba-8590-4cbf-9128-93e1738a6909" containerName="oc" Mar 18 13:06:00 crc kubenswrapper[4975]: I0318 13:06:00.150596 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="44fdccba-8590-4cbf-9128-93e1738a6909" containerName="oc" Mar 18 13:06:00 crc kubenswrapper[4975]: I0318 13:06:00.150813 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5353d60-ff65-4a67-a566-00ef9a757cfb" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 13:06:00 crc kubenswrapper[4975]: I0318 13:06:00.150825 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="44fdccba-8590-4cbf-9128-93e1738a6909" containerName="oc" Mar 18 13:06:00 crc kubenswrapper[4975]: I0318 13:06:00.151527 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563986-279c5" Mar 18 13:06:00 crc kubenswrapper[4975]: I0318 13:06:00.154613 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 13:06:00 crc kubenswrapper[4975]: I0318 13:06:00.154878 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:06:00 crc kubenswrapper[4975]: I0318 13:06:00.155098 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:06:00 crc kubenswrapper[4975]: I0318 13:06:00.159851 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563986-279c5"] Mar 18 13:06:00 crc kubenswrapper[4975]: I0318 13:06:00.197661 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdpkk\" (UniqueName: \"kubernetes.io/projected/0e9fa60c-b743-4dc4-a44f-a5a8a3f290cb-kube-api-access-cdpkk\") pod \"auto-csr-approver-29563986-279c5\" (UID: \"0e9fa60c-b743-4dc4-a44f-a5a8a3f290cb\") " pod="openshift-infra/auto-csr-approver-29563986-279c5" Mar 18 13:06:00 crc kubenswrapper[4975]: I0318 13:06:00.298976 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdpkk\" (UniqueName: \"kubernetes.io/projected/0e9fa60c-b743-4dc4-a44f-a5a8a3f290cb-kube-api-access-cdpkk\") pod \"auto-csr-approver-29563986-279c5\" (UID: \"0e9fa60c-b743-4dc4-a44f-a5a8a3f290cb\") " pod="openshift-infra/auto-csr-approver-29563986-279c5" Mar 18 13:06:00 crc kubenswrapper[4975]: I0318 13:06:00.319390 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdpkk\" (UniqueName: \"kubernetes.io/projected/0e9fa60c-b743-4dc4-a44f-a5a8a3f290cb-kube-api-access-cdpkk\") pod \"auto-csr-approver-29563986-279c5\" (UID: \"0e9fa60c-b743-4dc4-a44f-a5a8a3f290cb\") " 
pod="openshift-infra/auto-csr-approver-29563986-279c5" Mar 18 13:06:00 crc kubenswrapper[4975]: I0318 13:06:00.472269 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563986-279c5" Mar 18 13:06:00 crc kubenswrapper[4975]: I0318 13:06:00.915524 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563986-279c5"] Mar 18 13:06:01 crc kubenswrapper[4975]: I0318 13:06:01.004218 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563986-279c5" event={"ID":"0e9fa60c-b743-4dc4-a44f-a5a8a3f290cb","Type":"ContainerStarted","Data":"60daa325d3afee5c9f827b93f87cdd5af88eefcced6e46437320aaf54fdbcfcc"} Mar 18 13:06:03 crc kubenswrapper[4975]: I0318 13:06:03.023780 4975 generic.go:334] "Generic (PLEG): container finished" podID="0e9fa60c-b743-4dc4-a44f-a5a8a3f290cb" containerID="24dc7b1ce1e1d1f71f6b730a952d2ee589b8a774d2df3ea9f292924cacd27abb" exitCode=0 Mar 18 13:06:03 crc kubenswrapper[4975]: I0318 13:06:03.027770 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563986-279c5" event={"ID":"0e9fa60c-b743-4dc4-a44f-a5a8a3f290cb","Type":"ContainerDied","Data":"24dc7b1ce1e1d1f71f6b730a952d2ee589b8a774d2df3ea9f292924cacd27abb"} Mar 18 13:06:04 crc kubenswrapper[4975]: I0318 13:06:04.399600 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563986-279c5" Mar 18 13:06:04 crc kubenswrapper[4975]: I0318 13:06:04.581666 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdpkk\" (UniqueName: \"kubernetes.io/projected/0e9fa60c-b743-4dc4-a44f-a5a8a3f290cb-kube-api-access-cdpkk\") pod \"0e9fa60c-b743-4dc4-a44f-a5a8a3f290cb\" (UID: \"0e9fa60c-b743-4dc4-a44f-a5a8a3f290cb\") " Mar 18 13:06:04 crc kubenswrapper[4975]: I0318 13:06:04.588630 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e9fa60c-b743-4dc4-a44f-a5a8a3f290cb-kube-api-access-cdpkk" (OuterVolumeSpecName: "kube-api-access-cdpkk") pod "0e9fa60c-b743-4dc4-a44f-a5a8a3f290cb" (UID: "0e9fa60c-b743-4dc4-a44f-a5a8a3f290cb"). InnerVolumeSpecName "kube-api-access-cdpkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:06:04 crc kubenswrapper[4975]: I0318 13:06:04.683411 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdpkk\" (UniqueName: \"kubernetes.io/projected/0e9fa60c-b743-4dc4-a44f-a5a8a3f290cb-kube-api-access-cdpkk\") on node \"crc\" DevicePath \"\"" Mar 18 13:06:05 crc kubenswrapper[4975]: I0318 13:06:05.043278 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563986-279c5" event={"ID":"0e9fa60c-b743-4dc4-a44f-a5a8a3f290cb","Type":"ContainerDied","Data":"60daa325d3afee5c9f827b93f87cdd5af88eefcced6e46437320aaf54fdbcfcc"} Mar 18 13:06:05 crc kubenswrapper[4975]: I0318 13:06:05.043313 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60daa325d3afee5c9f827b93f87cdd5af88eefcced6e46437320aaf54fdbcfcc" Mar 18 13:06:05 crc kubenswrapper[4975]: I0318 13:06:05.043363 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563986-279c5" Mar 18 13:06:05 crc kubenswrapper[4975]: I0318 13:06:05.483331 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563980-kg8bn"] Mar 18 13:06:05 crc kubenswrapper[4975]: I0318 13:06:05.497334 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563980-kg8bn"] Mar 18 13:06:07 crc kubenswrapper[4975]: I0318 13:06:07.031452 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c54475a-e898-4247-bcca-111f5072e571" path="/var/lib/kubelet/pods/8c54475a-e898-4247-bcca-111f5072e571/volumes" Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.040133 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n"] Mar 18 13:06:46 crc kubenswrapper[4975]: E0318 13:06:46.041175 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e9fa60c-b743-4dc4-a44f-a5a8a3f290cb" containerName="oc" Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.041195 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9fa60c-b743-4dc4-a44f-a5a8a3f290cb" containerName="oc" Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.041489 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e9fa60c-b743-4dc4-a44f-a5a8a3f290cb" containerName="oc" Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.042324 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n" Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.047154 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-77rz6" Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.048056 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.048163 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.048270 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.048550 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.068479 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n"] Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.209393 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n\" (UID: \"02e6a4ac-05a2-4da9-8494-3e01a9d443ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n" Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.209540 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n\" (UID: \"02e6a4ac-05a2-4da9-8494-3e01a9d443ff\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n" Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.209589 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmf9p\" (UniqueName: \"kubernetes.io/projected/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-kube-api-access-nmf9p\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n\" (UID: \"02e6a4ac-05a2-4da9-8494-3e01a9d443ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n" Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.209617 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n\" (UID: \"02e6a4ac-05a2-4da9-8494-3e01a9d443ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n" Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.209690 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n\" (UID: \"02e6a4ac-05a2-4da9-8494-3e01a9d443ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n" Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.311891 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmf9p\" (UniqueName: \"kubernetes.io/projected/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-kube-api-access-nmf9p\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n\" (UID: \"02e6a4ac-05a2-4da9-8494-3e01a9d443ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n" Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.311963 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n\" (UID: \"02e6a4ac-05a2-4da9-8494-3e01a9d443ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n" Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.312021 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n\" (UID: \"02e6a4ac-05a2-4da9-8494-3e01a9d443ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n" Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.312065 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n\" (UID: \"02e6a4ac-05a2-4da9-8494-3e01a9d443ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n" Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.312205 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n\" (UID: \"02e6a4ac-05a2-4da9-8494-3e01a9d443ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n" Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.320787 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n\" (UID: 
\"02e6a4ac-05a2-4da9-8494-3e01a9d443ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n" Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.321421 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n\" (UID: \"02e6a4ac-05a2-4da9-8494-3e01a9d443ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n" Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.321517 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n\" (UID: \"02e6a4ac-05a2-4da9-8494-3e01a9d443ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n" Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.325996 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n\" (UID: \"02e6a4ac-05a2-4da9-8494-3e01a9d443ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n" Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.333854 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmf9p\" (UniqueName: \"kubernetes.io/projected/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-kube-api-access-nmf9p\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n\" (UID: \"02e6a4ac-05a2-4da9-8494-3e01a9d443ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n" Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.367908 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n" Mar 18 13:06:46 crc kubenswrapper[4975]: I0318 13:06:46.924443 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n"] Mar 18 13:06:47 crc kubenswrapper[4975]: I0318 13:06:47.409628 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n" event={"ID":"02e6a4ac-05a2-4da9-8494-3e01a9d443ff","Type":"ContainerStarted","Data":"abe4bb19009e48f8991d09debc4da32580dc94b611a00951235fbf1bd35e7c3e"} Mar 18 13:06:48 crc kubenswrapper[4975]: I0318 13:06:48.422693 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n" event={"ID":"02e6a4ac-05a2-4da9-8494-3e01a9d443ff","Type":"ContainerStarted","Data":"ad5721e10365b53e544617923b76e9b3fa9dba1f904ecb726fa9c25cee92db05"} Mar 18 13:06:48 crc kubenswrapper[4975]: I0318 13:06:48.446617 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n" podStartSLOduration=2.117787901 podStartE2EDuration="2.44655351s" podCreationTimestamp="2026-03-18 13:06:46 +0000 UTC" firstStartedPulling="2026-03-18 13:06:46.916958732 +0000 UTC m=+3392.631359311" lastFinishedPulling="2026-03-18 13:06:47.245724341 +0000 UTC m=+3392.960124920" observedRunningTime="2026-03-18 13:06:48.435390596 +0000 UTC m=+3394.149791175" watchObservedRunningTime="2026-03-18 13:06:48.44655351 +0000 UTC m=+3394.160954089" Mar 18 13:06:48 crc kubenswrapper[4975]: I0318 13:06:48.591565 4975 scope.go:117] "RemoveContainer" containerID="71f3f594e4ee934fdbde53fec5669ee35feccbb04839b9be53fe40928b76c99f" Mar 18 13:06:55 crc kubenswrapper[4975]: I0318 13:06:55.539089 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:06:55 crc kubenswrapper[4975]: I0318 13:06:55.539645 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:07:25 crc kubenswrapper[4975]: I0318 13:07:25.538960 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:07:25 crc kubenswrapper[4975]: I0318 13:07:25.539492 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:07:55 crc kubenswrapper[4975]: I0318 13:07:55.538822 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:07:55 crc kubenswrapper[4975]: I0318 13:07:55.539395 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Mar 18 13:07:55 crc kubenswrapper[4975]: I0318 13:07:55.539454 4975 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 13:07:55 crc kubenswrapper[4975]: I0318 13:07:55.540114 4975 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"944195ba0ad1578298e3c6fc0e21df9b364042964b2ee984b02bc981fb450f3a"} pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:07:55 crc kubenswrapper[4975]: I0318 13:07:55.540178 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" containerID="cri-o://944195ba0ad1578298e3c6fc0e21df9b364042964b2ee984b02bc981fb450f3a" gracePeriod=600 Mar 18 13:07:55 crc kubenswrapper[4975]: E0318 13:07:55.799546 4975 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59dd8f35_75c5_42d7_b11a_06586d1d5a1b.slice/crio-944195ba0ad1578298e3c6fc0e21df9b364042964b2ee984b02bc981fb450f3a.scope\": RecentStats: unable to find data in memory cache]" Mar 18 13:07:56 crc kubenswrapper[4975]: I0318 13:07:56.168499 4975 generic.go:334] "Generic (PLEG): container finished" podID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerID="944195ba0ad1578298e3c6fc0e21df9b364042964b2ee984b02bc981fb450f3a" exitCode=0 Mar 18 13:07:56 crc kubenswrapper[4975]: I0318 13:07:56.168806 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" 
event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerDied","Data":"944195ba0ad1578298e3c6fc0e21df9b364042964b2ee984b02bc981fb450f3a"} Mar 18 13:07:56 crc kubenswrapper[4975]: I0318 13:07:56.168974 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerStarted","Data":"d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3"} Mar 18 13:07:56 crc kubenswrapper[4975]: I0318 13:07:56.169020 4975 scope.go:117] "RemoveContainer" containerID="2936c9228a883c9fc8881a59a490b16892b4c8d66426a436d41a951451b52b46" Mar 18 13:08:00 crc kubenswrapper[4975]: I0318 13:08:00.152039 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563988-bx2df"] Mar 18 13:08:00 crc kubenswrapper[4975]: I0318 13:08:00.154712 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563988-bx2df" Mar 18 13:08:00 crc kubenswrapper[4975]: I0318 13:08:00.158340 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 13:08:00 crc kubenswrapper[4975]: I0318 13:08:00.158349 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:08:00 crc kubenswrapper[4975]: I0318 13:08:00.158388 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:08:00 crc kubenswrapper[4975]: I0318 13:08:00.163362 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563988-bx2df"] Mar 18 13:08:00 crc kubenswrapper[4975]: I0318 13:08:00.295955 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cddhx\" (UniqueName: 
\"kubernetes.io/projected/710e8a6f-4426-4ac3-af0b-0e30524a784e-kube-api-access-cddhx\") pod \"auto-csr-approver-29563988-bx2df\" (UID: \"710e8a6f-4426-4ac3-af0b-0e30524a784e\") " pod="openshift-infra/auto-csr-approver-29563988-bx2df" Mar 18 13:08:00 crc kubenswrapper[4975]: I0318 13:08:00.398556 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cddhx\" (UniqueName: \"kubernetes.io/projected/710e8a6f-4426-4ac3-af0b-0e30524a784e-kube-api-access-cddhx\") pod \"auto-csr-approver-29563988-bx2df\" (UID: \"710e8a6f-4426-4ac3-af0b-0e30524a784e\") " pod="openshift-infra/auto-csr-approver-29563988-bx2df" Mar 18 13:08:00 crc kubenswrapper[4975]: I0318 13:08:00.423105 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cddhx\" (UniqueName: \"kubernetes.io/projected/710e8a6f-4426-4ac3-af0b-0e30524a784e-kube-api-access-cddhx\") pod \"auto-csr-approver-29563988-bx2df\" (UID: \"710e8a6f-4426-4ac3-af0b-0e30524a784e\") " pod="openshift-infra/auto-csr-approver-29563988-bx2df" Mar 18 13:08:00 crc kubenswrapper[4975]: I0318 13:08:00.477855 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563988-bx2df" Mar 18 13:08:00 crc kubenswrapper[4975]: I0318 13:08:00.972137 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563988-bx2df"] Mar 18 13:08:00 crc kubenswrapper[4975]: I0318 13:08:00.979474 4975 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:08:01 crc kubenswrapper[4975]: I0318 13:08:01.217001 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563988-bx2df" event={"ID":"710e8a6f-4426-4ac3-af0b-0e30524a784e","Type":"ContainerStarted","Data":"dfbf0dca9820e795451ebc48f5032b17305ae2cdae70f68603168f7834523d31"} Mar 18 13:08:03 crc kubenswrapper[4975]: I0318 13:08:03.233849 4975 generic.go:334] "Generic (PLEG): container finished" podID="710e8a6f-4426-4ac3-af0b-0e30524a784e" containerID="b4a6697f7cc9c1a19389026a8ce8237bddc7e97106c397e7c423ad63a2f78205" exitCode=0 Mar 18 13:08:03 crc kubenswrapper[4975]: I0318 13:08:03.233921 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563988-bx2df" event={"ID":"710e8a6f-4426-4ac3-af0b-0e30524a784e","Type":"ContainerDied","Data":"b4a6697f7cc9c1a19389026a8ce8237bddc7e97106c397e7c423ad63a2f78205"} Mar 18 13:08:04 crc kubenswrapper[4975]: I0318 13:08:04.579884 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563988-bx2df" Mar 18 13:08:04 crc kubenswrapper[4975]: I0318 13:08:04.680761 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cddhx\" (UniqueName: \"kubernetes.io/projected/710e8a6f-4426-4ac3-af0b-0e30524a784e-kube-api-access-cddhx\") pod \"710e8a6f-4426-4ac3-af0b-0e30524a784e\" (UID: \"710e8a6f-4426-4ac3-af0b-0e30524a784e\") " Mar 18 13:08:04 crc kubenswrapper[4975]: I0318 13:08:04.688218 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/710e8a6f-4426-4ac3-af0b-0e30524a784e-kube-api-access-cddhx" (OuterVolumeSpecName: "kube-api-access-cddhx") pod "710e8a6f-4426-4ac3-af0b-0e30524a784e" (UID: "710e8a6f-4426-4ac3-af0b-0e30524a784e"). InnerVolumeSpecName "kube-api-access-cddhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:08:04 crc kubenswrapper[4975]: I0318 13:08:04.782289 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cddhx\" (UniqueName: \"kubernetes.io/projected/710e8a6f-4426-4ac3-af0b-0e30524a784e-kube-api-access-cddhx\") on node \"crc\" DevicePath \"\"" Mar 18 13:08:05 crc kubenswrapper[4975]: I0318 13:08:05.254880 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563988-bx2df" event={"ID":"710e8a6f-4426-4ac3-af0b-0e30524a784e","Type":"ContainerDied","Data":"dfbf0dca9820e795451ebc48f5032b17305ae2cdae70f68603168f7834523d31"} Mar 18 13:08:05 crc kubenswrapper[4975]: I0318 13:08:05.254930 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfbf0dca9820e795451ebc48f5032b17305ae2cdae70f68603168f7834523d31" Mar 18 13:08:05 crc kubenswrapper[4975]: I0318 13:08:05.254978 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563988-bx2df" Mar 18 13:08:05 crc kubenswrapper[4975]: I0318 13:08:05.676553 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563982-k9f8w"] Mar 18 13:08:05 crc kubenswrapper[4975]: I0318 13:08:05.684814 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563982-k9f8w"] Mar 18 13:08:07 crc kubenswrapper[4975]: I0318 13:08:07.028745 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3c8c176-21ac-4505-b1f6-31de8ddb3989" path="/var/lib/kubelet/pods/e3c8c176-21ac-4505-b1f6-31de8ddb3989/volumes" Mar 18 13:08:28 crc kubenswrapper[4975]: I0318 13:08:28.462527 4975 generic.go:334] "Generic (PLEG): container finished" podID="02e6a4ac-05a2-4da9-8494-3e01a9d443ff" containerID="ad5721e10365b53e544617923b76e9b3fa9dba1f904ecb726fa9c25cee92db05" exitCode=2 Mar 18 13:08:28 crc kubenswrapper[4975]: I0318 13:08:28.462610 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n" event={"ID":"02e6a4ac-05a2-4da9-8494-3e01a9d443ff","Type":"ContainerDied","Data":"ad5721e10365b53e544617923b76e9b3fa9dba1f904ecb726fa9c25cee92db05"} Mar 18 13:08:29 crc kubenswrapper[4975]: I0318 13:08:29.906045 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n" Mar 18 13:08:30 crc kubenswrapper[4975]: I0318 13:08:30.048396 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmf9p\" (UniqueName: \"kubernetes.io/projected/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-kube-api-access-nmf9p\") pod \"02e6a4ac-05a2-4da9-8494-3e01a9d443ff\" (UID: \"02e6a4ac-05a2-4da9-8494-3e01a9d443ff\") " Mar 18 13:08:30 crc kubenswrapper[4975]: I0318 13:08:30.048948 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-inventory\") pod \"02e6a4ac-05a2-4da9-8494-3e01a9d443ff\" (UID: \"02e6a4ac-05a2-4da9-8494-3e01a9d443ff\") " Mar 18 13:08:30 crc kubenswrapper[4975]: I0318 13:08:30.049032 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-libvirt-secret-0\") pod \"02e6a4ac-05a2-4da9-8494-3e01a9d443ff\" (UID: \"02e6a4ac-05a2-4da9-8494-3e01a9d443ff\") " Mar 18 13:08:30 crc kubenswrapper[4975]: I0318 13:08:30.049600 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-libvirt-combined-ca-bundle\") pod \"02e6a4ac-05a2-4da9-8494-3e01a9d443ff\" (UID: \"02e6a4ac-05a2-4da9-8494-3e01a9d443ff\") " Mar 18 13:08:30 crc kubenswrapper[4975]: I0318 13:08:30.049669 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-ssh-key-openstack-edpm-ipam\") pod \"02e6a4ac-05a2-4da9-8494-3e01a9d443ff\" (UID: \"02e6a4ac-05a2-4da9-8494-3e01a9d443ff\") " Mar 18 13:08:30 crc kubenswrapper[4975]: I0318 13:08:30.055220 4975 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "02e6a4ac-05a2-4da9-8494-3e01a9d443ff" (UID: "02e6a4ac-05a2-4da9-8494-3e01a9d443ff"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:08:30 crc kubenswrapper[4975]: I0318 13:08:30.055636 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-kube-api-access-nmf9p" (OuterVolumeSpecName: "kube-api-access-nmf9p") pod "02e6a4ac-05a2-4da9-8494-3e01a9d443ff" (UID: "02e6a4ac-05a2-4da9-8494-3e01a9d443ff"). InnerVolumeSpecName "kube-api-access-nmf9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:08:30 crc kubenswrapper[4975]: I0318 13:08:30.081591 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "02e6a4ac-05a2-4da9-8494-3e01a9d443ff" (UID: "02e6a4ac-05a2-4da9-8494-3e01a9d443ff"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:08:30 crc kubenswrapper[4975]: I0318 13:08:30.082231 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "02e6a4ac-05a2-4da9-8494-3e01a9d443ff" (UID: "02e6a4ac-05a2-4da9-8494-3e01a9d443ff"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:08:30 crc kubenswrapper[4975]: I0318 13:08:30.086009 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-inventory" (OuterVolumeSpecName: "inventory") pod "02e6a4ac-05a2-4da9-8494-3e01a9d443ff" (UID: "02e6a4ac-05a2-4da9-8494-3e01a9d443ff"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:08:30 crc kubenswrapper[4975]: I0318 13:08:30.155239 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmf9p\" (UniqueName: \"kubernetes.io/projected/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-kube-api-access-nmf9p\") on node \"crc\" DevicePath \"\"" Mar 18 13:08:30 crc kubenswrapper[4975]: I0318 13:08:30.155679 4975 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 13:08:30 crc kubenswrapper[4975]: I0318 13:08:30.155699 4975 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:08:30 crc kubenswrapper[4975]: I0318 13:08:30.155710 4975 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:08:30 crc kubenswrapper[4975]: I0318 13:08:30.155720 4975 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02e6a4ac-05a2-4da9-8494-3e01a9d443ff-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 13:08:30 crc kubenswrapper[4975]: I0318 13:08:30.487148 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n" event={"ID":"02e6a4ac-05a2-4da9-8494-3e01a9d443ff","Type":"ContainerDied","Data":"abe4bb19009e48f8991d09debc4da32580dc94b611a00951235fbf1bd35e7c3e"} Mar 18 13:08:30 crc kubenswrapper[4975]: I0318 13:08:30.487722 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abe4bb19009e48f8991d09debc4da32580dc94b611a00951235fbf1bd35e7c3e" Mar 18 13:08:30 crc kubenswrapper[4975]: I0318 13:08:30.487192 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n" Mar 18 13:08:48 crc kubenswrapper[4975]: I0318 13:08:48.688833 4975 scope.go:117] "RemoveContainer" containerID="6b430f6f47c2cf504908eef61dd90288c1277f765b5341ade428fb42dc9b19f3" Mar 18 13:09:48 crc kubenswrapper[4975]: I0318 13:09:48.771056 4975 scope.go:117] "RemoveContainer" containerID="37c26cb0fadc3b82709043e02cdd18b7e16b091c1987ebcf595429fcc81ef7a3" Mar 18 13:09:48 crc kubenswrapper[4975]: I0318 13:09:48.796458 4975 scope.go:117] "RemoveContainer" containerID="20d3b2f268d170ff0f16219f2a54c3db56425f46c126680a6fda892109343642" Mar 18 13:09:48 crc kubenswrapper[4975]: I0318 13:09:48.833334 4975 scope.go:117] "RemoveContainer" containerID="b938e8eec273364a0d9c6c1e5a419c2b6e4fea5e833632dd9cd62d782bbc946f" Mar 18 13:09:55 crc kubenswrapper[4975]: I0318 13:09:55.538770 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:09:55 crc kubenswrapper[4975]: I0318 13:09:55.539330 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:10:00 crc kubenswrapper[4975]: I0318 13:10:00.147410 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563990-wnkm2"] Mar 18 13:10:00 crc kubenswrapper[4975]: E0318 13:10:00.148450 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="710e8a6f-4426-4ac3-af0b-0e30524a784e" containerName="oc" Mar 18 13:10:00 crc kubenswrapper[4975]: I0318 13:10:00.148469 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="710e8a6f-4426-4ac3-af0b-0e30524a784e" containerName="oc" Mar 18 13:10:00 crc kubenswrapper[4975]: E0318 13:10:00.148506 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02e6a4ac-05a2-4da9-8494-3e01a9d443ff" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 13:10:00 crc kubenswrapper[4975]: I0318 13:10:00.148515 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e6a4ac-05a2-4da9-8494-3e01a9d443ff" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 13:10:00 crc kubenswrapper[4975]: I0318 13:10:00.148747 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="710e8a6f-4426-4ac3-af0b-0e30524a784e" containerName="oc" Mar 18 13:10:00 crc kubenswrapper[4975]: I0318 13:10:00.148807 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="02e6a4ac-05a2-4da9-8494-3e01a9d443ff" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 13:10:00 crc kubenswrapper[4975]: I0318 13:10:00.149801 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563990-wnkm2" Mar 18 13:10:00 crc kubenswrapper[4975]: I0318 13:10:00.156500 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:10:00 crc kubenswrapper[4975]: I0318 13:10:00.156632 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:10:00 crc kubenswrapper[4975]: I0318 13:10:00.156791 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 13:10:00 crc kubenswrapper[4975]: I0318 13:10:00.158427 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563990-wnkm2"] Mar 18 13:10:00 crc kubenswrapper[4975]: I0318 13:10:00.246270 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jfcq\" (UniqueName: \"kubernetes.io/projected/151a5e6d-1835-4362-97d5-ecc9d5e22843-kube-api-access-6jfcq\") pod \"auto-csr-approver-29563990-wnkm2\" (UID: \"151a5e6d-1835-4362-97d5-ecc9d5e22843\") " pod="openshift-infra/auto-csr-approver-29563990-wnkm2" Mar 18 13:10:00 crc kubenswrapper[4975]: I0318 13:10:00.348770 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jfcq\" (UniqueName: \"kubernetes.io/projected/151a5e6d-1835-4362-97d5-ecc9d5e22843-kube-api-access-6jfcq\") pod \"auto-csr-approver-29563990-wnkm2\" (UID: \"151a5e6d-1835-4362-97d5-ecc9d5e22843\") " pod="openshift-infra/auto-csr-approver-29563990-wnkm2" Mar 18 13:10:00 crc kubenswrapper[4975]: I0318 13:10:00.386071 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jfcq\" (UniqueName: \"kubernetes.io/projected/151a5e6d-1835-4362-97d5-ecc9d5e22843-kube-api-access-6jfcq\") pod \"auto-csr-approver-29563990-wnkm2\" (UID: \"151a5e6d-1835-4362-97d5-ecc9d5e22843\") " 
pod="openshift-infra/auto-csr-approver-29563990-wnkm2" Mar 18 13:10:00 crc kubenswrapper[4975]: I0318 13:10:00.472721 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563990-wnkm2" Mar 18 13:10:00 crc kubenswrapper[4975]: I0318 13:10:00.933645 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563990-wnkm2"] Mar 18 13:10:00 crc kubenswrapper[4975]: W0318 13:10:00.940431 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod151a5e6d_1835_4362_97d5_ecc9d5e22843.slice/crio-a727674a715407930cfa6f64d9e99710f09c4f307fdcdb25f90294d79fc0926d WatchSource:0}: Error finding container a727674a715407930cfa6f64d9e99710f09c4f307fdcdb25f90294d79fc0926d: Status 404 returned error can't find the container with id a727674a715407930cfa6f64d9e99710f09c4f307fdcdb25f90294d79fc0926d Mar 18 13:10:01 crc kubenswrapper[4975]: I0318 13:10:01.308576 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563990-wnkm2" event={"ID":"151a5e6d-1835-4362-97d5-ecc9d5e22843","Type":"ContainerStarted","Data":"a727674a715407930cfa6f64d9e99710f09c4f307fdcdb25f90294d79fc0926d"} Mar 18 13:10:02 crc kubenswrapper[4975]: I0318 13:10:02.318931 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563990-wnkm2" event={"ID":"151a5e6d-1835-4362-97d5-ecc9d5e22843","Type":"ContainerStarted","Data":"656c57f54094e5b48e1c433e74566c7910d037492cc4c687368a8a4d4b9e96b2"} Mar 18 13:10:02 crc kubenswrapper[4975]: I0318 13:10:02.346627 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563990-wnkm2" podStartSLOduration=1.293940878 podStartE2EDuration="2.346563434s" podCreationTimestamp="2026-03-18 13:10:00 +0000 UTC" firstStartedPulling="2026-03-18 13:10:00.944764332 +0000 UTC 
m=+3586.659164911" lastFinishedPulling="2026-03-18 13:10:01.997386888 +0000 UTC m=+3587.711787467" observedRunningTime="2026-03-18 13:10:02.331413731 +0000 UTC m=+3588.045814330" watchObservedRunningTime="2026-03-18 13:10:02.346563434 +0000 UTC m=+3588.060964033" Mar 18 13:10:03 crc kubenswrapper[4975]: I0318 13:10:03.332908 4975 generic.go:334] "Generic (PLEG): container finished" podID="151a5e6d-1835-4362-97d5-ecc9d5e22843" containerID="656c57f54094e5b48e1c433e74566c7910d037492cc4c687368a8a4d4b9e96b2" exitCode=0 Mar 18 13:10:03 crc kubenswrapper[4975]: I0318 13:10:03.333376 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563990-wnkm2" event={"ID":"151a5e6d-1835-4362-97d5-ecc9d5e22843","Type":"ContainerDied","Data":"656c57f54094e5b48e1c433e74566c7910d037492cc4c687368a8a4d4b9e96b2"} Mar 18 13:10:04 crc kubenswrapper[4975]: I0318 13:10:04.679089 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563990-wnkm2" Mar 18 13:10:04 crc kubenswrapper[4975]: I0318 13:10:04.826947 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jfcq\" (UniqueName: \"kubernetes.io/projected/151a5e6d-1835-4362-97d5-ecc9d5e22843-kube-api-access-6jfcq\") pod \"151a5e6d-1835-4362-97d5-ecc9d5e22843\" (UID: \"151a5e6d-1835-4362-97d5-ecc9d5e22843\") " Mar 18 13:10:04 crc kubenswrapper[4975]: I0318 13:10:04.832700 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/151a5e6d-1835-4362-97d5-ecc9d5e22843-kube-api-access-6jfcq" (OuterVolumeSpecName: "kube-api-access-6jfcq") pod "151a5e6d-1835-4362-97d5-ecc9d5e22843" (UID: "151a5e6d-1835-4362-97d5-ecc9d5e22843"). InnerVolumeSpecName "kube-api-access-6jfcq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:10:04 crc kubenswrapper[4975]: I0318 13:10:04.929903 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jfcq\" (UniqueName: \"kubernetes.io/projected/151a5e6d-1835-4362-97d5-ecc9d5e22843-kube-api-access-6jfcq\") on node \"crc\" DevicePath \"\"" Mar 18 13:10:05 crc kubenswrapper[4975]: I0318 13:10:05.352313 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563990-wnkm2" event={"ID":"151a5e6d-1835-4362-97d5-ecc9d5e22843","Type":"ContainerDied","Data":"a727674a715407930cfa6f64d9e99710f09c4f307fdcdb25f90294d79fc0926d"} Mar 18 13:10:05 crc kubenswrapper[4975]: I0318 13:10:05.352348 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a727674a715407930cfa6f64d9e99710f09c4f307fdcdb25f90294d79fc0926d" Mar 18 13:10:05 crc kubenswrapper[4975]: I0318 13:10:05.352383 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563990-wnkm2" Mar 18 13:10:05 crc kubenswrapper[4975]: I0318 13:10:05.405334 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563984-j4rgx"] Mar 18 13:10:05 crc kubenswrapper[4975]: I0318 13:10:05.413127 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563984-j4rgx"] Mar 18 13:10:07 crc kubenswrapper[4975]: I0318 13:10:07.035201 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44fdccba-8590-4cbf-9128-93e1738a6909" path="/var/lib/kubelet/pods/44fdccba-8590-4cbf-9128-93e1738a6909/volumes" Mar 18 13:10:25 crc kubenswrapper[4975]: I0318 13:10:25.538800 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 18 13:10:25 crc kubenswrapper[4975]: I0318 13:10:25.539433 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:10:48 crc kubenswrapper[4975]: I0318 13:10:48.898365 4975 scope.go:117] "RemoveContainer" containerID="0ddb6273c6a772f4bfc1305848f7e126880998306439a99942521c5baa3013b4" Mar 18 13:10:55 crc kubenswrapper[4975]: I0318 13:10:55.540937 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:10:55 crc kubenswrapper[4975]: I0318 13:10:55.541493 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:10:55 crc kubenswrapper[4975]: I0318 13:10:55.541532 4975 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 13:10:55 crc kubenswrapper[4975]: I0318 13:10:55.542024 4975 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3"} pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 
18 13:10:55 crc kubenswrapper[4975]: I0318 13:10:55.542069 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" containerID="cri-o://d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3" gracePeriod=600 Mar 18 13:10:55 crc kubenswrapper[4975]: E0318 13:10:55.661312 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:10:56 crc kubenswrapper[4975]: I0318 13:10:56.360961 4975 generic.go:334] "Generic (PLEG): container finished" podID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerID="d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3" exitCode=0 Mar 18 13:10:56 crc kubenswrapper[4975]: I0318 13:10:56.361030 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerDied","Data":"d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3"} Mar 18 13:10:56 crc kubenswrapper[4975]: I0318 13:10:56.361092 4975 scope.go:117] "RemoveContainer" containerID="944195ba0ad1578298e3c6fc0e21df9b364042964b2ee984b02bc981fb450f3a" Mar 18 13:10:56 crc kubenswrapper[4975]: I0318 13:10:56.362044 4975 scope.go:117] "RemoveContainer" containerID="d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3" Mar 18 13:10:56 crc kubenswrapper[4975]: E0318 13:10:56.362578 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:11:04 crc kubenswrapper[4975]: I0318 13:11:04.435490 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7pgd4"] Mar 18 13:11:04 crc kubenswrapper[4975]: E0318 13:11:04.436536 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="151a5e6d-1835-4362-97d5-ecc9d5e22843" containerName="oc" Mar 18 13:11:04 crc kubenswrapper[4975]: I0318 13:11:04.436556 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="151a5e6d-1835-4362-97d5-ecc9d5e22843" containerName="oc" Mar 18 13:11:04 crc kubenswrapper[4975]: I0318 13:11:04.436824 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="151a5e6d-1835-4362-97d5-ecc9d5e22843" containerName="oc" Mar 18 13:11:04 crc kubenswrapper[4975]: I0318 13:11:04.438410 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7pgd4" Mar 18 13:11:04 crc kubenswrapper[4975]: I0318 13:11:04.448808 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7pgd4"] Mar 18 13:11:04 crc kubenswrapper[4975]: I0318 13:11:04.569413 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9mx6\" (UniqueName: \"kubernetes.io/projected/6313bb47-eb74-4f25-917d-630031dc93e4-kube-api-access-x9mx6\") pod \"community-operators-7pgd4\" (UID: \"6313bb47-eb74-4f25-917d-630031dc93e4\") " pod="openshift-marketplace/community-operators-7pgd4" Mar 18 13:11:04 crc kubenswrapper[4975]: I0318 13:11:04.569471 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6313bb47-eb74-4f25-917d-630031dc93e4-catalog-content\") pod \"community-operators-7pgd4\" (UID: \"6313bb47-eb74-4f25-917d-630031dc93e4\") " pod="openshift-marketplace/community-operators-7pgd4" Mar 18 13:11:04 crc kubenswrapper[4975]: I0318 13:11:04.569503 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6313bb47-eb74-4f25-917d-630031dc93e4-utilities\") pod \"community-operators-7pgd4\" (UID: \"6313bb47-eb74-4f25-917d-630031dc93e4\") " pod="openshift-marketplace/community-operators-7pgd4" Mar 18 13:11:04 crc kubenswrapper[4975]: I0318 13:11:04.678094 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9mx6\" (UniqueName: \"kubernetes.io/projected/6313bb47-eb74-4f25-917d-630031dc93e4-kube-api-access-x9mx6\") pod \"community-operators-7pgd4\" (UID: \"6313bb47-eb74-4f25-917d-630031dc93e4\") " pod="openshift-marketplace/community-operators-7pgd4" Mar 18 13:11:04 crc kubenswrapper[4975]: I0318 13:11:04.678193 4975 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6313bb47-eb74-4f25-917d-630031dc93e4-catalog-content\") pod \"community-operators-7pgd4\" (UID: \"6313bb47-eb74-4f25-917d-630031dc93e4\") " pod="openshift-marketplace/community-operators-7pgd4" Mar 18 13:11:04 crc kubenswrapper[4975]: I0318 13:11:04.678230 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6313bb47-eb74-4f25-917d-630031dc93e4-utilities\") pod \"community-operators-7pgd4\" (UID: \"6313bb47-eb74-4f25-917d-630031dc93e4\") " pod="openshift-marketplace/community-operators-7pgd4" Mar 18 13:11:04 crc kubenswrapper[4975]: I0318 13:11:04.678647 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6313bb47-eb74-4f25-917d-630031dc93e4-catalog-content\") pod \"community-operators-7pgd4\" (UID: \"6313bb47-eb74-4f25-917d-630031dc93e4\") " pod="openshift-marketplace/community-operators-7pgd4" Mar 18 13:11:04 crc kubenswrapper[4975]: I0318 13:11:04.678796 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6313bb47-eb74-4f25-917d-630031dc93e4-utilities\") pod \"community-operators-7pgd4\" (UID: \"6313bb47-eb74-4f25-917d-630031dc93e4\") " pod="openshift-marketplace/community-operators-7pgd4" Mar 18 13:11:04 crc kubenswrapper[4975]: I0318 13:11:04.704993 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9mx6\" (UniqueName: \"kubernetes.io/projected/6313bb47-eb74-4f25-917d-630031dc93e4-kube-api-access-x9mx6\") pod \"community-operators-7pgd4\" (UID: \"6313bb47-eb74-4f25-917d-630031dc93e4\") " pod="openshift-marketplace/community-operators-7pgd4" Mar 18 13:11:04 crc kubenswrapper[4975]: I0318 13:11:04.766502 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7pgd4" Mar 18 13:11:05 crc kubenswrapper[4975]: I0318 13:11:05.267323 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7pgd4"] Mar 18 13:11:05 crc kubenswrapper[4975]: I0318 13:11:05.454933 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pgd4" event={"ID":"6313bb47-eb74-4f25-917d-630031dc93e4","Type":"ContainerStarted","Data":"e9fb4a97018ec7fe719353025e907d0c66902dfcc8d029a30a4b55bfdf620f04"} Mar 18 13:11:06 crc kubenswrapper[4975]: I0318 13:11:06.465401 4975 generic.go:334] "Generic (PLEG): container finished" podID="6313bb47-eb74-4f25-917d-630031dc93e4" containerID="72a6ab3623b7fdee08a936b0e3a1da8c3f30817d1208104552cb10021c5a47c2" exitCode=0 Mar 18 13:11:06 crc kubenswrapper[4975]: I0318 13:11:06.465446 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pgd4" event={"ID":"6313bb47-eb74-4f25-917d-630031dc93e4","Type":"ContainerDied","Data":"72a6ab3623b7fdee08a936b0e3a1da8c3f30817d1208104552cb10021c5a47c2"} Mar 18 13:11:09 crc kubenswrapper[4975]: I0318 13:11:09.498899 4975 generic.go:334] "Generic (PLEG): container finished" podID="6313bb47-eb74-4f25-917d-630031dc93e4" containerID="b4f8e877ba5b541db46e3cd26873bc6ecce3929a1b24aec115a124f5b1fda61c" exitCode=0 Mar 18 13:11:09 crc kubenswrapper[4975]: I0318 13:11:09.498996 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pgd4" event={"ID":"6313bb47-eb74-4f25-917d-630031dc93e4","Type":"ContainerDied","Data":"b4f8e877ba5b541db46e3cd26873bc6ecce3929a1b24aec115a124f5b1fda61c"} Mar 18 13:11:10 crc kubenswrapper[4975]: I0318 13:11:10.016046 4975 scope.go:117] "RemoveContainer" containerID="d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3" Mar 18 13:11:10 crc kubenswrapper[4975]: E0318 13:11:10.016708 4975 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:11:10 crc kubenswrapper[4975]: I0318 13:11:10.508729 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pgd4" event={"ID":"6313bb47-eb74-4f25-917d-630031dc93e4","Type":"ContainerStarted","Data":"15d85511b9c69aba94f7c91387635a2268325e2ddcec2c5e46edbefe1e22c330"} Mar 18 13:11:10 crc kubenswrapper[4975]: I0318 13:11:10.529095 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7pgd4" podStartSLOduration=2.9279624010000003 podStartE2EDuration="6.529075236s" podCreationTimestamp="2026-03-18 13:11:04 +0000 UTC" firstStartedPulling="2026-03-18 13:11:06.467345022 +0000 UTC m=+3652.181745621" lastFinishedPulling="2026-03-18 13:11:10.068457847 +0000 UTC m=+3655.782858456" observedRunningTime="2026-03-18 13:11:10.525719485 +0000 UTC m=+3656.240120064" watchObservedRunningTime="2026-03-18 13:11:10.529075236 +0000 UTC m=+3656.243475825" Mar 18 13:11:14 crc kubenswrapper[4975]: I0318 13:11:14.766613 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7pgd4" Mar 18 13:11:14 crc kubenswrapper[4975]: I0318 13:11:14.767216 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7pgd4" Mar 18 13:11:14 crc kubenswrapper[4975]: I0318 13:11:14.838102 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7pgd4" Mar 18 13:11:15 crc kubenswrapper[4975]: I0318 
13:11:15.598507 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7pgd4" Mar 18 13:11:15 crc kubenswrapper[4975]: I0318 13:11:15.654759 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7pgd4"] Mar 18 13:11:17 crc kubenswrapper[4975]: I0318 13:11:17.572674 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7pgd4" podUID="6313bb47-eb74-4f25-917d-630031dc93e4" containerName="registry-server" containerID="cri-o://15d85511b9c69aba94f7c91387635a2268325e2ddcec2c5e46edbefe1e22c330" gracePeriod=2 Mar 18 13:11:18 crc kubenswrapper[4975]: I0318 13:11:18.013455 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7pgd4" Mar 18 13:11:18 crc kubenswrapper[4975]: I0318 13:11:18.058888 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9mx6\" (UniqueName: \"kubernetes.io/projected/6313bb47-eb74-4f25-917d-630031dc93e4-kube-api-access-x9mx6\") pod \"6313bb47-eb74-4f25-917d-630031dc93e4\" (UID: \"6313bb47-eb74-4f25-917d-630031dc93e4\") " Mar 18 13:11:18 crc kubenswrapper[4975]: I0318 13:11:18.058961 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6313bb47-eb74-4f25-917d-630031dc93e4-catalog-content\") pod \"6313bb47-eb74-4f25-917d-630031dc93e4\" (UID: \"6313bb47-eb74-4f25-917d-630031dc93e4\") " Mar 18 13:11:18 crc kubenswrapper[4975]: I0318 13:11:18.059116 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6313bb47-eb74-4f25-917d-630031dc93e4-utilities\") pod \"6313bb47-eb74-4f25-917d-630031dc93e4\" (UID: \"6313bb47-eb74-4f25-917d-630031dc93e4\") " Mar 18 13:11:18 crc kubenswrapper[4975]: 
I0318 13:11:18.062274 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6313bb47-eb74-4f25-917d-630031dc93e4-utilities" (OuterVolumeSpecName: "utilities") pod "6313bb47-eb74-4f25-917d-630031dc93e4" (UID: "6313bb47-eb74-4f25-917d-630031dc93e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:11:18 crc kubenswrapper[4975]: I0318 13:11:18.068009 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6313bb47-eb74-4f25-917d-630031dc93e4-kube-api-access-x9mx6" (OuterVolumeSpecName: "kube-api-access-x9mx6") pod "6313bb47-eb74-4f25-917d-630031dc93e4" (UID: "6313bb47-eb74-4f25-917d-630031dc93e4"). InnerVolumeSpecName "kube-api-access-x9mx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:11:18 crc kubenswrapper[4975]: I0318 13:11:18.117251 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6313bb47-eb74-4f25-917d-630031dc93e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6313bb47-eb74-4f25-917d-630031dc93e4" (UID: "6313bb47-eb74-4f25-917d-630031dc93e4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:11:18 crc kubenswrapper[4975]: I0318 13:11:18.160982 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9mx6\" (UniqueName: \"kubernetes.io/projected/6313bb47-eb74-4f25-917d-630031dc93e4-kube-api-access-x9mx6\") on node \"crc\" DevicePath \"\"" Mar 18 13:11:18 crc kubenswrapper[4975]: I0318 13:11:18.161015 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6313bb47-eb74-4f25-917d-630031dc93e4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:11:18 crc kubenswrapper[4975]: I0318 13:11:18.161026 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6313bb47-eb74-4f25-917d-630031dc93e4-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:11:18 crc kubenswrapper[4975]: I0318 13:11:18.582319 4975 generic.go:334] "Generic (PLEG): container finished" podID="6313bb47-eb74-4f25-917d-630031dc93e4" containerID="15d85511b9c69aba94f7c91387635a2268325e2ddcec2c5e46edbefe1e22c330" exitCode=0 Mar 18 13:11:18 crc kubenswrapper[4975]: I0318 13:11:18.582371 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pgd4" event={"ID":"6313bb47-eb74-4f25-917d-630031dc93e4","Type":"ContainerDied","Data":"15d85511b9c69aba94f7c91387635a2268325e2ddcec2c5e46edbefe1e22c330"} Mar 18 13:11:18 crc kubenswrapper[4975]: I0318 13:11:18.582403 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pgd4" event={"ID":"6313bb47-eb74-4f25-917d-630031dc93e4","Type":"ContainerDied","Data":"e9fb4a97018ec7fe719353025e907d0c66902dfcc8d029a30a4b55bfdf620f04"} Mar 18 13:11:18 crc kubenswrapper[4975]: I0318 13:11:18.582424 4975 scope.go:117] "RemoveContainer" containerID="15d85511b9c69aba94f7c91387635a2268325e2ddcec2c5e46edbefe1e22c330" Mar 18 13:11:18 crc kubenswrapper[4975]: I0318 
13:11:18.582376 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7pgd4" Mar 18 13:11:18 crc kubenswrapper[4975]: I0318 13:11:18.605646 4975 scope.go:117] "RemoveContainer" containerID="b4f8e877ba5b541db46e3cd26873bc6ecce3929a1b24aec115a124f5b1fda61c" Mar 18 13:11:18 crc kubenswrapper[4975]: I0318 13:11:18.631709 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7pgd4"] Mar 18 13:11:18 crc kubenswrapper[4975]: I0318 13:11:18.642926 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7pgd4"] Mar 18 13:11:18 crc kubenswrapper[4975]: I0318 13:11:18.646990 4975 scope.go:117] "RemoveContainer" containerID="72a6ab3623b7fdee08a936b0e3a1da8c3f30817d1208104552cb10021c5a47c2" Mar 18 13:11:18 crc kubenswrapper[4975]: I0318 13:11:18.680355 4975 scope.go:117] "RemoveContainer" containerID="15d85511b9c69aba94f7c91387635a2268325e2ddcec2c5e46edbefe1e22c330" Mar 18 13:11:18 crc kubenswrapper[4975]: E0318 13:11:18.680836 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15d85511b9c69aba94f7c91387635a2268325e2ddcec2c5e46edbefe1e22c330\": container with ID starting with 15d85511b9c69aba94f7c91387635a2268325e2ddcec2c5e46edbefe1e22c330 not found: ID does not exist" containerID="15d85511b9c69aba94f7c91387635a2268325e2ddcec2c5e46edbefe1e22c330" Mar 18 13:11:18 crc kubenswrapper[4975]: I0318 13:11:18.680895 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15d85511b9c69aba94f7c91387635a2268325e2ddcec2c5e46edbefe1e22c330"} err="failed to get container status \"15d85511b9c69aba94f7c91387635a2268325e2ddcec2c5e46edbefe1e22c330\": rpc error: code = NotFound desc = could not find container \"15d85511b9c69aba94f7c91387635a2268325e2ddcec2c5e46edbefe1e22c330\": container with ID starting with 
15d85511b9c69aba94f7c91387635a2268325e2ddcec2c5e46edbefe1e22c330 not found: ID does not exist" Mar 18 13:11:18 crc kubenswrapper[4975]: I0318 13:11:18.680928 4975 scope.go:117] "RemoveContainer" containerID="b4f8e877ba5b541db46e3cd26873bc6ecce3929a1b24aec115a124f5b1fda61c" Mar 18 13:11:18 crc kubenswrapper[4975]: E0318 13:11:18.681461 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4f8e877ba5b541db46e3cd26873bc6ecce3929a1b24aec115a124f5b1fda61c\": container with ID starting with b4f8e877ba5b541db46e3cd26873bc6ecce3929a1b24aec115a124f5b1fda61c not found: ID does not exist" containerID="b4f8e877ba5b541db46e3cd26873bc6ecce3929a1b24aec115a124f5b1fda61c" Mar 18 13:11:18 crc kubenswrapper[4975]: I0318 13:11:18.681486 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4f8e877ba5b541db46e3cd26873bc6ecce3929a1b24aec115a124f5b1fda61c"} err="failed to get container status \"b4f8e877ba5b541db46e3cd26873bc6ecce3929a1b24aec115a124f5b1fda61c\": rpc error: code = NotFound desc = could not find container \"b4f8e877ba5b541db46e3cd26873bc6ecce3929a1b24aec115a124f5b1fda61c\": container with ID starting with b4f8e877ba5b541db46e3cd26873bc6ecce3929a1b24aec115a124f5b1fda61c not found: ID does not exist" Mar 18 13:11:18 crc kubenswrapper[4975]: I0318 13:11:18.681508 4975 scope.go:117] "RemoveContainer" containerID="72a6ab3623b7fdee08a936b0e3a1da8c3f30817d1208104552cb10021c5a47c2" Mar 18 13:11:18 crc kubenswrapper[4975]: E0318 13:11:18.682616 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72a6ab3623b7fdee08a936b0e3a1da8c3f30817d1208104552cb10021c5a47c2\": container with ID starting with 72a6ab3623b7fdee08a936b0e3a1da8c3f30817d1208104552cb10021c5a47c2 not found: ID does not exist" containerID="72a6ab3623b7fdee08a936b0e3a1da8c3f30817d1208104552cb10021c5a47c2" Mar 18 13:11:18 crc 
kubenswrapper[4975]: I0318 13:11:18.682648 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72a6ab3623b7fdee08a936b0e3a1da8c3f30817d1208104552cb10021c5a47c2"} err="failed to get container status \"72a6ab3623b7fdee08a936b0e3a1da8c3f30817d1208104552cb10021c5a47c2\": rpc error: code = NotFound desc = could not find container \"72a6ab3623b7fdee08a936b0e3a1da8c3f30817d1208104552cb10021c5a47c2\": container with ID starting with 72a6ab3623b7fdee08a936b0e3a1da8c3f30817d1208104552cb10021c5a47c2 not found: ID does not exist"
Mar 18 13:11:19 crc kubenswrapper[4975]: I0318 13:11:19.027376 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6313bb47-eb74-4f25-917d-630031dc93e4" path="/var/lib/kubelet/pods/6313bb47-eb74-4f25-917d-630031dc93e4/volumes"
Mar 18 13:11:21 crc kubenswrapper[4975]: I0318 13:11:21.687470 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9khqk"]
Mar 18 13:11:21 crc kubenswrapper[4975]: E0318 13:11:21.689094 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6313bb47-eb74-4f25-917d-630031dc93e4" containerName="extract-content"
Mar 18 13:11:21 crc kubenswrapper[4975]: I0318 13:11:21.689118 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="6313bb47-eb74-4f25-917d-630031dc93e4" containerName="extract-content"
Mar 18 13:11:21 crc kubenswrapper[4975]: E0318 13:11:21.689165 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6313bb47-eb74-4f25-917d-630031dc93e4" containerName="extract-utilities"
Mar 18 13:11:21 crc kubenswrapper[4975]: I0318 13:11:21.689177 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="6313bb47-eb74-4f25-917d-630031dc93e4" containerName="extract-utilities"
Mar 18 13:11:21 crc kubenswrapper[4975]: E0318 13:11:21.689205 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6313bb47-eb74-4f25-917d-630031dc93e4" containerName="registry-server"
Mar 18 13:11:21 crc kubenswrapper[4975]: I0318 13:11:21.689244 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="6313bb47-eb74-4f25-917d-630031dc93e4" containerName="registry-server"
Mar 18 13:11:21 crc kubenswrapper[4975]: I0318 13:11:21.689720 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="6313bb47-eb74-4f25-917d-630031dc93e4" containerName="registry-server"
Mar 18 13:11:21 crc kubenswrapper[4975]: I0318 13:11:21.693122 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9khqk"
Mar 18 13:11:21 crc kubenswrapper[4975]: I0318 13:11:21.721913 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9khqk"]
Mar 18 13:11:21 crc kubenswrapper[4975]: I0318 13:11:21.724738 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b1aa5e7-73fc-4184-9df9-8c5bd9973469-utilities\") pod \"redhat-marketplace-9khqk\" (UID: \"7b1aa5e7-73fc-4184-9df9-8c5bd9973469\") " pod="openshift-marketplace/redhat-marketplace-9khqk"
Mar 18 13:11:21 crc kubenswrapper[4975]: I0318 13:11:21.724777 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9ljc\" (UniqueName: \"kubernetes.io/projected/7b1aa5e7-73fc-4184-9df9-8c5bd9973469-kube-api-access-b9ljc\") pod \"redhat-marketplace-9khqk\" (UID: \"7b1aa5e7-73fc-4184-9df9-8c5bd9973469\") " pod="openshift-marketplace/redhat-marketplace-9khqk"
Mar 18 13:11:21 crc kubenswrapper[4975]: I0318 13:11:21.724910 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b1aa5e7-73fc-4184-9df9-8c5bd9973469-catalog-content\") pod \"redhat-marketplace-9khqk\" (UID: \"7b1aa5e7-73fc-4184-9df9-8c5bd9973469\") " pod="openshift-marketplace/redhat-marketplace-9khqk"
Mar 18 13:11:21 crc kubenswrapper[4975]: I0318 13:11:21.826466 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b1aa5e7-73fc-4184-9df9-8c5bd9973469-catalog-content\") pod \"redhat-marketplace-9khqk\" (UID: \"7b1aa5e7-73fc-4184-9df9-8c5bd9973469\") " pod="openshift-marketplace/redhat-marketplace-9khqk"
Mar 18 13:11:21 crc kubenswrapper[4975]: I0318 13:11:21.826803 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b1aa5e7-73fc-4184-9df9-8c5bd9973469-utilities\") pod \"redhat-marketplace-9khqk\" (UID: \"7b1aa5e7-73fc-4184-9df9-8c5bd9973469\") " pod="openshift-marketplace/redhat-marketplace-9khqk"
Mar 18 13:11:21 crc kubenswrapper[4975]: I0318 13:11:21.827003 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9ljc\" (UniqueName: \"kubernetes.io/projected/7b1aa5e7-73fc-4184-9df9-8c5bd9973469-kube-api-access-b9ljc\") pod \"redhat-marketplace-9khqk\" (UID: \"7b1aa5e7-73fc-4184-9df9-8c5bd9973469\") " pod="openshift-marketplace/redhat-marketplace-9khqk"
Mar 18 13:11:21 crc kubenswrapper[4975]: I0318 13:11:21.827227 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b1aa5e7-73fc-4184-9df9-8c5bd9973469-catalog-content\") pod \"redhat-marketplace-9khqk\" (UID: \"7b1aa5e7-73fc-4184-9df9-8c5bd9973469\") " pod="openshift-marketplace/redhat-marketplace-9khqk"
Mar 18 13:11:21 crc kubenswrapper[4975]: I0318 13:11:21.827320 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b1aa5e7-73fc-4184-9df9-8c5bd9973469-utilities\") pod \"redhat-marketplace-9khqk\" (UID: \"7b1aa5e7-73fc-4184-9df9-8c5bd9973469\") " pod="openshift-marketplace/redhat-marketplace-9khqk"
Mar 18 13:11:21 crc kubenswrapper[4975]: I0318 13:11:21.846589 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9ljc\" (UniqueName: \"kubernetes.io/projected/7b1aa5e7-73fc-4184-9df9-8c5bd9973469-kube-api-access-b9ljc\") pod \"redhat-marketplace-9khqk\" (UID: \"7b1aa5e7-73fc-4184-9df9-8c5bd9973469\") " pod="openshift-marketplace/redhat-marketplace-9khqk"
Mar 18 13:11:22 crc kubenswrapper[4975]: I0318 13:11:22.052485 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9khqk"
Mar 18 13:11:22 crc kubenswrapper[4975]: I0318 13:11:22.595205 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9khqk"]
Mar 18 13:11:22 crc kubenswrapper[4975]: I0318 13:11:22.627302 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9khqk" event={"ID":"7b1aa5e7-73fc-4184-9df9-8c5bd9973469","Type":"ContainerStarted","Data":"947a8637ba1dfd0ef7dcf1cf844f9b60bed2b1f98e5f5e03973a06504b7e07b7"}
Mar 18 13:11:23 crc kubenswrapper[4975]: I0318 13:11:23.639036 4975 generic.go:334] "Generic (PLEG): container finished" podID="7b1aa5e7-73fc-4184-9df9-8c5bd9973469" containerID="acdbd9e8431775edb13080778791a7d48df0ecf26b33c7dc74eb8773225eec17" exitCode=0
Mar 18 13:11:23 crc kubenswrapper[4975]: I0318 13:11:23.639127 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9khqk" event={"ID":"7b1aa5e7-73fc-4184-9df9-8c5bd9973469","Type":"ContainerDied","Data":"acdbd9e8431775edb13080778791a7d48df0ecf26b33c7dc74eb8773225eec17"}
Mar 18 13:11:24 crc kubenswrapper[4975]: I0318 13:11:24.648753 4975 generic.go:334] "Generic (PLEG): container finished" podID="7b1aa5e7-73fc-4184-9df9-8c5bd9973469" containerID="70efcc4302fc0d7d96781f6ba1c72abb352c93793cae3e02291d20b7eab136e5" exitCode=0
Mar 18 13:11:24 crc kubenswrapper[4975]: I0318 13:11:24.648822 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9khqk" event={"ID":"7b1aa5e7-73fc-4184-9df9-8c5bd9973469","Type":"ContainerDied","Data":"70efcc4302fc0d7d96781f6ba1c72abb352c93793cae3e02291d20b7eab136e5"}
Mar 18 13:11:25 crc kubenswrapper[4975]: I0318 13:11:25.022059 4975 scope.go:117] "RemoveContainer" containerID="d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3"
Mar 18 13:11:25 crc kubenswrapper[4975]: E0318 13:11:25.022323 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 13:11:25 crc kubenswrapper[4975]: I0318 13:11:25.672246 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9khqk" event={"ID":"7b1aa5e7-73fc-4184-9df9-8c5bd9973469","Type":"ContainerStarted","Data":"13a73144d2247e42b23196d5d533c068e2de309489f3f55623c602f1580be0ce"}
Mar 18 13:11:25 crc kubenswrapper[4975]: I0318 13:11:25.696783 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9khqk" podStartSLOduration=3.272767066 podStartE2EDuration="4.696767721s" podCreationTimestamp="2026-03-18 13:11:21 +0000 UTC" firstStartedPulling="2026-03-18 13:11:23.641273455 +0000 UTC m=+3669.355674034" lastFinishedPulling="2026-03-18 13:11:25.06527411 +0000 UTC m=+3670.779674689" observedRunningTime="2026-03-18 13:11:25.694137379 +0000 UTC m=+3671.408537958" watchObservedRunningTime="2026-03-18 13:11:25.696767721 +0000 UTC m=+3671.411168300"
Mar 18 13:11:32 crc kubenswrapper[4975]: I0318 13:11:32.053194 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9khqk"
Mar 18 13:11:32 crc kubenswrapper[4975]: I0318 13:11:32.053720 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9khqk"
Mar 18 13:11:32 crc kubenswrapper[4975]: I0318 13:11:32.108348 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9khqk"
Mar 18 13:11:32 crc kubenswrapper[4975]: I0318 13:11:32.800959 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9khqk"
Mar 18 13:11:32 crc kubenswrapper[4975]: I0318 13:11:32.846858 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9khqk"]
Mar 18 13:11:34 crc kubenswrapper[4975]: I0318 13:11:34.757499 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9khqk" podUID="7b1aa5e7-73fc-4184-9df9-8c5bd9973469" containerName="registry-server" containerID="cri-o://13a73144d2247e42b23196d5d533c068e2de309489f3f55623c602f1580be0ce" gracePeriod=2
Mar 18 13:11:35 crc kubenswrapper[4975]: I0318 13:11:35.210637 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9khqk"
Mar 18 13:11:35 crc kubenswrapper[4975]: I0318 13:11:35.300858 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b1aa5e7-73fc-4184-9df9-8c5bd9973469-catalog-content\") pod \"7b1aa5e7-73fc-4184-9df9-8c5bd9973469\" (UID: \"7b1aa5e7-73fc-4184-9df9-8c5bd9973469\") "
Mar 18 13:11:35 crc kubenswrapper[4975]: I0318 13:11:35.300973 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9ljc\" (UniqueName: \"kubernetes.io/projected/7b1aa5e7-73fc-4184-9df9-8c5bd9973469-kube-api-access-b9ljc\") pod \"7b1aa5e7-73fc-4184-9df9-8c5bd9973469\" (UID: \"7b1aa5e7-73fc-4184-9df9-8c5bd9973469\") "
Mar 18 13:11:35 crc kubenswrapper[4975]: I0318 13:11:35.301068 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b1aa5e7-73fc-4184-9df9-8c5bd9973469-utilities\") pod \"7b1aa5e7-73fc-4184-9df9-8c5bd9973469\" (UID: \"7b1aa5e7-73fc-4184-9df9-8c5bd9973469\") "
Mar 18 13:11:35 crc kubenswrapper[4975]: I0318 13:11:35.302656 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b1aa5e7-73fc-4184-9df9-8c5bd9973469-utilities" (OuterVolumeSpecName: "utilities") pod "7b1aa5e7-73fc-4184-9df9-8c5bd9973469" (UID: "7b1aa5e7-73fc-4184-9df9-8c5bd9973469"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 13:11:35 crc kubenswrapper[4975]: I0318 13:11:35.307961 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b1aa5e7-73fc-4184-9df9-8c5bd9973469-kube-api-access-b9ljc" (OuterVolumeSpecName: "kube-api-access-b9ljc") pod "7b1aa5e7-73fc-4184-9df9-8c5bd9973469" (UID: "7b1aa5e7-73fc-4184-9df9-8c5bd9973469"). InnerVolumeSpecName "kube-api-access-b9ljc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:11:35 crc kubenswrapper[4975]: I0318 13:11:35.341053 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b1aa5e7-73fc-4184-9df9-8c5bd9973469-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b1aa5e7-73fc-4184-9df9-8c5bd9973469" (UID: "7b1aa5e7-73fc-4184-9df9-8c5bd9973469"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 13:11:35 crc kubenswrapper[4975]: I0318 13:11:35.404104 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b1aa5e7-73fc-4184-9df9-8c5bd9973469-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 13:11:35 crc kubenswrapper[4975]: I0318 13:11:35.404156 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9ljc\" (UniqueName: \"kubernetes.io/projected/7b1aa5e7-73fc-4184-9df9-8c5bd9973469-kube-api-access-b9ljc\") on node \"crc\" DevicePath \"\""
Mar 18 13:11:35 crc kubenswrapper[4975]: I0318 13:11:35.404177 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b1aa5e7-73fc-4184-9df9-8c5bd9973469-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 13:11:35 crc kubenswrapper[4975]: I0318 13:11:35.768589 4975 generic.go:334] "Generic (PLEG): container finished" podID="7b1aa5e7-73fc-4184-9df9-8c5bd9973469" containerID="13a73144d2247e42b23196d5d533c068e2de309489f3f55623c602f1580be0ce" exitCode=0
Mar 18 13:11:35 crc kubenswrapper[4975]: I0318 13:11:35.768669 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9khqk" event={"ID":"7b1aa5e7-73fc-4184-9df9-8c5bd9973469","Type":"ContainerDied","Data":"13a73144d2247e42b23196d5d533c068e2de309489f3f55623c602f1580be0ce"}
Mar 18 13:11:35 crc kubenswrapper[4975]: I0318 13:11:35.768709 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9khqk" event={"ID":"7b1aa5e7-73fc-4184-9df9-8c5bd9973469","Type":"ContainerDied","Data":"947a8637ba1dfd0ef7dcf1cf844f9b60bed2b1f98e5f5e03973a06504b7e07b7"}
Mar 18 13:11:35 crc kubenswrapper[4975]: I0318 13:11:35.768716 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9khqk"
Mar 18 13:11:35 crc kubenswrapper[4975]: I0318 13:11:35.768750 4975 scope.go:117] "RemoveContainer" containerID="13a73144d2247e42b23196d5d533c068e2de309489f3f55623c602f1580be0ce"
Mar 18 13:11:35 crc kubenswrapper[4975]: I0318 13:11:35.792427 4975 scope.go:117] "RemoveContainer" containerID="70efcc4302fc0d7d96781f6ba1c72abb352c93793cae3e02291d20b7eab136e5"
Mar 18 13:11:35 crc kubenswrapper[4975]: I0318 13:11:35.808696 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9khqk"]
Mar 18 13:11:35 crc kubenswrapper[4975]: I0318 13:11:35.816152 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9khqk"]
Mar 18 13:11:35 crc kubenswrapper[4975]: I0318 13:11:35.824701 4975 scope.go:117] "RemoveContainer" containerID="acdbd9e8431775edb13080778791a7d48df0ecf26b33c7dc74eb8773225eec17"
Mar 18 13:11:35 crc kubenswrapper[4975]: I0318 13:11:35.854098 4975 scope.go:117] "RemoveContainer" containerID="13a73144d2247e42b23196d5d533c068e2de309489f3f55623c602f1580be0ce"
Mar 18 13:11:35 crc kubenswrapper[4975]: E0318 13:11:35.854490 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13a73144d2247e42b23196d5d533c068e2de309489f3f55623c602f1580be0ce\": container with ID starting with 13a73144d2247e42b23196d5d533c068e2de309489f3f55623c602f1580be0ce not found: ID does not exist" containerID="13a73144d2247e42b23196d5d533c068e2de309489f3f55623c602f1580be0ce"
Mar 18 13:11:35 crc kubenswrapper[4975]: I0318 13:11:35.854527 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13a73144d2247e42b23196d5d533c068e2de309489f3f55623c602f1580be0ce"} err="failed to get container status \"13a73144d2247e42b23196d5d533c068e2de309489f3f55623c602f1580be0ce\": rpc error: code = NotFound desc = could not find container \"13a73144d2247e42b23196d5d533c068e2de309489f3f55623c602f1580be0ce\": container with ID starting with 13a73144d2247e42b23196d5d533c068e2de309489f3f55623c602f1580be0ce not found: ID does not exist"
Mar 18 13:11:35 crc kubenswrapper[4975]: I0318 13:11:35.854547 4975 scope.go:117] "RemoveContainer" containerID="70efcc4302fc0d7d96781f6ba1c72abb352c93793cae3e02291d20b7eab136e5"
Mar 18 13:11:35 crc kubenswrapper[4975]: E0318 13:11:35.854955 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70efcc4302fc0d7d96781f6ba1c72abb352c93793cae3e02291d20b7eab136e5\": container with ID starting with 70efcc4302fc0d7d96781f6ba1c72abb352c93793cae3e02291d20b7eab136e5 not found: ID does not exist" containerID="70efcc4302fc0d7d96781f6ba1c72abb352c93793cae3e02291d20b7eab136e5"
Mar 18 13:11:35 crc kubenswrapper[4975]: I0318 13:11:35.855008 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70efcc4302fc0d7d96781f6ba1c72abb352c93793cae3e02291d20b7eab136e5"} err="failed to get container status \"70efcc4302fc0d7d96781f6ba1c72abb352c93793cae3e02291d20b7eab136e5\": rpc error: code = NotFound desc = could not find container \"70efcc4302fc0d7d96781f6ba1c72abb352c93793cae3e02291d20b7eab136e5\": container with ID starting with 70efcc4302fc0d7d96781f6ba1c72abb352c93793cae3e02291d20b7eab136e5 not found: ID does not exist"
Mar 18 13:11:35 crc kubenswrapper[4975]: I0318 13:11:35.855042 4975 scope.go:117] "RemoveContainer" containerID="acdbd9e8431775edb13080778791a7d48df0ecf26b33c7dc74eb8773225eec17"
Mar 18 13:11:35 crc kubenswrapper[4975]: E0318 13:11:35.855343 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acdbd9e8431775edb13080778791a7d48df0ecf26b33c7dc74eb8773225eec17\": container with ID starting with acdbd9e8431775edb13080778791a7d48df0ecf26b33c7dc74eb8773225eec17 not found: ID does not exist" containerID="acdbd9e8431775edb13080778791a7d48df0ecf26b33c7dc74eb8773225eec17"
Mar 18 13:11:35 crc kubenswrapper[4975]: I0318 13:11:35.855371 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acdbd9e8431775edb13080778791a7d48df0ecf26b33c7dc74eb8773225eec17"} err="failed to get container status \"acdbd9e8431775edb13080778791a7d48df0ecf26b33c7dc74eb8773225eec17\": rpc error: code = NotFound desc = could not find container \"acdbd9e8431775edb13080778791a7d48df0ecf26b33c7dc74eb8773225eec17\": container with ID starting with acdbd9e8431775edb13080778791a7d48df0ecf26b33c7dc74eb8773225eec17 not found: ID does not exist"
Mar 18 13:11:37 crc kubenswrapper[4975]: I0318 13:11:37.026663 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b1aa5e7-73fc-4184-9df9-8c5bd9973469" path="/var/lib/kubelet/pods/7b1aa5e7-73fc-4184-9df9-8c5bd9973469/volumes"
Mar 18 13:11:39 crc kubenswrapper[4975]: I0318 13:11:39.016598 4975 scope.go:117] "RemoveContainer" containerID="d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3"
Mar 18 13:11:39 crc kubenswrapper[4975]: E0318 13:11:39.017208 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 13:11:50 crc kubenswrapper[4975]: I0318 13:11:50.018209 4975 scope.go:117] "RemoveContainer" containerID="d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3"
Mar 18 13:11:50 crc kubenswrapper[4975]: E0318 13:11:50.019612 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 13:12:00 crc kubenswrapper[4975]: I0318 13:12:00.148920 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563992-7q8vw"]
Mar 18 13:12:00 crc kubenswrapper[4975]: E0318 13:12:00.150909 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b1aa5e7-73fc-4184-9df9-8c5bd9973469" containerName="extract-content"
Mar 18 13:12:00 crc kubenswrapper[4975]: I0318 13:12:00.150932 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b1aa5e7-73fc-4184-9df9-8c5bd9973469" containerName="extract-content"
Mar 18 13:12:00 crc kubenswrapper[4975]: E0318 13:12:00.150949 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b1aa5e7-73fc-4184-9df9-8c5bd9973469" containerName="registry-server"
Mar 18 13:12:00 crc kubenswrapper[4975]: I0318 13:12:00.150956 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b1aa5e7-73fc-4184-9df9-8c5bd9973469" containerName="registry-server"
Mar 18 13:12:00 crc kubenswrapper[4975]: E0318 13:12:00.150971 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b1aa5e7-73fc-4184-9df9-8c5bd9973469" containerName="extract-utilities"
Mar 18 13:12:00 crc kubenswrapper[4975]: I0318 13:12:00.150980 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b1aa5e7-73fc-4184-9df9-8c5bd9973469" containerName="extract-utilities"
Mar 18 13:12:00 crc kubenswrapper[4975]: I0318 13:12:00.151196 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b1aa5e7-73fc-4184-9df9-8c5bd9973469" containerName="registry-server"
Mar 18 13:12:00 crc kubenswrapper[4975]: I0318 13:12:00.151888 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563992-7q8vw"
Mar 18 13:12:00 crc kubenswrapper[4975]: I0318 13:12:00.153963 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 13:12:00 crc kubenswrapper[4975]: I0318 13:12:00.154504 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 13:12:00 crc kubenswrapper[4975]: I0318 13:12:00.154643 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8"
Mar 18 13:12:00 crc kubenswrapper[4975]: I0318 13:12:00.161342 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563992-7q8vw"]
Mar 18 13:12:00 crc kubenswrapper[4975]: I0318 13:12:00.257845 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzjtt\" (UniqueName: \"kubernetes.io/projected/d7747fe7-e1e2-4a73-b4e6-6d7407fb0ed9-kube-api-access-qzjtt\") pod \"auto-csr-approver-29563992-7q8vw\" (UID: \"d7747fe7-e1e2-4a73-b4e6-6d7407fb0ed9\") " pod="openshift-infra/auto-csr-approver-29563992-7q8vw"
Mar 18 13:12:00 crc kubenswrapper[4975]: I0318 13:12:00.360547 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzjtt\" (UniqueName: \"kubernetes.io/projected/d7747fe7-e1e2-4a73-b4e6-6d7407fb0ed9-kube-api-access-qzjtt\") pod \"auto-csr-approver-29563992-7q8vw\" (UID: \"d7747fe7-e1e2-4a73-b4e6-6d7407fb0ed9\") " pod="openshift-infra/auto-csr-approver-29563992-7q8vw"
Mar 18 13:12:00 crc kubenswrapper[4975]: I0318 13:12:00.378709 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzjtt\" (UniqueName: \"kubernetes.io/projected/d7747fe7-e1e2-4a73-b4e6-6d7407fb0ed9-kube-api-access-qzjtt\") pod \"auto-csr-approver-29563992-7q8vw\" (UID: \"d7747fe7-e1e2-4a73-b4e6-6d7407fb0ed9\") " pod="openshift-infra/auto-csr-approver-29563992-7q8vw"
Mar 18 13:12:00 crc kubenswrapper[4975]: I0318 13:12:00.472252 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563992-7q8vw"
Mar 18 13:12:00 crc kubenswrapper[4975]: I0318 13:12:00.911899 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563992-7q8vw"]
Mar 18 13:12:01 crc kubenswrapper[4975]: I0318 13:12:01.025412 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563992-7q8vw" event={"ID":"d7747fe7-e1e2-4a73-b4e6-6d7407fb0ed9","Type":"ContainerStarted","Data":"74ec87f70b1898bdbbb0aaffdc3e6feb31c35297a339e95b33745c8ad7149925"}
Mar 18 13:12:03 crc kubenswrapper[4975]: I0318 13:12:03.040849 4975 generic.go:334] "Generic (PLEG): container finished" podID="d7747fe7-e1e2-4a73-b4e6-6d7407fb0ed9" containerID="523c2dba508106ee79fccf6321b9ce35d52b0de61bc216862c083bb99a04facf" exitCode=0
Mar 18 13:12:03 crc kubenswrapper[4975]: I0318 13:12:03.041211 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563992-7q8vw" event={"ID":"d7747fe7-e1e2-4a73-b4e6-6d7407fb0ed9","Type":"ContainerDied","Data":"523c2dba508106ee79fccf6321b9ce35d52b0de61bc216862c083bb99a04facf"}
Mar 18 13:12:04 crc kubenswrapper[4975]: I0318 13:12:04.016561 4975 scope.go:117] "RemoveContainer" containerID="d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3"
Mar 18 13:12:04 crc kubenswrapper[4975]: E0318 13:12:04.017276 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 13:12:04 crc kubenswrapper[4975]: I0318 13:12:04.434589 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563992-7q8vw"
Mar 18 13:12:04 crc kubenswrapper[4975]: I0318 13:12:04.566047 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzjtt\" (UniqueName: \"kubernetes.io/projected/d7747fe7-e1e2-4a73-b4e6-6d7407fb0ed9-kube-api-access-qzjtt\") pod \"d7747fe7-e1e2-4a73-b4e6-6d7407fb0ed9\" (UID: \"d7747fe7-e1e2-4a73-b4e6-6d7407fb0ed9\") "
Mar 18 13:12:04 crc kubenswrapper[4975]: I0318 13:12:04.574077 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7747fe7-e1e2-4a73-b4e6-6d7407fb0ed9-kube-api-access-qzjtt" (OuterVolumeSpecName: "kube-api-access-qzjtt") pod "d7747fe7-e1e2-4a73-b4e6-6d7407fb0ed9" (UID: "d7747fe7-e1e2-4a73-b4e6-6d7407fb0ed9"). InnerVolumeSpecName "kube-api-access-qzjtt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:12:04 crc kubenswrapper[4975]: I0318 13:12:04.669403 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzjtt\" (UniqueName: \"kubernetes.io/projected/d7747fe7-e1e2-4a73-b4e6-6d7407fb0ed9-kube-api-access-qzjtt\") on node \"crc\" DevicePath \"\""
Mar 18 13:12:05 crc kubenswrapper[4975]: I0318 13:12:05.074655 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563992-7q8vw" event={"ID":"d7747fe7-e1e2-4a73-b4e6-6d7407fb0ed9","Type":"ContainerDied","Data":"74ec87f70b1898bdbbb0aaffdc3e6feb31c35297a339e95b33745c8ad7149925"}
Mar 18 13:12:05 crc kubenswrapper[4975]: I0318 13:12:05.074705 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74ec87f70b1898bdbbb0aaffdc3e6feb31c35297a339e95b33745c8ad7149925"
Mar 18 13:12:05 crc kubenswrapper[4975]: I0318 13:12:05.074749 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563992-7q8vw"
Mar 18 13:12:05 crc kubenswrapper[4975]: I0318 13:12:05.508377 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563986-279c5"]
Mar 18 13:12:05 crc kubenswrapper[4975]: I0318 13:12:05.516208 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563986-279c5"]
Mar 18 13:12:07 crc kubenswrapper[4975]: I0318 13:12:07.033580 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e9fa60c-b743-4dc4-a44f-a5a8a3f290cb" path="/var/lib/kubelet/pods/0e9fa60c-b743-4dc4-a44f-a5a8a3f290cb/volumes"
Mar 18 13:12:15 crc kubenswrapper[4975]: I0318 13:12:15.025409 4975 scope.go:117] "RemoveContainer" containerID="d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3"
Mar 18 13:12:15 crc kubenswrapper[4975]: E0318 13:12:15.026382 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 13:12:29 crc kubenswrapper[4975]: I0318 13:12:29.152361 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8rjf2"]
Mar 18 13:12:29 crc kubenswrapper[4975]: E0318 13:12:29.153443 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7747fe7-e1e2-4a73-b4e6-6d7407fb0ed9" containerName="oc"
Mar 18 13:12:29 crc kubenswrapper[4975]: I0318 13:12:29.153466 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7747fe7-e1e2-4a73-b4e6-6d7407fb0ed9" containerName="oc"
Mar 18 13:12:29 crc kubenswrapper[4975]: I0318 13:12:29.153765 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7747fe7-e1e2-4a73-b4e6-6d7407fb0ed9" containerName="oc"
Mar 18 13:12:29 crc kubenswrapper[4975]: I0318 13:12:29.155467 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8rjf2"
Mar 18 13:12:29 crc kubenswrapper[4975]: I0318 13:12:29.171677 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8rjf2"]
Mar 18 13:12:29 crc kubenswrapper[4975]: I0318 13:12:29.201407 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/293fea73-46a5-40aa-9e09-85abd80ef683-utilities\") pod \"certified-operators-8rjf2\" (UID: \"293fea73-46a5-40aa-9e09-85abd80ef683\") " pod="openshift-marketplace/certified-operators-8rjf2"
Mar 18 13:12:29 crc kubenswrapper[4975]: I0318 13:12:29.201473 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/293fea73-46a5-40aa-9e09-85abd80ef683-catalog-content\") pod \"certified-operators-8rjf2\" (UID: \"293fea73-46a5-40aa-9e09-85abd80ef683\") " pod="openshift-marketplace/certified-operators-8rjf2"
Mar 18 13:12:29 crc kubenswrapper[4975]: I0318 13:12:29.201510 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzzp8\" (UniqueName: \"kubernetes.io/projected/293fea73-46a5-40aa-9e09-85abd80ef683-kube-api-access-tzzp8\") pod \"certified-operators-8rjf2\" (UID: \"293fea73-46a5-40aa-9e09-85abd80ef683\") " pod="openshift-marketplace/certified-operators-8rjf2"
Mar 18 13:12:29 crc kubenswrapper[4975]: I0318 13:12:29.303306 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/293fea73-46a5-40aa-9e09-85abd80ef683-utilities\") pod \"certified-operators-8rjf2\" (UID: \"293fea73-46a5-40aa-9e09-85abd80ef683\") " pod="openshift-marketplace/certified-operators-8rjf2"
Mar 18 13:12:29 crc kubenswrapper[4975]: I0318 13:12:29.303360 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/293fea73-46a5-40aa-9e09-85abd80ef683-catalog-content\") pod \"certified-operators-8rjf2\" (UID: \"293fea73-46a5-40aa-9e09-85abd80ef683\") " pod="openshift-marketplace/certified-operators-8rjf2"
Mar 18 13:12:29 crc kubenswrapper[4975]: I0318 13:12:29.303393 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzzp8\" (UniqueName: \"kubernetes.io/projected/293fea73-46a5-40aa-9e09-85abd80ef683-kube-api-access-tzzp8\") pod \"certified-operators-8rjf2\" (UID: \"293fea73-46a5-40aa-9e09-85abd80ef683\") " pod="openshift-marketplace/certified-operators-8rjf2"
Mar 18 13:12:29 crc kubenswrapper[4975]: I0318 13:12:29.303791 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/293fea73-46a5-40aa-9e09-85abd80ef683-utilities\") pod \"certified-operators-8rjf2\" (UID: \"293fea73-46a5-40aa-9e09-85abd80ef683\") " pod="openshift-marketplace/certified-operators-8rjf2"
Mar 18 13:12:29 crc kubenswrapper[4975]: I0318 13:12:29.304001 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/293fea73-46a5-40aa-9e09-85abd80ef683-catalog-content\") pod \"certified-operators-8rjf2\" (UID: \"293fea73-46a5-40aa-9e09-85abd80ef683\") " pod="openshift-marketplace/certified-operators-8rjf2"
Mar 18 13:12:29 crc kubenswrapper[4975]: I0318 13:12:29.333096 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzzp8\" (UniqueName: \"kubernetes.io/projected/293fea73-46a5-40aa-9e09-85abd80ef683-kube-api-access-tzzp8\") pod \"certified-operators-8rjf2\" (UID: \"293fea73-46a5-40aa-9e09-85abd80ef683\") " pod="openshift-marketplace/certified-operators-8rjf2"
Mar 18 13:12:29 crc kubenswrapper[4975]: I0318 13:12:29.483763 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8rjf2"
Mar 18 13:12:29 crc kubenswrapper[4975]: I0318 13:12:29.971732 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8rjf2"]
Mar 18 13:12:30 crc kubenswrapper[4975]: I0318 13:12:30.017166 4975 scope.go:117] "RemoveContainer" containerID="d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3"
Mar 18 13:12:30 crc kubenswrapper[4975]: E0318 13:12:30.017457 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 13:12:30 crc kubenswrapper[4975]: I0318 13:12:30.312038 4975 generic.go:334] "Generic (PLEG): container finished" podID="293fea73-46a5-40aa-9e09-85abd80ef683" containerID="4874a12b4e99222c41659bc6582d5411483e9a03c0fd62463bf7d3a8f47a7b36" exitCode=0
Mar 18 13:12:30 crc kubenswrapper[4975]: I0318 13:12:30.312131 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rjf2" event={"ID":"293fea73-46a5-40aa-9e09-85abd80ef683","Type":"ContainerDied","Data":"4874a12b4e99222c41659bc6582d5411483e9a03c0fd62463bf7d3a8f47a7b36"}
Mar 18 13:12:30 crc kubenswrapper[4975]: I0318 13:12:30.312183 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rjf2" event={"ID":"293fea73-46a5-40aa-9e09-85abd80ef683","Type":"ContainerStarted","Data":"e094283bef274d1b812e6e1a17d331cd67bb0a171efa3a89485df9c183c9be2a"}
Mar 18 13:12:31 crc kubenswrapper[4975]: I0318 13:12:31.323312 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rjf2" event={"ID":"293fea73-46a5-40aa-9e09-85abd80ef683","Type":"ContainerStarted","Data":"07d58e19a6a4f0602c048fe4a821da9bab9721ca07d1eeac21cb61140289f906"}
Mar 18 13:12:32 crc kubenswrapper[4975]: I0318 13:12:32.336449 4975 generic.go:334] "Generic (PLEG): container finished" podID="293fea73-46a5-40aa-9e09-85abd80ef683" containerID="07d58e19a6a4f0602c048fe4a821da9bab9721ca07d1eeac21cb61140289f906" exitCode=0
Mar 18 13:12:32 crc kubenswrapper[4975]: I0318 13:12:32.336744 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rjf2" event={"ID":"293fea73-46a5-40aa-9e09-85abd80ef683","Type":"ContainerDied","Data":"07d58e19a6a4f0602c048fe4a821da9bab9721ca07d1eeac21cb61140289f906"}
Mar 18 13:12:33 crc kubenswrapper[4975]: I0318 13:12:33.345089 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rjf2" event={"ID":"293fea73-46a5-40aa-9e09-85abd80ef683","Type":"ContainerStarted","Data":"c5a6b97e5a3e0d8b17a154c410d01cf1edd2af8d76bf19bef4f43a1e47f53b3a"}
Mar 18 13:12:33 crc kubenswrapper[4975]: I0318 13:12:33.371604 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8rjf2" podStartSLOduration=1.925155985 podStartE2EDuration="4.371558815s" podCreationTimestamp="2026-03-18 13:12:29 +0000 UTC" firstStartedPulling="2026-03-18 13:12:30.31484599 +0000 UTC m=+3736.029246609" lastFinishedPulling="2026-03-18 13:12:32.76124886 +0000 UTC m=+3738.475649439" observedRunningTime="2026-03-18 13:12:33.363227198 +0000 UTC m=+3739.077627777" watchObservedRunningTime="2026-03-18 13:12:33.371558815 +0000 UTC m=+3739.085959394"
Mar 18 13:12:39 crc kubenswrapper[4975]: I0318 13:12:39.484690 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8rjf2"
Mar 18 13:12:39 crc kubenswrapper[4975]: I0318 13:12:39.485335 4975 kubelet.go:2542] "SyncLoop (probe)"
probe="readiness" status="" pod="openshift-marketplace/certified-operators-8rjf2" Mar 18 13:12:39 crc kubenswrapper[4975]: I0318 13:12:39.556578 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8rjf2" Mar 18 13:12:40 crc kubenswrapper[4975]: I0318 13:12:40.466591 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8rjf2" Mar 18 13:12:40 crc kubenswrapper[4975]: I0318 13:12:40.518493 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8rjf2"] Mar 18 13:12:42 crc kubenswrapper[4975]: I0318 13:12:42.440536 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8rjf2" podUID="293fea73-46a5-40aa-9e09-85abd80ef683" containerName="registry-server" containerID="cri-o://c5a6b97e5a3e0d8b17a154c410d01cf1edd2af8d76bf19bef4f43a1e47f53b3a" gracePeriod=2 Mar 18 13:12:42 crc kubenswrapper[4975]: I0318 13:12:42.891633 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8rjf2" Mar 18 13:12:43 crc kubenswrapper[4975]: I0318 13:12:43.069442 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/293fea73-46a5-40aa-9e09-85abd80ef683-utilities\") pod \"293fea73-46a5-40aa-9e09-85abd80ef683\" (UID: \"293fea73-46a5-40aa-9e09-85abd80ef683\") " Mar 18 13:12:43 crc kubenswrapper[4975]: I0318 13:12:43.069509 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzzp8\" (UniqueName: \"kubernetes.io/projected/293fea73-46a5-40aa-9e09-85abd80ef683-kube-api-access-tzzp8\") pod \"293fea73-46a5-40aa-9e09-85abd80ef683\" (UID: \"293fea73-46a5-40aa-9e09-85abd80ef683\") " Mar 18 13:12:43 crc kubenswrapper[4975]: I0318 13:12:43.069547 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/293fea73-46a5-40aa-9e09-85abd80ef683-catalog-content\") pod \"293fea73-46a5-40aa-9e09-85abd80ef683\" (UID: \"293fea73-46a5-40aa-9e09-85abd80ef683\") " Mar 18 13:12:43 crc kubenswrapper[4975]: I0318 13:12:43.071475 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/293fea73-46a5-40aa-9e09-85abd80ef683-utilities" (OuterVolumeSpecName: "utilities") pod "293fea73-46a5-40aa-9e09-85abd80ef683" (UID: "293fea73-46a5-40aa-9e09-85abd80ef683"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:12:43 crc kubenswrapper[4975]: I0318 13:12:43.075073 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/293fea73-46a5-40aa-9e09-85abd80ef683-kube-api-access-tzzp8" (OuterVolumeSpecName: "kube-api-access-tzzp8") pod "293fea73-46a5-40aa-9e09-85abd80ef683" (UID: "293fea73-46a5-40aa-9e09-85abd80ef683"). InnerVolumeSpecName "kube-api-access-tzzp8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:12:43 crc kubenswrapper[4975]: I0318 13:12:43.137248 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/293fea73-46a5-40aa-9e09-85abd80ef683-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "293fea73-46a5-40aa-9e09-85abd80ef683" (UID: "293fea73-46a5-40aa-9e09-85abd80ef683"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:12:43 crc kubenswrapper[4975]: I0318 13:12:43.171414 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/293fea73-46a5-40aa-9e09-85abd80ef683-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:12:43 crc kubenswrapper[4975]: I0318 13:12:43.171443 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzzp8\" (UniqueName: \"kubernetes.io/projected/293fea73-46a5-40aa-9e09-85abd80ef683-kube-api-access-tzzp8\") on node \"crc\" DevicePath \"\"" Mar 18 13:12:43 crc kubenswrapper[4975]: I0318 13:12:43.171453 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/293fea73-46a5-40aa-9e09-85abd80ef683-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:12:43 crc kubenswrapper[4975]: I0318 13:12:43.456374 4975 generic.go:334] "Generic (PLEG): container finished" podID="293fea73-46a5-40aa-9e09-85abd80ef683" containerID="c5a6b97e5a3e0d8b17a154c410d01cf1edd2af8d76bf19bef4f43a1e47f53b3a" exitCode=0 Mar 18 13:12:43 crc kubenswrapper[4975]: I0318 13:12:43.456427 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rjf2" event={"ID":"293fea73-46a5-40aa-9e09-85abd80ef683","Type":"ContainerDied","Data":"c5a6b97e5a3e0d8b17a154c410d01cf1edd2af8d76bf19bef4f43a1e47f53b3a"} Mar 18 13:12:43 crc kubenswrapper[4975]: I0318 13:12:43.456745 4975 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-8rjf2" event={"ID":"293fea73-46a5-40aa-9e09-85abd80ef683","Type":"ContainerDied","Data":"e094283bef274d1b812e6e1a17d331cd67bb0a171efa3a89485df9c183c9be2a"} Mar 18 13:12:43 crc kubenswrapper[4975]: I0318 13:12:43.456481 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8rjf2" Mar 18 13:12:43 crc kubenswrapper[4975]: I0318 13:12:43.456770 4975 scope.go:117] "RemoveContainer" containerID="c5a6b97e5a3e0d8b17a154c410d01cf1edd2af8d76bf19bef4f43a1e47f53b3a" Mar 18 13:12:43 crc kubenswrapper[4975]: I0318 13:12:43.491732 4975 scope.go:117] "RemoveContainer" containerID="07d58e19a6a4f0602c048fe4a821da9bab9721ca07d1eeac21cb61140289f906" Mar 18 13:12:43 crc kubenswrapper[4975]: I0318 13:12:43.495044 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8rjf2"] Mar 18 13:12:43 crc kubenswrapper[4975]: I0318 13:12:43.505558 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8rjf2"] Mar 18 13:12:43 crc kubenswrapper[4975]: I0318 13:12:43.513509 4975 scope.go:117] "RemoveContainer" containerID="4874a12b4e99222c41659bc6582d5411483e9a03c0fd62463bf7d3a8f47a7b36" Mar 18 13:12:43 crc kubenswrapper[4975]: I0318 13:12:43.571141 4975 scope.go:117] "RemoveContainer" containerID="c5a6b97e5a3e0d8b17a154c410d01cf1edd2af8d76bf19bef4f43a1e47f53b3a" Mar 18 13:12:43 crc kubenswrapper[4975]: E0318 13:12:43.572322 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5a6b97e5a3e0d8b17a154c410d01cf1edd2af8d76bf19bef4f43a1e47f53b3a\": container with ID starting with c5a6b97e5a3e0d8b17a154c410d01cf1edd2af8d76bf19bef4f43a1e47f53b3a not found: ID does not exist" containerID="c5a6b97e5a3e0d8b17a154c410d01cf1edd2af8d76bf19bef4f43a1e47f53b3a" Mar 18 13:12:43 crc kubenswrapper[4975]: I0318 
13:12:43.572380 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5a6b97e5a3e0d8b17a154c410d01cf1edd2af8d76bf19bef4f43a1e47f53b3a"} err="failed to get container status \"c5a6b97e5a3e0d8b17a154c410d01cf1edd2af8d76bf19bef4f43a1e47f53b3a\": rpc error: code = NotFound desc = could not find container \"c5a6b97e5a3e0d8b17a154c410d01cf1edd2af8d76bf19bef4f43a1e47f53b3a\": container with ID starting with c5a6b97e5a3e0d8b17a154c410d01cf1edd2af8d76bf19bef4f43a1e47f53b3a not found: ID does not exist" Mar 18 13:12:43 crc kubenswrapper[4975]: I0318 13:12:43.572413 4975 scope.go:117] "RemoveContainer" containerID="07d58e19a6a4f0602c048fe4a821da9bab9721ca07d1eeac21cb61140289f906" Mar 18 13:12:43 crc kubenswrapper[4975]: E0318 13:12:43.572784 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07d58e19a6a4f0602c048fe4a821da9bab9721ca07d1eeac21cb61140289f906\": container with ID starting with 07d58e19a6a4f0602c048fe4a821da9bab9721ca07d1eeac21cb61140289f906 not found: ID does not exist" containerID="07d58e19a6a4f0602c048fe4a821da9bab9721ca07d1eeac21cb61140289f906" Mar 18 13:12:43 crc kubenswrapper[4975]: I0318 13:12:43.572813 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07d58e19a6a4f0602c048fe4a821da9bab9721ca07d1eeac21cb61140289f906"} err="failed to get container status \"07d58e19a6a4f0602c048fe4a821da9bab9721ca07d1eeac21cb61140289f906\": rpc error: code = NotFound desc = could not find container \"07d58e19a6a4f0602c048fe4a821da9bab9721ca07d1eeac21cb61140289f906\": container with ID starting with 07d58e19a6a4f0602c048fe4a821da9bab9721ca07d1eeac21cb61140289f906 not found: ID does not exist" Mar 18 13:12:43 crc kubenswrapper[4975]: I0318 13:12:43.572837 4975 scope.go:117] "RemoveContainer" containerID="4874a12b4e99222c41659bc6582d5411483e9a03c0fd62463bf7d3a8f47a7b36" Mar 18 13:12:43 crc 
kubenswrapper[4975]: E0318 13:12:43.573320 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4874a12b4e99222c41659bc6582d5411483e9a03c0fd62463bf7d3a8f47a7b36\": container with ID starting with 4874a12b4e99222c41659bc6582d5411483e9a03c0fd62463bf7d3a8f47a7b36 not found: ID does not exist" containerID="4874a12b4e99222c41659bc6582d5411483e9a03c0fd62463bf7d3a8f47a7b36" Mar 18 13:12:43 crc kubenswrapper[4975]: I0318 13:12:43.573348 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4874a12b4e99222c41659bc6582d5411483e9a03c0fd62463bf7d3a8f47a7b36"} err="failed to get container status \"4874a12b4e99222c41659bc6582d5411483e9a03c0fd62463bf7d3a8f47a7b36\": rpc error: code = NotFound desc = could not find container \"4874a12b4e99222c41659bc6582d5411483e9a03c0fd62463bf7d3a8f47a7b36\": container with ID starting with 4874a12b4e99222c41659bc6582d5411483e9a03c0fd62463bf7d3a8f47a7b36 not found: ID does not exist" Mar 18 13:12:44 crc kubenswrapper[4975]: I0318 13:12:44.016953 4975 scope.go:117] "RemoveContainer" containerID="d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3" Mar 18 13:12:44 crc kubenswrapper[4975]: E0318 13:12:44.017414 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:12:45 crc kubenswrapper[4975]: I0318 13:12:45.032859 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="293fea73-46a5-40aa-9e09-85abd80ef683" path="/var/lib/kubelet/pods/293fea73-46a5-40aa-9e09-85abd80ef683/volumes" Mar 18 13:12:49 crc 
kubenswrapper[4975]: I0318 13:12:49.047048 4975 scope.go:117] "RemoveContainer" containerID="24dc7b1ce1e1d1f71f6b730a952d2ee589b8a774d2df3ea9f292924cacd27abb" Mar 18 13:12:59 crc kubenswrapper[4975]: I0318 13:12:59.016972 4975 scope.go:117] "RemoveContainer" containerID="d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3" Mar 18 13:12:59 crc kubenswrapper[4975]: E0318 13:12:59.017685 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:13:13 crc kubenswrapper[4975]: I0318 13:13:13.016663 4975 scope.go:117] "RemoveContainer" containerID="d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3" Mar 18 13:13:13 crc kubenswrapper[4975]: E0318 13:13:13.017327 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:13:27 crc kubenswrapper[4975]: I0318 13:13:27.016729 4975 scope.go:117] "RemoveContainer" containerID="d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3" Mar 18 13:13:27 crc kubenswrapper[4975]: E0318 13:13:27.017685 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:13:33 crc kubenswrapper[4975]: I0318 13:13:33.356146 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h4c76"] Mar 18 13:13:33 crc kubenswrapper[4975]: E0318 13:13:33.357061 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="293fea73-46a5-40aa-9e09-85abd80ef683" containerName="extract-content" Mar 18 13:13:33 crc kubenswrapper[4975]: I0318 13:13:33.357076 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="293fea73-46a5-40aa-9e09-85abd80ef683" containerName="extract-content" Mar 18 13:13:33 crc kubenswrapper[4975]: E0318 13:13:33.357093 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="293fea73-46a5-40aa-9e09-85abd80ef683" containerName="registry-server" Mar 18 13:13:33 crc kubenswrapper[4975]: I0318 13:13:33.357099 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="293fea73-46a5-40aa-9e09-85abd80ef683" containerName="registry-server" Mar 18 13:13:33 crc kubenswrapper[4975]: E0318 13:13:33.357109 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="293fea73-46a5-40aa-9e09-85abd80ef683" containerName="extract-utilities" Mar 18 13:13:33 crc kubenswrapper[4975]: I0318 13:13:33.357116 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="293fea73-46a5-40aa-9e09-85abd80ef683" containerName="extract-utilities" Mar 18 13:13:33 crc kubenswrapper[4975]: I0318 13:13:33.357300 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="293fea73-46a5-40aa-9e09-85abd80ef683" containerName="registry-server" Mar 18 13:13:33 crc kubenswrapper[4975]: I0318 13:13:33.358574 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h4c76" Mar 18 13:13:33 crc kubenswrapper[4975]: I0318 13:13:33.375233 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h4c76"] Mar 18 13:13:33 crc kubenswrapper[4975]: I0318 13:13:33.433973 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d04f088-05af-4368-be5d-e56750040800-catalog-content\") pod \"redhat-operators-h4c76\" (UID: \"7d04f088-05af-4368-be5d-e56750040800\") " pod="openshift-marketplace/redhat-operators-h4c76" Mar 18 13:13:33 crc kubenswrapper[4975]: I0318 13:13:33.434370 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d04f088-05af-4368-be5d-e56750040800-utilities\") pod \"redhat-operators-h4c76\" (UID: \"7d04f088-05af-4368-be5d-e56750040800\") " pod="openshift-marketplace/redhat-operators-h4c76" Mar 18 13:13:33 crc kubenswrapper[4975]: I0318 13:13:33.434435 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6m9s\" (UniqueName: \"kubernetes.io/projected/7d04f088-05af-4368-be5d-e56750040800-kube-api-access-g6m9s\") pod \"redhat-operators-h4c76\" (UID: \"7d04f088-05af-4368-be5d-e56750040800\") " pod="openshift-marketplace/redhat-operators-h4c76" Mar 18 13:13:33 crc kubenswrapper[4975]: I0318 13:13:33.535889 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d04f088-05af-4368-be5d-e56750040800-catalog-content\") pod \"redhat-operators-h4c76\" (UID: \"7d04f088-05af-4368-be5d-e56750040800\") " pod="openshift-marketplace/redhat-operators-h4c76" Mar 18 13:13:33 crc kubenswrapper[4975]: I0318 13:13:33.535988 4975 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d04f088-05af-4368-be5d-e56750040800-utilities\") pod \"redhat-operators-h4c76\" (UID: \"7d04f088-05af-4368-be5d-e56750040800\") " pod="openshift-marketplace/redhat-operators-h4c76" Mar 18 13:13:33 crc kubenswrapper[4975]: I0318 13:13:33.536015 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6m9s\" (UniqueName: \"kubernetes.io/projected/7d04f088-05af-4368-be5d-e56750040800-kube-api-access-g6m9s\") pod \"redhat-operators-h4c76\" (UID: \"7d04f088-05af-4368-be5d-e56750040800\") " pod="openshift-marketplace/redhat-operators-h4c76" Mar 18 13:13:33 crc kubenswrapper[4975]: I0318 13:13:33.536544 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d04f088-05af-4368-be5d-e56750040800-catalog-content\") pod \"redhat-operators-h4c76\" (UID: \"7d04f088-05af-4368-be5d-e56750040800\") " pod="openshift-marketplace/redhat-operators-h4c76" Mar 18 13:13:33 crc kubenswrapper[4975]: I0318 13:13:33.536682 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d04f088-05af-4368-be5d-e56750040800-utilities\") pod \"redhat-operators-h4c76\" (UID: \"7d04f088-05af-4368-be5d-e56750040800\") " pod="openshift-marketplace/redhat-operators-h4c76" Mar 18 13:13:33 crc kubenswrapper[4975]: I0318 13:13:33.559462 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6m9s\" (UniqueName: \"kubernetes.io/projected/7d04f088-05af-4368-be5d-e56750040800-kube-api-access-g6m9s\") pod \"redhat-operators-h4c76\" (UID: \"7d04f088-05af-4368-be5d-e56750040800\") " pod="openshift-marketplace/redhat-operators-h4c76" Mar 18 13:13:33 crc kubenswrapper[4975]: I0318 13:13:33.682434 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h4c76" Mar 18 13:13:34 crc kubenswrapper[4975]: I0318 13:13:34.163422 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h4c76"] Mar 18 13:13:34 crc kubenswrapper[4975]: W0318 13:13:34.169335 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d04f088_05af_4368_be5d_e56750040800.slice/crio-88966a925b344c23ae545cde17f61574cc8f29982dd875638aa0797d07f90c0f WatchSource:0}: Error finding container 88966a925b344c23ae545cde17f61574cc8f29982dd875638aa0797d07f90c0f: Status 404 returned error can't find the container with id 88966a925b344c23ae545cde17f61574cc8f29982dd875638aa0797d07f90c0f Mar 18 13:13:34 crc kubenswrapper[4975]: I0318 13:13:34.260013 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4c76" event={"ID":"7d04f088-05af-4368-be5d-e56750040800","Type":"ContainerStarted","Data":"88966a925b344c23ae545cde17f61574cc8f29982dd875638aa0797d07f90c0f"} Mar 18 13:13:35 crc kubenswrapper[4975]: I0318 13:13:35.272377 4975 generic.go:334] "Generic (PLEG): container finished" podID="7d04f088-05af-4368-be5d-e56750040800" containerID="623f160404623a51e3fcdef7b3979161e47ec07a64d9a2e5c9263db105219bb6" exitCode=0 Mar 18 13:13:35 crc kubenswrapper[4975]: I0318 13:13:35.273966 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4c76" event={"ID":"7d04f088-05af-4368-be5d-e56750040800","Type":"ContainerDied","Data":"623f160404623a51e3fcdef7b3979161e47ec07a64d9a2e5c9263db105219bb6"} Mar 18 13:13:35 crc kubenswrapper[4975]: I0318 13:13:35.276469 4975 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:13:36 crc kubenswrapper[4975]: I0318 13:13:36.283907 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-h4c76" event={"ID":"7d04f088-05af-4368-be5d-e56750040800","Type":"ContainerStarted","Data":"eafa01c74fbcc4e1f9d5ed9ea8b39e9bb6ff9b927415f84b8ec264832cdaf7de"} Mar 18 13:13:37 crc kubenswrapper[4975]: I0318 13:13:37.302357 4975 generic.go:334] "Generic (PLEG): container finished" podID="7d04f088-05af-4368-be5d-e56750040800" containerID="eafa01c74fbcc4e1f9d5ed9ea8b39e9bb6ff9b927415f84b8ec264832cdaf7de" exitCode=0 Mar 18 13:13:37 crc kubenswrapper[4975]: I0318 13:13:37.302406 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4c76" event={"ID":"7d04f088-05af-4368-be5d-e56750040800","Type":"ContainerDied","Data":"eafa01c74fbcc4e1f9d5ed9ea8b39e9bb6ff9b927415f84b8ec264832cdaf7de"} Mar 18 13:13:38 crc kubenswrapper[4975]: I0318 13:13:38.317638 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4c76" event={"ID":"7d04f088-05af-4368-be5d-e56750040800","Type":"ContainerStarted","Data":"9996ac57cfe2f829193f23dbcabb0cd034eaa2e8b6ed0e75a13bf65220b9777c"} Mar 18 13:13:38 crc kubenswrapper[4975]: I0318 13:13:38.343332 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h4c76" podStartSLOduration=2.8963562 podStartE2EDuration="5.343287724s" podCreationTimestamp="2026-03-18 13:13:33 +0000 UTC" firstStartedPulling="2026-03-18 13:13:35.276215578 +0000 UTC m=+3800.990616157" lastFinishedPulling="2026-03-18 13:13:37.723147102 +0000 UTC m=+3803.437547681" observedRunningTime="2026-03-18 13:13:38.334798263 +0000 UTC m=+3804.049198892" watchObservedRunningTime="2026-03-18 13:13:38.343287724 +0000 UTC m=+3804.057688303" Mar 18 13:13:39 crc kubenswrapper[4975]: I0318 13:13:39.016746 4975 scope.go:117] "RemoveContainer" containerID="d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3" Mar 18 13:13:39 crc kubenswrapper[4975]: E0318 13:13:39.017142 4975 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:13:43 crc kubenswrapper[4975]: I0318 13:13:43.683337 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h4c76" Mar 18 13:13:43 crc kubenswrapper[4975]: I0318 13:13:43.683784 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h4c76" Mar 18 13:13:44 crc kubenswrapper[4975]: I0318 13:13:44.736091 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h4c76" podUID="7d04f088-05af-4368-be5d-e56750040800" containerName="registry-server" probeResult="failure" output=< Mar 18 13:13:44 crc kubenswrapper[4975]: timeout: failed to connect service ":50051" within 1s Mar 18 13:13:44 crc kubenswrapper[4975]: > Mar 18 13:13:48 crc kubenswrapper[4975]: I0318 13:13:48.035922 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm"] Mar 18 13:13:48 crc kubenswrapper[4975]: I0318 13:13:48.039537 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm" Mar 18 13:13:48 crc kubenswrapper[4975]: I0318 13:13:48.042395 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 18 13:13:48 crc kubenswrapper[4975]: I0318 13:13:48.042524 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 18 13:13:48 crc kubenswrapper[4975]: I0318 13:13:48.042568 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 18 13:13:48 crc kubenswrapper[4975]: I0318 13:13:48.042654 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-77rz6" Mar 18 13:13:48 crc kubenswrapper[4975]: I0318 13:13:48.044144 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 13:13:48 crc kubenswrapper[4975]: I0318 13:13:48.046236 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm"] Mar 18 13:13:48 crc kubenswrapper[4975]: I0318 13:13:48.231189 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm\" (UID: \"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm" Mar 18 13:13:48 crc kubenswrapper[4975]: I0318 13:13:48.231296 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn2bt\" (UniqueName: \"kubernetes.io/projected/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-kube-api-access-rn2bt\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm\" (UID: 
\"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm" Mar 18 13:13:48 crc kubenswrapper[4975]: I0318 13:13:48.231327 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm\" (UID: \"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm" Mar 18 13:13:48 crc kubenswrapper[4975]: I0318 13:13:48.231410 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm\" (UID: \"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm" Mar 18 13:13:48 crc kubenswrapper[4975]: I0318 13:13:48.231471 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm\" (UID: \"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm" Mar 18 13:13:48 crc kubenswrapper[4975]: I0318 13:13:48.333142 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm\" (UID: \"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm" Mar 18 13:13:48 crc kubenswrapper[4975]: I0318 13:13:48.333223 4975 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn2bt\" (UniqueName: \"kubernetes.io/projected/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-kube-api-access-rn2bt\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm\" (UID: \"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm" Mar 18 13:13:48 crc kubenswrapper[4975]: I0318 13:13:48.333252 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm\" (UID: \"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm" Mar 18 13:13:48 crc kubenswrapper[4975]: I0318 13:13:48.333289 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm\" (UID: \"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm" Mar 18 13:13:48 crc kubenswrapper[4975]: I0318 13:13:48.333325 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm\" (UID: \"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm" Mar 18 13:13:48 crc kubenswrapper[4975]: I0318 13:13:48.339091 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm\" (UID: 
\"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm" Mar 18 13:13:48 crc kubenswrapper[4975]: I0318 13:13:48.339324 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm\" (UID: \"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm" Mar 18 13:13:48 crc kubenswrapper[4975]: I0318 13:13:48.341691 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm\" (UID: \"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm" Mar 18 13:13:48 crc kubenswrapper[4975]: I0318 13:13:48.347605 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm\" (UID: \"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm" Mar 18 13:13:48 crc kubenswrapper[4975]: I0318 13:13:48.350646 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn2bt\" (UniqueName: \"kubernetes.io/projected/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-kube-api-access-rn2bt\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm\" (UID: \"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm" Mar 18 13:13:48 crc kubenswrapper[4975]: I0318 13:13:48.367553 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm" Mar 18 13:13:48 crc kubenswrapper[4975]: I0318 13:13:48.892359 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm"] Mar 18 13:13:49 crc kubenswrapper[4975]: I0318 13:13:49.426022 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm" event={"ID":"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492","Type":"ContainerStarted","Data":"74a241acb9d453851c684e3b4c5e5d2492ed13cc4b502f58151581d4a3c4f374"} Mar 18 13:13:49 crc kubenswrapper[4975]: I0318 13:13:49.427673 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm" event={"ID":"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492","Type":"ContainerStarted","Data":"e85b998a0e79b474746e0e64f50f75208e5800b07245a3cbcb7d78ccd58e3e37"} Mar 18 13:13:49 crc kubenswrapper[4975]: I0318 13:13:49.451727 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm" podStartSLOduration=1.264961228 podStartE2EDuration="1.451696442s" podCreationTimestamp="2026-03-18 13:13:48 +0000 UTC" firstStartedPulling="2026-03-18 13:13:48.894516914 +0000 UTC m=+3814.608917493" lastFinishedPulling="2026-03-18 13:13:49.081252128 +0000 UTC m=+3814.795652707" observedRunningTime="2026-03-18 13:13:49.444887397 +0000 UTC m=+3815.159287986" watchObservedRunningTime="2026-03-18 13:13:49.451696442 +0000 UTC m=+3815.166097011" Mar 18 13:13:53 crc kubenswrapper[4975]: I0318 13:13:53.016827 4975 scope.go:117] "RemoveContainer" containerID="d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3" Mar 18 13:13:53 crc kubenswrapper[4975]: E0318 13:13:53.017728 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:13:53 crc kubenswrapper[4975]: I0318 13:13:53.744249 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h4c76" Mar 18 13:13:53 crc kubenswrapper[4975]: I0318 13:13:53.793718 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h4c76" Mar 18 13:13:53 crc kubenswrapper[4975]: I0318 13:13:53.982998 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h4c76"] Mar 18 13:13:55 crc kubenswrapper[4975]: I0318 13:13:55.474609 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h4c76" podUID="7d04f088-05af-4368-be5d-e56750040800" containerName="registry-server" containerID="cri-o://9996ac57cfe2f829193f23dbcabb0cd034eaa2e8b6ed0e75a13bf65220b9777c" gracePeriod=2 Mar 18 13:13:55 crc kubenswrapper[4975]: I0318 13:13:55.889538 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h4c76" Mar 18 13:13:56 crc kubenswrapper[4975]: I0318 13:13:56.074648 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d04f088-05af-4368-be5d-e56750040800-utilities\") pod \"7d04f088-05af-4368-be5d-e56750040800\" (UID: \"7d04f088-05af-4368-be5d-e56750040800\") " Mar 18 13:13:56 crc kubenswrapper[4975]: I0318 13:13:56.074959 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d04f088-05af-4368-be5d-e56750040800-catalog-content\") pod \"7d04f088-05af-4368-be5d-e56750040800\" (UID: \"7d04f088-05af-4368-be5d-e56750040800\") " Mar 18 13:13:56 crc kubenswrapper[4975]: I0318 13:13:56.075535 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6m9s\" (UniqueName: \"kubernetes.io/projected/7d04f088-05af-4368-be5d-e56750040800-kube-api-access-g6m9s\") pod \"7d04f088-05af-4368-be5d-e56750040800\" (UID: \"7d04f088-05af-4368-be5d-e56750040800\") " Mar 18 13:13:56 crc kubenswrapper[4975]: I0318 13:13:56.075655 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d04f088-05af-4368-be5d-e56750040800-utilities" (OuterVolumeSpecName: "utilities") pod "7d04f088-05af-4368-be5d-e56750040800" (UID: "7d04f088-05af-4368-be5d-e56750040800"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:13:56 crc kubenswrapper[4975]: I0318 13:13:56.076438 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d04f088-05af-4368-be5d-e56750040800-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:13:56 crc kubenswrapper[4975]: I0318 13:13:56.082002 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d04f088-05af-4368-be5d-e56750040800-kube-api-access-g6m9s" (OuterVolumeSpecName: "kube-api-access-g6m9s") pod "7d04f088-05af-4368-be5d-e56750040800" (UID: "7d04f088-05af-4368-be5d-e56750040800"). InnerVolumeSpecName "kube-api-access-g6m9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:13:56 crc kubenswrapper[4975]: I0318 13:13:56.179107 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6m9s\" (UniqueName: \"kubernetes.io/projected/7d04f088-05af-4368-be5d-e56750040800-kube-api-access-g6m9s\") on node \"crc\" DevicePath \"\"" Mar 18 13:13:56 crc kubenswrapper[4975]: I0318 13:13:56.225452 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d04f088-05af-4368-be5d-e56750040800-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d04f088-05af-4368-be5d-e56750040800" (UID: "7d04f088-05af-4368-be5d-e56750040800"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:13:56 crc kubenswrapper[4975]: I0318 13:13:56.280421 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d04f088-05af-4368-be5d-e56750040800-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:13:56 crc kubenswrapper[4975]: I0318 13:13:56.497802 4975 generic.go:334] "Generic (PLEG): container finished" podID="7d04f088-05af-4368-be5d-e56750040800" containerID="9996ac57cfe2f829193f23dbcabb0cd034eaa2e8b6ed0e75a13bf65220b9777c" exitCode=0 Mar 18 13:13:56 crc kubenswrapper[4975]: I0318 13:13:56.497892 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4c76" event={"ID":"7d04f088-05af-4368-be5d-e56750040800","Type":"ContainerDied","Data":"9996ac57cfe2f829193f23dbcabb0cd034eaa2e8b6ed0e75a13bf65220b9777c"} Mar 18 13:13:56 crc kubenswrapper[4975]: I0318 13:13:56.497940 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h4c76" Mar 18 13:13:56 crc kubenswrapper[4975]: I0318 13:13:56.497961 4975 scope.go:117] "RemoveContainer" containerID="9996ac57cfe2f829193f23dbcabb0cd034eaa2e8b6ed0e75a13bf65220b9777c" Mar 18 13:13:56 crc kubenswrapper[4975]: I0318 13:13:56.497944 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4c76" event={"ID":"7d04f088-05af-4368-be5d-e56750040800","Type":"ContainerDied","Data":"88966a925b344c23ae545cde17f61574cc8f29982dd875638aa0797d07f90c0f"} Mar 18 13:13:56 crc kubenswrapper[4975]: I0318 13:13:56.536296 4975 scope.go:117] "RemoveContainer" containerID="eafa01c74fbcc4e1f9d5ed9ea8b39e9bb6ff9b927415f84b8ec264832cdaf7de" Mar 18 13:13:56 crc kubenswrapper[4975]: I0318 13:13:56.539775 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h4c76"] Mar 18 13:13:56 crc kubenswrapper[4975]: I0318 13:13:56.549098 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h4c76"] Mar 18 13:13:56 crc kubenswrapper[4975]: I0318 13:13:56.578018 4975 scope.go:117] "RemoveContainer" containerID="623f160404623a51e3fcdef7b3979161e47ec07a64d9a2e5c9263db105219bb6" Mar 18 13:13:56 crc kubenswrapper[4975]: I0318 13:13:56.605336 4975 scope.go:117] "RemoveContainer" containerID="9996ac57cfe2f829193f23dbcabb0cd034eaa2e8b6ed0e75a13bf65220b9777c" Mar 18 13:13:56 crc kubenswrapper[4975]: E0318 13:13:56.605784 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9996ac57cfe2f829193f23dbcabb0cd034eaa2e8b6ed0e75a13bf65220b9777c\": container with ID starting with 9996ac57cfe2f829193f23dbcabb0cd034eaa2e8b6ed0e75a13bf65220b9777c not found: ID does not exist" containerID="9996ac57cfe2f829193f23dbcabb0cd034eaa2e8b6ed0e75a13bf65220b9777c" Mar 18 13:13:56 crc kubenswrapper[4975]: I0318 13:13:56.605832 4975 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9996ac57cfe2f829193f23dbcabb0cd034eaa2e8b6ed0e75a13bf65220b9777c"} err="failed to get container status \"9996ac57cfe2f829193f23dbcabb0cd034eaa2e8b6ed0e75a13bf65220b9777c\": rpc error: code = NotFound desc = could not find container \"9996ac57cfe2f829193f23dbcabb0cd034eaa2e8b6ed0e75a13bf65220b9777c\": container with ID starting with 9996ac57cfe2f829193f23dbcabb0cd034eaa2e8b6ed0e75a13bf65220b9777c not found: ID does not exist" Mar 18 13:13:56 crc kubenswrapper[4975]: I0318 13:13:56.605857 4975 scope.go:117] "RemoveContainer" containerID="eafa01c74fbcc4e1f9d5ed9ea8b39e9bb6ff9b927415f84b8ec264832cdaf7de" Mar 18 13:13:56 crc kubenswrapper[4975]: E0318 13:13:56.606470 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eafa01c74fbcc4e1f9d5ed9ea8b39e9bb6ff9b927415f84b8ec264832cdaf7de\": container with ID starting with eafa01c74fbcc4e1f9d5ed9ea8b39e9bb6ff9b927415f84b8ec264832cdaf7de not found: ID does not exist" containerID="eafa01c74fbcc4e1f9d5ed9ea8b39e9bb6ff9b927415f84b8ec264832cdaf7de" Mar 18 13:13:56 crc kubenswrapper[4975]: I0318 13:13:56.606500 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eafa01c74fbcc4e1f9d5ed9ea8b39e9bb6ff9b927415f84b8ec264832cdaf7de"} err="failed to get container status \"eafa01c74fbcc4e1f9d5ed9ea8b39e9bb6ff9b927415f84b8ec264832cdaf7de\": rpc error: code = NotFound desc = could not find container \"eafa01c74fbcc4e1f9d5ed9ea8b39e9bb6ff9b927415f84b8ec264832cdaf7de\": container with ID starting with eafa01c74fbcc4e1f9d5ed9ea8b39e9bb6ff9b927415f84b8ec264832cdaf7de not found: ID does not exist" Mar 18 13:13:56 crc kubenswrapper[4975]: I0318 13:13:56.606518 4975 scope.go:117] "RemoveContainer" containerID="623f160404623a51e3fcdef7b3979161e47ec07a64d9a2e5c9263db105219bb6" Mar 18 13:13:56 crc kubenswrapper[4975]: E0318 
13:13:56.606757 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"623f160404623a51e3fcdef7b3979161e47ec07a64d9a2e5c9263db105219bb6\": container with ID starting with 623f160404623a51e3fcdef7b3979161e47ec07a64d9a2e5c9263db105219bb6 not found: ID does not exist" containerID="623f160404623a51e3fcdef7b3979161e47ec07a64d9a2e5c9263db105219bb6" Mar 18 13:13:56 crc kubenswrapper[4975]: I0318 13:13:56.606784 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"623f160404623a51e3fcdef7b3979161e47ec07a64d9a2e5c9263db105219bb6"} err="failed to get container status \"623f160404623a51e3fcdef7b3979161e47ec07a64d9a2e5c9263db105219bb6\": rpc error: code = NotFound desc = could not find container \"623f160404623a51e3fcdef7b3979161e47ec07a64d9a2e5c9263db105219bb6\": container with ID starting with 623f160404623a51e3fcdef7b3979161e47ec07a64d9a2e5c9263db105219bb6 not found: ID does not exist" Mar 18 13:13:57 crc kubenswrapper[4975]: I0318 13:13:57.034405 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d04f088-05af-4368-be5d-e56750040800" path="/var/lib/kubelet/pods/7d04f088-05af-4368-be5d-e56750040800/volumes" Mar 18 13:14:00 crc kubenswrapper[4975]: I0318 13:14:00.167546 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563994-p6gb2"] Mar 18 13:14:00 crc kubenswrapper[4975]: E0318 13:14:00.169975 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d04f088-05af-4368-be5d-e56750040800" containerName="extract-utilities" Mar 18 13:14:00 crc kubenswrapper[4975]: I0318 13:14:00.170151 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d04f088-05af-4368-be5d-e56750040800" containerName="extract-utilities" Mar 18 13:14:00 crc kubenswrapper[4975]: E0318 13:14:00.170408 4975 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7d04f088-05af-4368-be5d-e56750040800" containerName="extract-content" Mar 18 13:14:00 crc kubenswrapper[4975]: I0318 13:14:00.170541 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d04f088-05af-4368-be5d-e56750040800" containerName="extract-content" Mar 18 13:14:00 crc kubenswrapper[4975]: E0318 13:14:00.170715 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d04f088-05af-4368-be5d-e56750040800" containerName="registry-server" Mar 18 13:14:00 crc kubenswrapper[4975]: I0318 13:14:00.170833 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d04f088-05af-4368-be5d-e56750040800" containerName="registry-server" Mar 18 13:14:00 crc kubenswrapper[4975]: I0318 13:14:00.171337 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d04f088-05af-4368-be5d-e56750040800" containerName="registry-server" Mar 18 13:14:00 crc kubenswrapper[4975]: I0318 13:14:00.172531 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563994-p6gb2" Mar 18 13:14:00 crc kubenswrapper[4975]: I0318 13:14:00.175474 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:14:00 crc kubenswrapper[4975]: I0318 13:14:00.176792 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:14:00 crc kubenswrapper[4975]: I0318 13:14:00.176814 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 13:14:00 crc kubenswrapper[4975]: I0318 13:14:00.187631 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563994-p6gb2"] Mar 18 13:14:00 crc kubenswrapper[4975]: I0318 13:14:00.359556 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k648\" (UniqueName: 
\"kubernetes.io/projected/ace5d76e-1744-4242-99bc-fe37ade2daf0-kube-api-access-7k648\") pod \"auto-csr-approver-29563994-p6gb2\" (UID: \"ace5d76e-1744-4242-99bc-fe37ade2daf0\") " pod="openshift-infra/auto-csr-approver-29563994-p6gb2" Mar 18 13:14:00 crc kubenswrapper[4975]: I0318 13:14:00.461368 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k648\" (UniqueName: \"kubernetes.io/projected/ace5d76e-1744-4242-99bc-fe37ade2daf0-kube-api-access-7k648\") pod \"auto-csr-approver-29563994-p6gb2\" (UID: \"ace5d76e-1744-4242-99bc-fe37ade2daf0\") " pod="openshift-infra/auto-csr-approver-29563994-p6gb2" Mar 18 13:14:00 crc kubenswrapper[4975]: I0318 13:14:00.483840 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k648\" (UniqueName: \"kubernetes.io/projected/ace5d76e-1744-4242-99bc-fe37ade2daf0-kube-api-access-7k648\") pod \"auto-csr-approver-29563994-p6gb2\" (UID: \"ace5d76e-1744-4242-99bc-fe37ade2daf0\") " pod="openshift-infra/auto-csr-approver-29563994-p6gb2" Mar 18 13:14:00 crc kubenswrapper[4975]: I0318 13:14:00.499293 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563994-p6gb2" Mar 18 13:14:00 crc kubenswrapper[4975]: I0318 13:14:00.935466 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563994-p6gb2"] Mar 18 13:14:01 crc kubenswrapper[4975]: I0318 13:14:01.549244 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563994-p6gb2" event={"ID":"ace5d76e-1744-4242-99bc-fe37ade2daf0","Type":"ContainerStarted","Data":"4379ae570aa82cde36495eeb630faa3d8bb60b429e9ed856742724c830a88973"} Mar 18 13:14:03 crc kubenswrapper[4975]: I0318 13:14:03.569687 4975 generic.go:334] "Generic (PLEG): container finished" podID="ace5d76e-1744-4242-99bc-fe37ade2daf0" containerID="f87d0433c2645e208caafd4830582c83e1c2c537c13edb98378b6fd0a9abe615" exitCode=0 Mar 18 13:14:03 crc kubenswrapper[4975]: I0318 13:14:03.569771 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563994-p6gb2" event={"ID":"ace5d76e-1744-4242-99bc-fe37ade2daf0","Type":"ContainerDied","Data":"f87d0433c2645e208caafd4830582c83e1c2c537c13edb98378b6fd0a9abe615"} Mar 18 13:14:04 crc kubenswrapper[4975]: I0318 13:14:04.911615 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563994-p6gb2" Mar 18 13:14:04 crc kubenswrapper[4975]: I0318 13:14:04.973086 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k648\" (UniqueName: \"kubernetes.io/projected/ace5d76e-1744-4242-99bc-fe37ade2daf0-kube-api-access-7k648\") pod \"ace5d76e-1744-4242-99bc-fe37ade2daf0\" (UID: \"ace5d76e-1744-4242-99bc-fe37ade2daf0\") " Mar 18 13:14:04 crc kubenswrapper[4975]: I0318 13:14:04.978755 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ace5d76e-1744-4242-99bc-fe37ade2daf0-kube-api-access-7k648" (OuterVolumeSpecName: "kube-api-access-7k648") pod "ace5d76e-1744-4242-99bc-fe37ade2daf0" (UID: "ace5d76e-1744-4242-99bc-fe37ade2daf0"). InnerVolumeSpecName "kube-api-access-7k648". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:14:05 crc kubenswrapper[4975]: I0318 13:14:05.075975 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k648\" (UniqueName: \"kubernetes.io/projected/ace5d76e-1744-4242-99bc-fe37ade2daf0-kube-api-access-7k648\") on node \"crc\" DevicePath \"\"" Mar 18 13:14:05 crc kubenswrapper[4975]: I0318 13:14:05.591793 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563994-p6gb2" event={"ID":"ace5d76e-1744-4242-99bc-fe37ade2daf0","Type":"ContainerDied","Data":"4379ae570aa82cde36495eeb630faa3d8bb60b429e9ed856742724c830a88973"} Mar 18 13:14:05 crc kubenswrapper[4975]: I0318 13:14:05.592220 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4379ae570aa82cde36495eeb630faa3d8bb60b429e9ed856742724c830a88973" Mar 18 13:14:05 crc kubenswrapper[4975]: I0318 13:14:05.591901 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563994-p6gb2" Mar 18 13:14:05 crc kubenswrapper[4975]: I0318 13:14:05.984773 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563988-bx2df"] Mar 18 13:14:05 crc kubenswrapper[4975]: I0318 13:14:05.992566 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563988-bx2df"] Mar 18 13:14:07 crc kubenswrapper[4975]: I0318 13:14:07.028507 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="710e8a6f-4426-4ac3-af0b-0e30524a784e" path="/var/lib/kubelet/pods/710e8a6f-4426-4ac3-af0b-0e30524a784e/volumes" Mar 18 13:14:08 crc kubenswrapper[4975]: I0318 13:14:08.017485 4975 scope.go:117] "RemoveContainer" containerID="d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3" Mar 18 13:14:08 crc kubenswrapper[4975]: E0318 13:14:08.018275 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:14:23 crc kubenswrapper[4975]: I0318 13:14:23.017063 4975 scope.go:117] "RemoveContainer" containerID="d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3" Mar 18 13:14:23 crc kubenswrapper[4975]: E0318 13:14:23.017930 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" 
podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:14:35 crc kubenswrapper[4975]: I0318 13:14:35.025602 4975 scope.go:117] "RemoveContainer" containerID="d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3" Mar 18 13:14:35 crc kubenswrapper[4975]: E0318 13:14:35.026557 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:14:46 crc kubenswrapper[4975]: I0318 13:14:46.016622 4975 scope.go:117] "RemoveContainer" containerID="d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3" Mar 18 13:14:46 crc kubenswrapper[4975]: E0318 13:14:46.017354 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:14:49 crc kubenswrapper[4975]: I0318 13:14:49.149233 4975 scope.go:117] "RemoveContainer" containerID="b4a6697f7cc9c1a19389026a8ce8237bddc7e97106c397e7c423ad63a2f78205" Mar 18 13:15:00 crc kubenswrapper[4975]: I0318 13:15:00.159947 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563995-tf42l"] Mar 18 13:15:00 crc kubenswrapper[4975]: E0318 13:15:00.161092 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace5d76e-1744-4242-99bc-fe37ade2daf0" containerName="oc" Mar 18 13:15:00 crc kubenswrapper[4975]: I0318 
13:15:00.161112 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace5d76e-1744-4242-99bc-fe37ade2daf0" containerName="oc" Mar 18 13:15:00 crc kubenswrapper[4975]: I0318 13:15:00.161386 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="ace5d76e-1744-4242-99bc-fe37ade2daf0" containerName="oc" Mar 18 13:15:00 crc kubenswrapper[4975]: I0318 13:15:00.162212 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-tf42l" Mar 18 13:15:00 crc kubenswrapper[4975]: I0318 13:15:00.165265 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 13:15:00 crc kubenswrapper[4975]: I0318 13:15:00.165518 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 13:15:00 crc kubenswrapper[4975]: I0318 13:15:00.170231 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563995-tf42l"] Mar 18 13:15:00 crc kubenswrapper[4975]: I0318 13:15:00.239004 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/44b63712-d649-4905-ba44-0cd603fcb714-secret-volume\") pod \"collect-profiles-29563995-tf42l\" (UID: \"44b63712-d649-4905-ba44-0cd603fcb714\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-tf42l" Mar 18 13:15:00 crc kubenswrapper[4975]: I0318 13:15:00.239075 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44b63712-d649-4905-ba44-0cd603fcb714-config-volume\") pod \"collect-profiles-29563995-tf42l\" (UID: \"44b63712-d649-4905-ba44-0cd603fcb714\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-tf42l" Mar 18 13:15:00 crc kubenswrapper[4975]: I0318 13:15:00.239108 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wwlw\" (UniqueName: \"kubernetes.io/projected/44b63712-d649-4905-ba44-0cd603fcb714-kube-api-access-2wwlw\") pod \"collect-profiles-29563995-tf42l\" (UID: \"44b63712-d649-4905-ba44-0cd603fcb714\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-tf42l" Mar 18 13:15:00 crc kubenswrapper[4975]: I0318 13:15:00.340425 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wwlw\" (UniqueName: \"kubernetes.io/projected/44b63712-d649-4905-ba44-0cd603fcb714-kube-api-access-2wwlw\") pod \"collect-profiles-29563995-tf42l\" (UID: \"44b63712-d649-4905-ba44-0cd603fcb714\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-tf42l" Mar 18 13:15:00 crc kubenswrapper[4975]: I0318 13:15:00.340862 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/44b63712-d649-4905-ba44-0cd603fcb714-secret-volume\") pod \"collect-profiles-29563995-tf42l\" (UID: \"44b63712-d649-4905-ba44-0cd603fcb714\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-tf42l" Mar 18 13:15:00 crc kubenswrapper[4975]: I0318 13:15:00.340912 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44b63712-d649-4905-ba44-0cd603fcb714-config-volume\") pod \"collect-profiles-29563995-tf42l\" (UID: \"44b63712-d649-4905-ba44-0cd603fcb714\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-tf42l" Mar 18 13:15:00 crc kubenswrapper[4975]: I0318 13:15:00.342022 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/44b63712-d649-4905-ba44-0cd603fcb714-config-volume\") pod \"collect-profiles-29563995-tf42l\" (UID: \"44b63712-d649-4905-ba44-0cd603fcb714\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-tf42l" Mar 18 13:15:00 crc kubenswrapper[4975]: I0318 13:15:00.348191 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/44b63712-d649-4905-ba44-0cd603fcb714-secret-volume\") pod \"collect-profiles-29563995-tf42l\" (UID: \"44b63712-d649-4905-ba44-0cd603fcb714\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-tf42l" Mar 18 13:15:00 crc kubenswrapper[4975]: I0318 13:15:00.360937 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wwlw\" (UniqueName: \"kubernetes.io/projected/44b63712-d649-4905-ba44-0cd603fcb714-kube-api-access-2wwlw\") pod \"collect-profiles-29563995-tf42l\" (UID: \"44b63712-d649-4905-ba44-0cd603fcb714\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-tf42l" Mar 18 13:15:00 crc kubenswrapper[4975]: I0318 13:15:00.493113 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-tf42l" Mar 18 13:15:00 crc kubenswrapper[4975]: I0318 13:15:00.922839 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563995-tf42l"] Mar 18 13:15:01 crc kubenswrapper[4975]: I0318 13:15:01.018204 4975 scope.go:117] "RemoveContainer" containerID="d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3" Mar 18 13:15:01 crc kubenswrapper[4975]: E0318 13:15:01.021357 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:15:01 crc kubenswrapper[4975]: I0318 13:15:01.075301 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-tf42l" event={"ID":"44b63712-d649-4905-ba44-0cd603fcb714","Type":"ContainerStarted","Data":"0675c01599a1f6aac2f59e2908c239594e41e54975404cf855f7070e262d1a2a"} Mar 18 13:15:02 crc kubenswrapper[4975]: I0318 13:15:02.085637 4975 generic.go:334] "Generic (PLEG): container finished" podID="44b63712-d649-4905-ba44-0cd603fcb714" containerID="15089df9a1c7a3c1d906066e54429d98b9e7e07f38d870c554d2f877dce47a30" exitCode=0 Mar 18 13:15:02 crc kubenswrapper[4975]: I0318 13:15:02.085694 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-tf42l" event={"ID":"44b63712-d649-4905-ba44-0cd603fcb714","Type":"ContainerDied","Data":"15089df9a1c7a3c1d906066e54429d98b9e7e07f38d870c554d2f877dce47a30"} Mar 18 13:15:03 crc kubenswrapper[4975]: I0318 13:15:03.442353 4975 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-tf42l" Mar 18 13:15:03 crc kubenswrapper[4975]: I0318 13:15:03.598073 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wwlw\" (UniqueName: \"kubernetes.io/projected/44b63712-d649-4905-ba44-0cd603fcb714-kube-api-access-2wwlw\") pod \"44b63712-d649-4905-ba44-0cd603fcb714\" (UID: \"44b63712-d649-4905-ba44-0cd603fcb714\") " Mar 18 13:15:03 crc kubenswrapper[4975]: I0318 13:15:03.598131 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/44b63712-d649-4905-ba44-0cd603fcb714-secret-volume\") pod \"44b63712-d649-4905-ba44-0cd603fcb714\" (UID: \"44b63712-d649-4905-ba44-0cd603fcb714\") " Mar 18 13:15:03 crc kubenswrapper[4975]: I0318 13:15:03.598255 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44b63712-d649-4905-ba44-0cd603fcb714-config-volume\") pod \"44b63712-d649-4905-ba44-0cd603fcb714\" (UID: \"44b63712-d649-4905-ba44-0cd603fcb714\") " Mar 18 13:15:03 crc kubenswrapper[4975]: I0318 13:15:03.599099 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44b63712-d649-4905-ba44-0cd603fcb714-config-volume" (OuterVolumeSpecName: "config-volume") pod "44b63712-d649-4905-ba44-0cd603fcb714" (UID: "44b63712-d649-4905-ba44-0cd603fcb714"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:15:03 crc kubenswrapper[4975]: I0318 13:15:03.606524 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44b63712-d649-4905-ba44-0cd603fcb714-kube-api-access-2wwlw" (OuterVolumeSpecName: "kube-api-access-2wwlw") pod "44b63712-d649-4905-ba44-0cd603fcb714" (UID: "44b63712-d649-4905-ba44-0cd603fcb714"). 
InnerVolumeSpecName "kube-api-access-2wwlw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:15:03 crc kubenswrapper[4975]: I0318 13:15:03.606677 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b63712-d649-4905-ba44-0cd603fcb714-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "44b63712-d649-4905-ba44-0cd603fcb714" (UID: "44b63712-d649-4905-ba44-0cd603fcb714"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:15:03 crc kubenswrapper[4975]: I0318 13:15:03.700818 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wwlw\" (UniqueName: \"kubernetes.io/projected/44b63712-d649-4905-ba44-0cd603fcb714-kube-api-access-2wwlw\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:03 crc kubenswrapper[4975]: I0318 13:15:03.700885 4975 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/44b63712-d649-4905-ba44-0cd603fcb714-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:03 crc kubenswrapper[4975]: I0318 13:15:03.700901 4975 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44b63712-d649-4905-ba44-0cd603fcb714-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:04 crc kubenswrapper[4975]: I0318 13:15:04.104981 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-tf42l" event={"ID":"44b63712-d649-4905-ba44-0cd603fcb714","Type":"ContainerDied","Data":"0675c01599a1f6aac2f59e2908c239594e41e54975404cf855f7070e262d1a2a"} Mar 18 13:15:04 crc kubenswrapper[4975]: I0318 13:15:04.105028 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0675c01599a1f6aac2f59e2908c239594e41e54975404cf855f7070e262d1a2a" Mar 18 13:15:04 crc kubenswrapper[4975]: I0318 13:15:04.105090 4975 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563995-tf42l" Mar 18 13:15:04 crc kubenswrapper[4975]: I0318 13:15:04.521029 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563950-26v6d"] Mar 18 13:15:04 crc kubenswrapper[4975]: I0318 13:15:04.529556 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563950-26v6d"] Mar 18 13:15:05 crc kubenswrapper[4975]: I0318 13:15:05.028824 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="565e8dc1-44b2-4dd9-9653-950d50c6d914" path="/var/lib/kubelet/pods/565e8dc1-44b2-4dd9-9653-950d50c6d914/volumes" Mar 18 13:15:14 crc kubenswrapper[4975]: I0318 13:15:14.016731 4975 scope.go:117] "RemoveContainer" containerID="d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3" Mar 18 13:15:14 crc kubenswrapper[4975]: E0318 13:15:14.017573 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:15:28 crc kubenswrapper[4975]: I0318 13:15:28.329732 4975 generic.go:334] "Generic (PLEG): container finished" podID="60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492" containerID="74a241acb9d453851c684e3b4c5e5d2492ed13cc4b502f58151581d4a3c4f374" exitCode=2 Mar 18 13:15:28 crc kubenswrapper[4975]: I0318 13:15:28.329839 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm" 
event={"ID":"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492","Type":"ContainerDied","Data":"74a241acb9d453851c684e3b4c5e5d2492ed13cc4b502f58151581d4a3c4f374"} Mar 18 13:15:29 crc kubenswrapper[4975]: I0318 13:15:29.017044 4975 scope.go:117] "RemoveContainer" containerID="d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3" Mar 18 13:15:29 crc kubenswrapper[4975]: E0318 13:15:29.017370 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:15:29 crc kubenswrapper[4975]: I0318 13:15:29.839984 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm" Mar 18 13:15:29 crc kubenswrapper[4975]: I0318 13:15:29.999437 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-ssh-key-openstack-edpm-ipam\") pod \"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492\" (UID: \"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492\") " Mar 18 13:15:29 crc kubenswrapper[4975]: I0318 13:15:29.999508 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-libvirt-secret-0\") pod \"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492\" (UID: \"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492\") " Mar 18 13:15:29 crc kubenswrapper[4975]: I0318 13:15:29.999689 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn2bt\" (UniqueName: 
\"kubernetes.io/projected/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-kube-api-access-rn2bt\") pod \"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492\" (UID: \"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492\") " Mar 18 13:15:29 crc kubenswrapper[4975]: I0318 13:15:29.999822 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-inventory\") pod \"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492\" (UID: \"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492\") " Mar 18 13:15:30 crc kubenswrapper[4975]: I0318 13:15:29.999886 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-libvirt-combined-ca-bundle\") pod \"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492\" (UID: \"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492\") " Mar 18 13:15:30 crc kubenswrapper[4975]: I0318 13:15:30.004992 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-kube-api-access-rn2bt" (OuterVolumeSpecName: "kube-api-access-rn2bt") pod "60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492" (UID: "60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492"). InnerVolumeSpecName "kube-api-access-rn2bt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:15:30 crc kubenswrapper[4975]: I0318 13:15:30.009980 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492" (UID: "60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:15:30 crc kubenswrapper[4975]: I0318 13:15:30.025894 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492" (UID: "60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:15:30 crc kubenswrapper[4975]: I0318 13:15:30.028402 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-inventory" (OuterVolumeSpecName: "inventory") pod "60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492" (UID: "60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:15:30 crc kubenswrapper[4975]: I0318 13:15:30.030954 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492" (UID: "60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:15:30 crc kubenswrapper[4975]: I0318 13:15:30.102246 4975 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:30 crc kubenswrapper[4975]: I0318 13:15:30.102283 4975 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:30 crc kubenswrapper[4975]: I0318 13:15:30.102291 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn2bt\" (UniqueName: \"kubernetes.io/projected/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-kube-api-access-rn2bt\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:30 crc kubenswrapper[4975]: I0318 13:15:30.102302 4975 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:30 crc kubenswrapper[4975]: I0318 13:15:30.102311 4975 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 13:15:30 crc kubenswrapper[4975]: I0318 13:15:30.348881 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm" event={"ID":"60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492","Type":"ContainerDied","Data":"e85b998a0e79b474746e0e64f50f75208e5800b07245a3cbcb7d78ccd58e3e37"} Mar 18 13:15:30 crc kubenswrapper[4975]: I0318 13:15:30.348934 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e85b998a0e79b474746e0e64f50f75208e5800b07245a3cbcb7d78ccd58e3e37" Mar 18 
13:15:30 crc kubenswrapper[4975]: I0318 13:15:30.348961 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm" Mar 18 13:15:41 crc kubenswrapper[4975]: I0318 13:15:41.017143 4975 scope.go:117] "RemoveContainer" containerID="d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3" Mar 18 13:15:41 crc kubenswrapper[4975]: E0318 13:15:41.018691 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:15:49 crc kubenswrapper[4975]: I0318 13:15:49.230080 4975 scope.go:117] "RemoveContainer" containerID="e6b17942904af52284aa23e5b388e902ee8ecc96432b3da5e06998c86a351b9d" Mar 18 13:15:55 crc kubenswrapper[4975]: I0318 13:15:55.029664 4975 scope.go:117] "RemoveContainer" containerID="d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3" Mar 18 13:15:55 crc kubenswrapper[4975]: E0318 13:15:55.030657 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:16:00 crc kubenswrapper[4975]: I0318 13:16:00.156068 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563996-ct4tw"] Mar 18 13:16:00 crc kubenswrapper[4975]: E0318 13:16:00.156978 4975 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 13:16:00 crc kubenswrapper[4975]: I0318 13:16:00.156994 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 13:16:00 crc kubenswrapper[4975]: E0318 13:16:00.157018 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44b63712-d649-4905-ba44-0cd603fcb714" containerName="collect-profiles" Mar 18 13:16:00 crc kubenswrapper[4975]: I0318 13:16:00.157025 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="44b63712-d649-4905-ba44-0cd603fcb714" containerName="collect-profiles" Mar 18 13:16:00 crc kubenswrapper[4975]: I0318 13:16:00.157235 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="44b63712-d649-4905-ba44-0cd603fcb714" containerName="collect-profiles" Mar 18 13:16:00 crc kubenswrapper[4975]: I0318 13:16:00.157262 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 18 13:16:00 crc kubenswrapper[4975]: I0318 13:16:00.157852 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563996-ct4tw" Mar 18 13:16:00 crc kubenswrapper[4975]: I0318 13:16:00.164848 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563996-ct4tw"] Mar 18 13:16:00 crc kubenswrapper[4975]: I0318 13:16:00.199481 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:16:00 crc kubenswrapper[4975]: I0318 13:16:00.199649 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:16:00 crc kubenswrapper[4975]: I0318 13:16:00.199810 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 13:16:00 crc kubenswrapper[4975]: I0318 13:16:00.314415 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p9xj\" (UniqueName: \"kubernetes.io/projected/98444df3-0049-4f94-ac53-10a8e2801803-kube-api-access-2p9xj\") pod \"auto-csr-approver-29563996-ct4tw\" (UID: \"98444df3-0049-4f94-ac53-10a8e2801803\") " pod="openshift-infra/auto-csr-approver-29563996-ct4tw" Mar 18 13:16:00 crc kubenswrapper[4975]: I0318 13:16:00.416634 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p9xj\" (UniqueName: \"kubernetes.io/projected/98444df3-0049-4f94-ac53-10a8e2801803-kube-api-access-2p9xj\") pod \"auto-csr-approver-29563996-ct4tw\" (UID: \"98444df3-0049-4f94-ac53-10a8e2801803\") " pod="openshift-infra/auto-csr-approver-29563996-ct4tw" Mar 18 13:16:00 crc kubenswrapper[4975]: I0318 13:16:00.434401 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p9xj\" (UniqueName: \"kubernetes.io/projected/98444df3-0049-4f94-ac53-10a8e2801803-kube-api-access-2p9xj\") pod \"auto-csr-approver-29563996-ct4tw\" (UID: \"98444df3-0049-4f94-ac53-10a8e2801803\") " 
pod="openshift-infra/auto-csr-approver-29563996-ct4tw" Mar 18 13:16:00 crc kubenswrapper[4975]: I0318 13:16:00.530750 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563996-ct4tw" Mar 18 13:16:00 crc kubenswrapper[4975]: I0318 13:16:00.977595 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563996-ct4tw"] Mar 18 13:16:01 crc kubenswrapper[4975]: I0318 13:16:01.641223 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563996-ct4tw" event={"ID":"98444df3-0049-4f94-ac53-10a8e2801803","Type":"ContainerStarted","Data":"c8237119e80071771d8343f7b4a1dc13784a7c4d0c47fa4b9814f6ad4c4a1775"} Mar 18 13:16:03 crc kubenswrapper[4975]: I0318 13:16:03.658543 4975 generic.go:334] "Generic (PLEG): container finished" podID="98444df3-0049-4f94-ac53-10a8e2801803" containerID="d826d199efabc11f7632a1d31c271bd8cd911d373d88660a1a8efb1bf8398c7e" exitCode=0 Mar 18 13:16:03 crc kubenswrapper[4975]: I0318 13:16:03.658607 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563996-ct4tw" event={"ID":"98444df3-0049-4f94-ac53-10a8e2801803","Type":"ContainerDied","Data":"d826d199efabc11f7632a1d31c271bd8cd911d373d88660a1a8efb1bf8398c7e"} Mar 18 13:16:05 crc kubenswrapper[4975]: I0318 13:16:05.195050 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563996-ct4tw" Mar 18 13:16:05 crc kubenswrapper[4975]: I0318 13:16:05.308379 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p9xj\" (UniqueName: \"kubernetes.io/projected/98444df3-0049-4f94-ac53-10a8e2801803-kube-api-access-2p9xj\") pod \"98444df3-0049-4f94-ac53-10a8e2801803\" (UID: \"98444df3-0049-4f94-ac53-10a8e2801803\") " Mar 18 13:16:05 crc kubenswrapper[4975]: I0318 13:16:05.313913 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98444df3-0049-4f94-ac53-10a8e2801803-kube-api-access-2p9xj" (OuterVolumeSpecName: "kube-api-access-2p9xj") pod "98444df3-0049-4f94-ac53-10a8e2801803" (UID: "98444df3-0049-4f94-ac53-10a8e2801803"). InnerVolumeSpecName "kube-api-access-2p9xj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:16:05 crc kubenswrapper[4975]: I0318 13:16:05.410205 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2p9xj\" (UniqueName: \"kubernetes.io/projected/98444df3-0049-4f94-ac53-10a8e2801803-kube-api-access-2p9xj\") on node \"crc\" DevicePath \"\"" Mar 18 13:16:05 crc kubenswrapper[4975]: I0318 13:16:05.675998 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563996-ct4tw" event={"ID":"98444df3-0049-4f94-ac53-10a8e2801803","Type":"ContainerDied","Data":"c8237119e80071771d8343f7b4a1dc13784a7c4d0c47fa4b9814f6ad4c4a1775"} Mar 18 13:16:05 crc kubenswrapper[4975]: I0318 13:16:05.676305 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8237119e80071771d8343f7b4a1dc13784a7c4d0c47fa4b9814f6ad4c4a1775" Mar 18 13:16:05 crc kubenswrapper[4975]: I0318 13:16:05.676092 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563996-ct4tw" Mar 18 13:16:06 crc kubenswrapper[4975]: I0318 13:16:06.280596 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563990-wnkm2"] Mar 18 13:16:06 crc kubenswrapper[4975]: I0318 13:16:06.288281 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563990-wnkm2"] Mar 18 13:16:07 crc kubenswrapper[4975]: I0318 13:16:07.027960 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="151a5e6d-1835-4362-97d5-ecc9d5e22843" path="/var/lib/kubelet/pods/151a5e6d-1835-4362-97d5-ecc9d5e22843/volumes" Mar 18 13:16:09 crc kubenswrapper[4975]: I0318 13:16:09.016681 4975 scope.go:117] "RemoveContainer" containerID="d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3" Mar 18 13:16:09 crc kubenswrapper[4975]: I0318 13:16:09.730383 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerStarted","Data":"a6d49e88f39747f8437850d0e06984a0c76dfd4bda387e9721407ad29e85a872"} Mar 18 13:16:49 crc kubenswrapper[4975]: I0318 13:16:49.295345 4975 scope.go:117] "RemoveContainer" containerID="656c57f54094e5b48e1c433e74566c7910d037492cc4c687368a8a4d4b9e96b2" Mar 18 13:18:00 crc kubenswrapper[4975]: I0318 13:18:00.151759 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563998-ngkj4"] Mar 18 13:18:00 crc kubenswrapper[4975]: E0318 13:18:00.153511 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98444df3-0049-4f94-ac53-10a8e2801803" containerName="oc" Mar 18 13:18:00 crc kubenswrapper[4975]: I0318 13:18:00.153533 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="98444df3-0049-4f94-ac53-10a8e2801803" containerName="oc" Mar 18 13:18:00 crc kubenswrapper[4975]: I0318 13:18:00.154823 4975 
memory_manager.go:354] "RemoveStaleState removing state" podUID="98444df3-0049-4f94-ac53-10a8e2801803" containerName="oc" Mar 18 13:18:00 crc kubenswrapper[4975]: I0318 13:18:00.155679 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563998-ngkj4" Mar 18 13:18:00 crc kubenswrapper[4975]: I0318 13:18:00.170138 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563998-ngkj4"] Mar 18 13:18:00 crc kubenswrapper[4975]: I0318 13:18:00.188592 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:18:00 crc kubenswrapper[4975]: I0318 13:18:00.188640 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:18:00 crc kubenswrapper[4975]: I0318 13:18:00.188603 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 13:18:00 crc kubenswrapper[4975]: I0318 13:18:00.250487 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dwxt\" (UniqueName: \"kubernetes.io/projected/ad5383b1-c31d-4a26-b102-9a808885fb6a-kube-api-access-4dwxt\") pod \"auto-csr-approver-29563998-ngkj4\" (UID: \"ad5383b1-c31d-4a26-b102-9a808885fb6a\") " pod="openshift-infra/auto-csr-approver-29563998-ngkj4" Mar 18 13:18:00 crc kubenswrapper[4975]: I0318 13:18:00.352908 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dwxt\" (UniqueName: \"kubernetes.io/projected/ad5383b1-c31d-4a26-b102-9a808885fb6a-kube-api-access-4dwxt\") pod \"auto-csr-approver-29563998-ngkj4\" (UID: \"ad5383b1-c31d-4a26-b102-9a808885fb6a\") " pod="openshift-infra/auto-csr-approver-29563998-ngkj4" Mar 18 13:18:00 crc kubenswrapper[4975]: I0318 13:18:00.375445 4975 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-4dwxt\" (UniqueName: \"kubernetes.io/projected/ad5383b1-c31d-4a26-b102-9a808885fb6a-kube-api-access-4dwxt\") pod \"auto-csr-approver-29563998-ngkj4\" (UID: \"ad5383b1-c31d-4a26-b102-9a808885fb6a\") " pod="openshift-infra/auto-csr-approver-29563998-ngkj4" Mar 18 13:18:00 crc kubenswrapper[4975]: I0318 13:18:00.515159 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563998-ngkj4" Mar 18 13:18:00 crc kubenswrapper[4975]: I0318 13:18:00.939812 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563998-ngkj4"] Mar 18 13:18:01 crc kubenswrapper[4975]: I0318 13:18:01.703388 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563998-ngkj4" event={"ID":"ad5383b1-c31d-4a26-b102-9a808885fb6a","Type":"ContainerStarted","Data":"bd440a559c27a23d4d335033306f0d395c98d5c7d005d05efe7636401edf2afb"} Mar 18 13:18:03 crc kubenswrapper[4975]: I0318 13:18:03.726121 4975 generic.go:334] "Generic (PLEG): container finished" podID="ad5383b1-c31d-4a26-b102-9a808885fb6a" containerID="96695f0b407dc7d00c6afe4eb65c3545954c7adc260d04a852b221fab8a2a2c9" exitCode=0 Mar 18 13:18:03 crc kubenswrapper[4975]: I0318 13:18:03.726183 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563998-ngkj4" event={"ID":"ad5383b1-c31d-4a26-b102-9a808885fb6a","Type":"ContainerDied","Data":"96695f0b407dc7d00c6afe4eb65c3545954c7adc260d04a852b221fab8a2a2c9"} Mar 18 13:18:05 crc kubenswrapper[4975]: I0318 13:18:05.079250 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563998-ngkj4" Mar 18 13:18:05 crc kubenswrapper[4975]: I0318 13:18:05.145435 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dwxt\" (UniqueName: \"kubernetes.io/projected/ad5383b1-c31d-4a26-b102-9a808885fb6a-kube-api-access-4dwxt\") pod \"ad5383b1-c31d-4a26-b102-9a808885fb6a\" (UID: \"ad5383b1-c31d-4a26-b102-9a808885fb6a\") " Mar 18 13:18:05 crc kubenswrapper[4975]: I0318 13:18:05.151404 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad5383b1-c31d-4a26-b102-9a808885fb6a-kube-api-access-4dwxt" (OuterVolumeSpecName: "kube-api-access-4dwxt") pod "ad5383b1-c31d-4a26-b102-9a808885fb6a" (UID: "ad5383b1-c31d-4a26-b102-9a808885fb6a"). InnerVolumeSpecName "kube-api-access-4dwxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:18:05 crc kubenswrapper[4975]: I0318 13:18:05.247368 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dwxt\" (UniqueName: \"kubernetes.io/projected/ad5383b1-c31d-4a26-b102-9a808885fb6a-kube-api-access-4dwxt\") on node \"crc\" DevicePath \"\"" Mar 18 13:18:05 crc kubenswrapper[4975]: I0318 13:18:05.743100 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563998-ngkj4" event={"ID":"ad5383b1-c31d-4a26-b102-9a808885fb6a","Type":"ContainerDied","Data":"bd440a559c27a23d4d335033306f0d395c98d5c7d005d05efe7636401edf2afb"} Mar 18 13:18:05 crc kubenswrapper[4975]: I0318 13:18:05.743155 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd440a559c27a23d4d335033306f0d395c98d5c7d005d05efe7636401edf2afb" Mar 18 13:18:05 crc kubenswrapper[4975]: I0318 13:18:05.743171 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563998-ngkj4" Mar 18 13:18:06 crc kubenswrapper[4975]: I0318 13:18:06.166222 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563992-7q8vw"] Mar 18 13:18:06 crc kubenswrapper[4975]: I0318 13:18:06.182527 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563992-7q8vw"] Mar 18 13:18:07 crc kubenswrapper[4975]: I0318 13:18:07.030059 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7747fe7-e1e2-4a73-b4e6-6d7407fb0ed9" path="/var/lib/kubelet/pods/d7747fe7-e1e2-4a73-b4e6-6d7407fb0ed9/volumes" Mar 18 13:18:25 crc kubenswrapper[4975]: I0318 13:18:25.539262 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:18:25 crc kubenswrapper[4975]: I0318 13:18:25.539711 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:18:49 crc kubenswrapper[4975]: I0318 13:18:49.406287 4975 scope.go:117] "RemoveContainer" containerID="523c2dba508106ee79fccf6321b9ce35d52b0de61bc216862c083bb99a04facf" Mar 18 13:18:55 crc kubenswrapper[4975]: I0318 13:18:55.539261 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:18:55 crc kubenswrapper[4975]: 
I0318 13:18:55.539996 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:19:25 crc kubenswrapper[4975]: I0318 13:19:25.539618 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:19:25 crc kubenswrapper[4975]: I0318 13:19:25.541070 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:19:25 crc kubenswrapper[4975]: I0318 13:19:25.541158 4975 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 13:19:25 crc kubenswrapper[4975]: I0318 13:19:25.542135 4975 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a6d49e88f39747f8437850d0e06984a0c76dfd4bda387e9721407ad29e85a872"} pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:19:25 crc kubenswrapper[4975]: I0318 13:19:25.542205 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" 
containerName="machine-config-daemon" containerID="cri-o://a6d49e88f39747f8437850d0e06984a0c76dfd4bda387e9721407ad29e85a872" gracePeriod=600 Mar 18 13:19:26 crc kubenswrapper[4975]: I0318 13:19:26.485397 4975 generic.go:334] "Generic (PLEG): container finished" podID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerID="a6d49e88f39747f8437850d0e06984a0c76dfd4bda387e9721407ad29e85a872" exitCode=0 Mar 18 13:19:26 crc kubenswrapper[4975]: I0318 13:19:26.485468 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerDied","Data":"a6d49e88f39747f8437850d0e06984a0c76dfd4bda387e9721407ad29e85a872"} Mar 18 13:19:26 crc kubenswrapper[4975]: I0318 13:19:26.485751 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerStarted","Data":"d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea"} Mar 18 13:19:26 crc kubenswrapper[4975]: I0318 13:19:26.485791 4975 scope.go:117] "RemoveContainer" containerID="d5b3dcdac753e884fe30a90299bf2855d65ccee5dc9eab05ceed374c8bf18eb3" Mar 18 13:20:00 crc kubenswrapper[4975]: I0318 13:20:00.146037 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564000-pcfw8"] Mar 18 13:20:00 crc kubenswrapper[4975]: E0318 13:20:00.147289 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad5383b1-c31d-4a26-b102-9a808885fb6a" containerName="oc" Mar 18 13:20:00 crc kubenswrapper[4975]: I0318 13:20:00.147310 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad5383b1-c31d-4a26-b102-9a808885fb6a" containerName="oc" Mar 18 13:20:00 crc kubenswrapper[4975]: I0318 13:20:00.147584 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad5383b1-c31d-4a26-b102-9a808885fb6a" containerName="oc" Mar 18 13:20:00 
crc kubenswrapper[4975]: I0318 13:20:00.148498 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564000-pcfw8" Mar 18 13:20:00 crc kubenswrapper[4975]: I0318 13:20:00.155516 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564000-pcfw8"] Mar 18 13:20:00 crc kubenswrapper[4975]: I0318 13:20:00.155852 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:20:00 crc kubenswrapper[4975]: I0318 13:20:00.155914 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:20:00 crc kubenswrapper[4975]: I0318 13:20:00.156197 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 13:20:00 crc kubenswrapper[4975]: I0318 13:20:00.252140 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sklb7\" (UniqueName: \"kubernetes.io/projected/86b9680b-512d-497d-9b53-366daa112aa3-kube-api-access-sklb7\") pod \"auto-csr-approver-29564000-pcfw8\" (UID: \"86b9680b-512d-497d-9b53-366daa112aa3\") " pod="openshift-infra/auto-csr-approver-29564000-pcfw8" Mar 18 13:20:00 crc kubenswrapper[4975]: I0318 13:20:00.354493 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sklb7\" (UniqueName: \"kubernetes.io/projected/86b9680b-512d-497d-9b53-366daa112aa3-kube-api-access-sklb7\") pod \"auto-csr-approver-29564000-pcfw8\" (UID: \"86b9680b-512d-497d-9b53-366daa112aa3\") " pod="openshift-infra/auto-csr-approver-29564000-pcfw8" Mar 18 13:20:00 crc kubenswrapper[4975]: I0318 13:20:00.372602 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sklb7\" (UniqueName: \"kubernetes.io/projected/86b9680b-512d-497d-9b53-366daa112aa3-kube-api-access-sklb7\") 
pod \"auto-csr-approver-29564000-pcfw8\" (UID: \"86b9680b-512d-497d-9b53-366daa112aa3\") " pod="openshift-infra/auto-csr-approver-29564000-pcfw8" Mar 18 13:20:00 crc kubenswrapper[4975]: I0318 13:20:00.473979 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564000-pcfw8" Mar 18 13:20:00 crc kubenswrapper[4975]: I0318 13:20:00.926388 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564000-pcfw8"] Mar 18 13:20:00 crc kubenswrapper[4975]: I0318 13:20:00.935085 4975 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:20:01 crc kubenswrapper[4975]: I0318 13:20:01.840543 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564000-pcfw8" event={"ID":"86b9680b-512d-497d-9b53-366daa112aa3","Type":"ContainerStarted","Data":"4264477f69ec1668d215b847deb9f75c2c210a441f3cc2ccd0a7060d58983d1a"} Mar 18 13:20:02 crc kubenswrapper[4975]: I0318 13:20:02.850473 4975 generic.go:334] "Generic (PLEG): container finished" podID="86b9680b-512d-497d-9b53-366daa112aa3" containerID="005f42abe3c2d6c8cea7de7fe74c6adf6de64e00397b46f167d2ee4f01d7cd68" exitCode=0 Mar 18 13:20:02 crc kubenswrapper[4975]: I0318 13:20:02.850529 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564000-pcfw8" event={"ID":"86b9680b-512d-497d-9b53-366daa112aa3","Type":"ContainerDied","Data":"005f42abe3c2d6c8cea7de7fe74c6adf6de64e00397b46f167d2ee4f01d7cd68"} Mar 18 13:20:04 crc kubenswrapper[4975]: I0318 13:20:04.871601 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564000-pcfw8" event={"ID":"86b9680b-512d-497d-9b53-366daa112aa3","Type":"ContainerDied","Data":"4264477f69ec1668d215b847deb9f75c2c210a441f3cc2ccd0a7060d58983d1a"} Mar 18 13:20:04 crc kubenswrapper[4975]: I0318 13:20:04.871948 4975 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="4264477f69ec1668d215b847deb9f75c2c210a441f3cc2ccd0a7060d58983d1a" Mar 18 13:20:04 crc kubenswrapper[4975]: I0318 13:20:04.880291 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564000-pcfw8" Mar 18 13:20:05 crc kubenswrapper[4975]: I0318 13:20:05.060005 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sklb7\" (UniqueName: \"kubernetes.io/projected/86b9680b-512d-497d-9b53-366daa112aa3-kube-api-access-sklb7\") pod \"86b9680b-512d-497d-9b53-366daa112aa3\" (UID: \"86b9680b-512d-497d-9b53-366daa112aa3\") " Mar 18 13:20:05 crc kubenswrapper[4975]: I0318 13:20:05.065984 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86b9680b-512d-497d-9b53-366daa112aa3-kube-api-access-sklb7" (OuterVolumeSpecName: "kube-api-access-sklb7") pod "86b9680b-512d-497d-9b53-366daa112aa3" (UID: "86b9680b-512d-497d-9b53-366daa112aa3"). InnerVolumeSpecName "kube-api-access-sklb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:20:05 crc kubenswrapper[4975]: I0318 13:20:05.163774 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sklb7\" (UniqueName: \"kubernetes.io/projected/86b9680b-512d-497d-9b53-366daa112aa3-kube-api-access-sklb7\") on node \"crc\" DevicePath \"\"" Mar 18 13:20:05 crc kubenswrapper[4975]: I0318 13:20:05.884780 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564000-pcfw8" Mar 18 13:20:05 crc kubenswrapper[4975]: I0318 13:20:05.971561 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563994-p6gb2"] Mar 18 13:20:05 crc kubenswrapper[4975]: I0318 13:20:05.988766 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563994-p6gb2"] Mar 18 13:20:07 crc kubenswrapper[4975]: I0318 13:20:07.032459 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ace5d76e-1744-4242-99bc-fe37ade2daf0" path="/var/lib/kubelet/pods/ace5d76e-1744-4242-99bc-fe37ade2daf0/volumes" Mar 18 13:20:49 crc kubenswrapper[4975]: I0318 13:20:49.496392 4975 scope.go:117] "RemoveContainer" containerID="f87d0433c2645e208caafd4830582c83e1c2c537c13edb98378b6fd0a9abe615" Mar 18 13:21:25 crc kubenswrapper[4975]: I0318 13:21:25.539185 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:21:25 crc kubenswrapper[4975]: I0318 13:21:25.539953 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:21:55 crc kubenswrapper[4975]: I0318 13:21:55.539183 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:21:55 crc kubenswrapper[4975]: 
I0318 13:21:55.541418 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:22:00 crc kubenswrapper[4975]: I0318 13:22:00.157434 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564002-7k22c"] Mar 18 13:22:00 crc kubenswrapper[4975]: E0318 13:22:00.158481 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b9680b-512d-497d-9b53-366daa112aa3" containerName="oc" Mar 18 13:22:00 crc kubenswrapper[4975]: I0318 13:22:00.158500 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b9680b-512d-497d-9b53-366daa112aa3" containerName="oc" Mar 18 13:22:00 crc kubenswrapper[4975]: I0318 13:22:00.158769 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="86b9680b-512d-497d-9b53-366daa112aa3" containerName="oc" Mar 18 13:22:00 crc kubenswrapper[4975]: I0318 13:22:00.159541 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564002-7k22c" Mar 18 13:22:00 crc kubenswrapper[4975]: I0318 13:22:00.163294 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 13:22:00 crc kubenswrapper[4975]: I0318 13:22:00.166702 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:22:00 crc kubenswrapper[4975]: I0318 13:22:00.168786 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:22:00 crc kubenswrapper[4975]: I0318 13:22:00.169619 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564002-7k22c"] Mar 18 13:22:00 crc kubenswrapper[4975]: I0318 13:22:00.310161 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jjjt\" (UniqueName: \"kubernetes.io/projected/a5e8a646-a265-44f2-80b2-1d37edcb50b7-kube-api-access-9jjjt\") pod \"auto-csr-approver-29564002-7k22c\" (UID: \"a5e8a646-a265-44f2-80b2-1d37edcb50b7\") " pod="openshift-infra/auto-csr-approver-29564002-7k22c" Mar 18 13:22:00 crc kubenswrapper[4975]: I0318 13:22:00.413193 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jjjt\" (UniqueName: \"kubernetes.io/projected/a5e8a646-a265-44f2-80b2-1d37edcb50b7-kube-api-access-9jjjt\") pod \"auto-csr-approver-29564002-7k22c\" (UID: \"a5e8a646-a265-44f2-80b2-1d37edcb50b7\") " pod="openshift-infra/auto-csr-approver-29564002-7k22c" Mar 18 13:22:00 crc kubenswrapper[4975]: I0318 13:22:00.431415 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jjjt\" (UniqueName: \"kubernetes.io/projected/a5e8a646-a265-44f2-80b2-1d37edcb50b7-kube-api-access-9jjjt\") pod \"auto-csr-approver-29564002-7k22c\" (UID: \"a5e8a646-a265-44f2-80b2-1d37edcb50b7\") " 
pod="openshift-infra/auto-csr-approver-29564002-7k22c" Mar 18 13:22:00 crc kubenswrapper[4975]: I0318 13:22:00.479154 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564002-7k22c" Mar 18 13:22:00 crc kubenswrapper[4975]: I0318 13:22:00.983917 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564002-7k22c"] Mar 18 13:22:02 crc kubenswrapper[4975]: I0318 13:22:02.049950 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564002-7k22c" event={"ID":"a5e8a646-a265-44f2-80b2-1d37edcb50b7","Type":"ContainerStarted","Data":"7d3131979a7372ae3650eb54e654eac839c099c5bfeed720bd62c18fe198afbb"} Mar 18 13:22:03 crc kubenswrapper[4975]: I0318 13:22:03.061502 4975 generic.go:334] "Generic (PLEG): container finished" podID="a5e8a646-a265-44f2-80b2-1d37edcb50b7" containerID="5fac66a63aefb4e57083d7aa4ad774593d6976eec2976ab2804bec85e12e0bf9" exitCode=0 Mar 18 13:22:03 crc kubenswrapper[4975]: I0318 13:22:03.061565 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564002-7k22c" event={"ID":"a5e8a646-a265-44f2-80b2-1d37edcb50b7","Type":"ContainerDied","Data":"5fac66a63aefb4e57083d7aa4ad774593d6976eec2976ab2804bec85e12e0bf9"} Mar 18 13:22:04 crc kubenswrapper[4975]: I0318 13:22:04.377381 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564002-7k22c" Mar 18 13:22:04 crc kubenswrapper[4975]: I0318 13:22:04.487074 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jjjt\" (UniqueName: \"kubernetes.io/projected/a5e8a646-a265-44f2-80b2-1d37edcb50b7-kube-api-access-9jjjt\") pod \"a5e8a646-a265-44f2-80b2-1d37edcb50b7\" (UID: \"a5e8a646-a265-44f2-80b2-1d37edcb50b7\") " Mar 18 13:22:04 crc kubenswrapper[4975]: I0318 13:22:04.491776 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5e8a646-a265-44f2-80b2-1d37edcb50b7-kube-api-access-9jjjt" (OuterVolumeSpecName: "kube-api-access-9jjjt") pod "a5e8a646-a265-44f2-80b2-1d37edcb50b7" (UID: "a5e8a646-a265-44f2-80b2-1d37edcb50b7"). InnerVolumeSpecName "kube-api-access-9jjjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:22:04 crc kubenswrapper[4975]: I0318 13:22:04.589330 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jjjt\" (UniqueName: \"kubernetes.io/projected/a5e8a646-a265-44f2-80b2-1d37edcb50b7-kube-api-access-9jjjt\") on node \"crc\" DevicePath \"\"" Mar 18 13:22:05 crc kubenswrapper[4975]: I0318 13:22:05.081029 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564002-7k22c" event={"ID":"a5e8a646-a265-44f2-80b2-1d37edcb50b7","Type":"ContainerDied","Data":"7d3131979a7372ae3650eb54e654eac839c099c5bfeed720bd62c18fe198afbb"} Mar 18 13:22:05 crc kubenswrapper[4975]: I0318 13:22:05.081386 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d3131979a7372ae3650eb54e654eac839c099c5bfeed720bd62c18fe198afbb" Mar 18 13:22:05 crc kubenswrapper[4975]: I0318 13:22:05.081124 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564002-7k22c" Mar 18 13:22:05 crc kubenswrapper[4975]: I0318 13:22:05.459689 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563996-ct4tw"] Mar 18 13:22:05 crc kubenswrapper[4975]: I0318 13:22:05.468115 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563996-ct4tw"] Mar 18 13:22:07 crc kubenswrapper[4975]: I0318 13:22:07.029503 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98444df3-0049-4f94-ac53-10a8e2801803" path="/var/lib/kubelet/pods/98444df3-0049-4f94-ac53-10a8e2801803/volumes" Mar 18 13:22:18 crc kubenswrapper[4975]: I0318 13:22:18.866630 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cfw7r"] Mar 18 13:22:18 crc kubenswrapper[4975]: E0318 13:22:18.867605 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e8a646-a265-44f2-80b2-1d37edcb50b7" containerName="oc" Mar 18 13:22:18 crc kubenswrapper[4975]: I0318 13:22:18.867622 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e8a646-a265-44f2-80b2-1d37edcb50b7" containerName="oc" Mar 18 13:22:18 crc kubenswrapper[4975]: I0318 13:22:18.867882 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5e8a646-a265-44f2-80b2-1d37edcb50b7" containerName="oc" Mar 18 13:22:18 crc kubenswrapper[4975]: I0318 13:22:18.869444 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cfw7r" Mar 18 13:22:18 crc kubenswrapper[4975]: I0318 13:22:18.875637 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cfw7r"] Mar 18 13:22:19 crc kubenswrapper[4975]: I0318 13:22:19.065069 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e-utilities\") pod \"community-operators-cfw7r\" (UID: \"4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e\") " pod="openshift-marketplace/community-operators-cfw7r" Mar 18 13:22:19 crc kubenswrapper[4975]: I0318 13:22:19.065126 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e-catalog-content\") pod \"community-operators-cfw7r\" (UID: \"4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e\") " pod="openshift-marketplace/community-operators-cfw7r" Mar 18 13:22:19 crc kubenswrapper[4975]: I0318 13:22:19.065179 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsjjn\" (UniqueName: \"kubernetes.io/projected/4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e-kube-api-access-qsjjn\") pod \"community-operators-cfw7r\" (UID: \"4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e\") " pod="openshift-marketplace/community-operators-cfw7r" Mar 18 13:22:19 crc kubenswrapper[4975]: I0318 13:22:19.166697 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e-utilities\") pod \"community-operators-cfw7r\" (UID: \"4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e\") " pod="openshift-marketplace/community-operators-cfw7r" Mar 18 13:22:19 crc kubenswrapper[4975]: I0318 13:22:19.166775 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e-catalog-content\") pod \"community-operators-cfw7r\" (UID: \"4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e\") " pod="openshift-marketplace/community-operators-cfw7r" Mar 18 13:22:19 crc kubenswrapper[4975]: I0318 13:22:19.166807 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsjjn\" (UniqueName: \"kubernetes.io/projected/4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e-kube-api-access-qsjjn\") pod \"community-operators-cfw7r\" (UID: \"4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e\") " pod="openshift-marketplace/community-operators-cfw7r" Mar 18 13:22:19 crc kubenswrapper[4975]: I0318 13:22:19.167405 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e-catalog-content\") pod \"community-operators-cfw7r\" (UID: \"4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e\") " pod="openshift-marketplace/community-operators-cfw7r" Mar 18 13:22:19 crc kubenswrapper[4975]: I0318 13:22:19.167850 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e-utilities\") pod \"community-operators-cfw7r\" (UID: \"4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e\") " pod="openshift-marketplace/community-operators-cfw7r" Mar 18 13:22:19 crc kubenswrapper[4975]: I0318 13:22:19.196653 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsjjn\" (UniqueName: \"kubernetes.io/projected/4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e-kube-api-access-qsjjn\") pod \"community-operators-cfw7r\" (UID: \"4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e\") " pod="openshift-marketplace/community-operators-cfw7r" Mar 18 13:22:19 crc kubenswrapper[4975]: I0318 13:22:19.200166 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cfw7r" Mar 18 13:22:19 crc kubenswrapper[4975]: I0318 13:22:19.691587 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cfw7r"] Mar 18 13:22:19 crc kubenswrapper[4975]: W0318 13:22:19.705029 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d04e1d5_0e0f_4379_a3b6_3d4872e88c9e.slice/crio-764d666b52a85782562f3f1403d2dedc19a25108dfe9b942ca2329ff62fed94d WatchSource:0}: Error finding container 764d666b52a85782562f3f1403d2dedc19a25108dfe9b942ca2329ff62fed94d: Status 404 returned error can't find the container with id 764d666b52a85782562f3f1403d2dedc19a25108dfe9b942ca2329ff62fed94d Mar 18 13:22:20 crc kubenswrapper[4975]: I0318 13:22:20.213027 4975 generic.go:334] "Generic (PLEG): container finished" podID="4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e" containerID="55b856f4b5fb325e847033abef904c508cb715e6be9eeeeeaddc470614cf78fc" exitCode=0 Mar 18 13:22:20 crc kubenswrapper[4975]: I0318 13:22:20.213128 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfw7r" event={"ID":"4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e","Type":"ContainerDied","Data":"55b856f4b5fb325e847033abef904c508cb715e6be9eeeeeaddc470614cf78fc"} Mar 18 13:22:20 crc kubenswrapper[4975]: I0318 13:22:20.213324 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfw7r" event={"ID":"4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e","Type":"ContainerStarted","Data":"764d666b52a85782562f3f1403d2dedc19a25108dfe9b942ca2329ff62fed94d"} Mar 18 13:22:23 crc kubenswrapper[4975]: I0318 13:22:23.244645 4975 generic.go:334] "Generic (PLEG): container finished" podID="4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e" containerID="8b2db7c157b518b96d024b35335732c978a40c5008560a02b52c9f4fd9615bd1" exitCode=0 Mar 18 13:22:23 crc kubenswrapper[4975]: I0318 
13:22:23.244736 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfw7r" event={"ID":"4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e","Type":"ContainerDied","Data":"8b2db7c157b518b96d024b35335732c978a40c5008560a02b52c9f4fd9615bd1"} Mar 18 13:22:24 crc kubenswrapper[4975]: I0318 13:22:24.256204 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfw7r" event={"ID":"4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e","Type":"ContainerStarted","Data":"4c08b9f78764fcc0574bff0cb549b68ba4460679399d37d093d0df381bdebb41"} Mar 18 13:22:24 crc kubenswrapper[4975]: I0318 13:22:24.278096 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cfw7r" podStartSLOduration=2.84649994 podStartE2EDuration="6.278049905s" podCreationTimestamp="2026-03-18 13:22:18 +0000 UTC" firstStartedPulling="2026-03-18 13:22:20.214560982 +0000 UTC m=+4325.928961561" lastFinishedPulling="2026-03-18 13:22:23.646110947 +0000 UTC m=+4329.360511526" observedRunningTime="2026-03-18 13:22:24.273606205 +0000 UTC m=+4329.988006784" watchObservedRunningTime="2026-03-18 13:22:24.278049905 +0000 UTC m=+4329.992450494" Mar 18 13:22:25 crc kubenswrapper[4975]: I0318 13:22:25.539004 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:22:25 crc kubenswrapper[4975]: I0318 13:22:25.539313 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:22:25 crc 
kubenswrapper[4975]: I0318 13:22:25.539366 4975 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 13:22:25 crc kubenswrapper[4975]: I0318 13:22:25.540087 4975 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea"} pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:22:25 crc kubenswrapper[4975]: I0318 13:22:25.540132 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" containerID="cri-o://d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" gracePeriod=600 Mar 18 13:22:25 crc kubenswrapper[4975]: E0318 13:22:25.679805 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:22:26 crc kubenswrapper[4975]: I0318 13:22:26.278989 4975 generic.go:334] "Generic (PLEG): container finished" podID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" exitCode=0 Mar 18 13:22:26 crc kubenswrapper[4975]: I0318 13:22:26.279082 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" 
event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerDied","Data":"d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea"} Mar 18 13:22:26 crc kubenswrapper[4975]: I0318 13:22:26.279375 4975 scope.go:117] "RemoveContainer" containerID="a6d49e88f39747f8437850d0e06984a0c76dfd4bda387e9721407ad29e85a872" Mar 18 13:22:26 crc kubenswrapper[4975]: I0318 13:22:26.280355 4975 scope.go:117] "RemoveContainer" containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" Mar 18 13:22:26 crc kubenswrapper[4975]: E0318 13:22:26.280982 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:22:29 crc kubenswrapper[4975]: I0318 13:22:29.201776 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cfw7r" Mar 18 13:22:29 crc kubenswrapper[4975]: I0318 13:22:29.202114 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cfw7r" Mar 18 13:22:29 crc kubenswrapper[4975]: I0318 13:22:29.250387 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cfw7r" Mar 18 13:22:29 crc kubenswrapper[4975]: I0318 13:22:29.355197 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cfw7r" Mar 18 13:22:29 crc kubenswrapper[4975]: I0318 13:22:29.488156 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cfw7r"] Mar 18 13:22:31 crc kubenswrapper[4975]: I0318 13:22:31.326419 
4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cfw7r" podUID="4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e" containerName="registry-server" containerID="cri-o://4c08b9f78764fcc0574bff0cb549b68ba4460679399d37d093d0df381bdebb41" gracePeriod=2 Mar 18 13:22:32 crc kubenswrapper[4975]: I0318 13:22:32.338228 4975 generic.go:334] "Generic (PLEG): container finished" podID="4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e" containerID="4c08b9f78764fcc0574bff0cb549b68ba4460679399d37d093d0df381bdebb41" exitCode=0 Mar 18 13:22:32 crc kubenswrapper[4975]: I0318 13:22:32.338405 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfw7r" event={"ID":"4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e","Type":"ContainerDied","Data":"4c08b9f78764fcc0574bff0cb549b68ba4460679399d37d093d0df381bdebb41"} Mar 18 13:22:32 crc kubenswrapper[4975]: I0318 13:22:32.453942 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cfw7r" Mar 18 13:22:32 crc kubenswrapper[4975]: I0318 13:22:32.528960 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsjjn\" (UniqueName: \"kubernetes.io/projected/4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e-kube-api-access-qsjjn\") pod \"4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e\" (UID: \"4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e\") " Mar 18 13:22:32 crc kubenswrapper[4975]: I0318 13:22:32.529274 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e-catalog-content\") pod \"4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e\" (UID: \"4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e\") " Mar 18 13:22:32 crc kubenswrapper[4975]: I0318 13:22:32.529401 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e-utilities\") pod \"4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e\" (UID: \"4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e\") " Mar 18 13:22:32 crc kubenswrapper[4975]: I0318 13:22:32.530488 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e-utilities" (OuterVolumeSpecName: "utilities") pod "4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e" (UID: "4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:22:32 crc kubenswrapper[4975]: I0318 13:22:32.537772 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e-kube-api-access-qsjjn" (OuterVolumeSpecName: "kube-api-access-qsjjn") pod "4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e" (UID: "4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e"). InnerVolumeSpecName "kube-api-access-qsjjn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:22:32 crc kubenswrapper[4975]: I0318 13:22:32.603288 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e" (UID: "4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:22:32 crc kubenswrapper[4975]: I0318 13:22:32.632638 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsjjn\" (UniqueName: \"kubernetes.io/projected/4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e-kube-api-access-qsjjn\") on node \"crc\" DevicePath \"\"" Mar 18 13:22:32 crc kubenswrapper[4975]: I0318 13:22:32.632837 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:22:32 crc kubenswrapper[4975]: I0318 13:22:32.632904 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:22:33 crc kubenswrapper[4975]: I0318 13:22:33.349391 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfw7r" event={"ID":"4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e","Type":"ContainerDied","Data":"764d666b52a85782562f3f1403d2dedc19a25108dfe9b942ca2329ff62fed94d"} Mar 18 13:22:33 crc kubenswrapper[4975]: I0318 13:22:33.349454 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cfw7r" Mar 18 13:22:33 crc kubenswrapper[4975]: I0318 13:22:33.349459 4975 scope.go:117] "RemoveContainer" containerID="4c08b9f78764fcc0574bff0cb549b68ba4460679399d37d093d0df381bdebb41" Mar 18 13:22:33 crc kubenswrapper[4975]: I0318 13:22:33.373898 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cfw7r"] Mar 18 13:22:33 crc kubenswrapper[4975]: I0318 13:22:33.388661 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cfw7r"] Mar 18 13:22:33 crc kubenswrapper[4975]: I0318 13:22:33.392635 4975 scope.go:117] "RemoveContainer" containerID="8b2db7c157b518b96d024b35335732c978a40c5008560a02b52c9f4fd9615bd1" Mar 18 13:22:33 crc kubenswrapper[4975]: I0318 13:22:33.420282 4975 scope.go:117] "RemoveContainer" containerID="55b856f4b5fb325e847033abef904c508cb715e6be9eeeeeaddc470614cf78fc" Mar 18 13:22:35 crc kubenswrapper[4975]: I0318 13:22:35.029005 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e" path="/var/lib/kubelet/pods/4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e/volumes" Mar 18 13:22:37 crc kubenswrapper[4975]: I0318 13:22:37.017335 4975 scope.go:117] "RemoveContainer" containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" Mar 18 13:22:37 crc kubenswrapper[4975]: E0318 13:22:37.017988 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:22:37 crc kubenswrapper[4975]: I0318 13:22:37.066774 4975 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-28tjn"] Mar 18 13:22:37 crc kubenswrapper[4975]: E0318 13:22:37.067264 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e" containerName="extract-utilities" Mar 18 13:22:37 crc kubenswrapper[4975]: I0318 13:22:37.067286 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e" containerName="extract-utilities" Mar 18 13:22:37 crc kubenswrapper[4975]: E0318 13:22:37.067317 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e" containerName="extract-content" Mar 18 13:22:37 crc kubenswrapper[4975]: I0318 13:22:37.067326 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e" containerName="extract-content" Mar 18 13:22:37 crc kubenswrapper[4975]: E0318 13:22:37.067341 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e" containerName="registry-server" Mar 18 13:22:37 crc kubenswrapper[4975]: I0318 13:22:37.067349 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e" containerName="registry-server" Mar 18 13:22:37 crc kubenswrapper[4975]: I0318 13:22:37.067574 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d04e1d5-0e0f-4379-a3b6-3d4872e88c9e" containerName="registry-server" Mar 18 13:22:37 crc kubenswrapper[4975]: I0318 13:22:37.071358 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-28tjn" Mar 18 13:22:37 crc kubenswrapper[4975]: I0318 13:22:37.080566 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-28tjn"] Mar 18 13:22:37 crc kubenswrapper[4975]: I0318 13:22:37.121736 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e1d8039-011b-4e8d-a744-dc6469618634-catalog-content\") pod \"redhat-marketplace-28tjn\" (UID: \"4e1d8039-011b-4e8d-a744-dc6469618634\") " pod="openshift-marketplace/redhat-marketplace-28tjn" Mar 18 13:22:37 crc kubenswrapper[4975]: I0318 13:22:37.121976 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zrls\" (UniqueName: \"kubernetes.io/projected/4e1d8039-011b-4e8d-a744-dc6469618634-kube-api-access-7zrls\") pod \"redhat-marketplace-28tjn\" (UID: \"4e1d8039-011b-4e8d-a744-dc6469618634\") " pod="openshift-marketplace/redhat-marketplace-28tjn" Mar 18 13:22:37 crc kubenswrapper[4975]: I0318 13:22:37.122147 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e1d8039-011b-4e8d-a744-dc6469618634-utilities\") pod \"redhat-marketplace-28tjn\" (UID: \"4e1d8039-011b-4e8d-a744-dc6469618634\") " pod="openshift-marketplace/redhat-marketplace-28tjn" Mar 18 13:22:37 crc kubenswrapper[4975]: I0318 13:22:37.224269 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e1d8039-011b-4e8d-a744-dc6469618634-utilities\") pod \"redhat-marketplace-28tjn\" (UID: \"4e1d8039-011b-4e8d-a744-dc6469618634\") " pod="openshift-marketplace/redhat-marketplace-28tjn" Mar 18 13:22:37 crc kubenswrapper[4975]: I0318 13:22:37.224458 4975 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e1d8039-011b-4e8d-a744-dc6469618634-catalog-content\") pod \"redhat-marketplace-28tjn\" (UID: \"4e1d8039-011b-4e8d-a744-dc6469618634\") " pod="openshift-marketplace/redhat-marketplace-28tjn" Mar 18 13:22:37 crc kubenswrapper[4975]: I0318 13:22:37.224588 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zrls\" (UniqueName: \"kubernetes.io/projected/4e1d8039-011b-4e8d-a744-dc6469618634-kube-api-access-7zrls\") pod \"redhat-marketplace-28tjn\" (UID: \"4e1d8039-011b-4e8d-a744-dc6469618634\") " pod="openshift-marketplace/redhat-marketplace-28tjn" Mar 18 13:22:37 crc kubenswrapper[4975]: I0318 13:22:37.224892 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e1d8039-011b-4e8d-a744-dc6469618634-utilities\") pod \"redhat-marketplace-28tjn\" (UID: \"4e1d8039-011b-4e8d-a744-dc6469618634\") " pod="openshift-marketplace/redhat-marketplace-28tjn" Mar 18 13:22:37 crc kubenswrapper[4975]: I0318 13:22:37.225151 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e1d8039-011b-4e8d-a744-dc6469618634-catalog-content\") pod \"redhat-marketplace-28tjn\" (UID: \"4e1d8039-011b-4e8d-a744-dc6469618634\") " pod="openshift-marketplace/redhat-marketplace-28tjn" Mar 18 13:22:37 crc kubenswrapper[4975]: I0318 13:22:37.463668 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zrls\" (UniqueName: \"kubernetes.io/projected/4e1d8039-011b-4e8d-a744-dc6469618634-kube-api-access-7zrls\") pod \"redhat-marketplace-28tjn\" (UID: \"4e1d8039-011b-4e8d-a744-dc6469618634\") " pod="openshift-marketplace/redhat-marketplace-28tjn" Mar 18 13:22:37 crc kubenswrapper[4975]: I0318 13:22:37.725095 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-28tjn" Mar 18 13:22:38 crc kubenswrapper[4975]: I0318 13:22:38.218312 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-28tjn"] Mar 18 13:22:38 crc kubenswrapper[4975]: I0318 13:22:38.389078 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28tjn" event={"ID":"4e1d8039-011b-4e8d-a744-dc6469618634","Type":"ContainerStarted","Data":"7c98261274e3bb9ad8e55cb5e8ae0c56da2fb16d1ca1423aa0ea0a8c0f6845c2"} Mar 18 13:22:39 crc kubenswrapper[4975]: I0318 13:22:39.402143 4975 generic.go:334] "Generic (PLEG): container finished" podID="4e1d8039-011b-4e8d-a744-dc6469618634" containerID="d8921ec5100704759d0def2d04a1344f4773412e21b3a445c39a288d6f41f50a" exitCode=0 Mar 18 13:22:39 crc kubenswrapper[4975]: I0318 13:22:39.402414 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28tjn" event={"ID":"4e1d8039-011b-4e8d-a744-dc6469618634","Type":"ContainerDied","Data":"d8921ec5100704759d0def2d04a1344f4773412e21b3a445c39a288d6f41f50a"} Mar 18 13:22:40 crc kubenswrapper[4975]: I0318 13:22:40.417744 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28tjn" event={"ID":"4e1d8039-011b-4e8d-a744-dc6469618634","Type":"ContainerStarted","Data":"45324e8caeec059ad3f0c8f521d8f3de36e96f988c56bdc3e4a4edb2bc440e7e"} Mar 18 13:22:41 crc kubenswrapper[4975]: I0318 13:22:41.430793 4975 generic.go:334] "Generic (PLEG): container finished" podID="4e1d8039-011b-4e8d-a744-dc6469618634" containerID="45324e8caeec059ad3f0c8f521d8f3de36e96f988c56bdc3e4a4edb2bc440e7e" exitCode=0 Mar 18 13:22:41 crc kubenswrapper[4975]: I0318 13:22:41.430836 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28tjn" 
event={"ID":"4e1d8039-011b-4e8d-a744-dc6469618634","Type":"ContainerDied","Data":"45324e8caeec059ad3f0c8f521d8f3de36e96f988c56bdc3e4a4edb2bc440e7e"} Mar 18 13:22:42 crc kubenswrapper[4975]: I0318 13:22:42.443531 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28tjn" event={"ID":"4e1d8039-011b-4e8d-a744-dc6469618634","Type":"ContainerStarted","Data":"536c0a32495ebec88bdf553b75e0fd8f537e91f65d14171a073dc024f9ccec20"} Mar 18 13:22:42 crc kubenswrapper[4975]: I0318 13:22:42.463530 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-28tjn" podStartSLOduration=2.708740224 podStartE2EDuration="5.463512165s" podCreationTimestamp="2026-03-18 13:22:37 +0000 UTC" firstStartedPulling="2026-03-18 13:22:39.408596858 +0000 UTC m=+4345.122997477" lastFinishedPulling="2026-03-18 13:22:42.163368849 +0000 UTC m=+4347.877769418" observedRunningTime="2026-03-18 13:22:42.457622256 +0000 UTC m=+4348.172022845" watchObservedRunningTime="2026-03-18 13:22:42.463512165 +0000 UTC m=+4348.177912744" Mar 18 13:22:47 crc kubenswrapper[4975]: I0318 13:22:47.725669 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-28tjn" Mar 18 13:22:47 crc kubenswrapper[4975]: I0318 13:22:47.726206 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-28tjn" Mar 18 13:22:47 crc kubenswrapper[4975]: I0318 13:22:47.778277 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-28tjn" Mar 18 13:22:48 crc kubenswrapper[4975]: I0318 13:22:48.558360 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-28tjn" Mar 18 13:22:48 crc kubenswrapper[4975]: I0318 13:22:48.606655 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-28tjn"] Mar 18 13:22:49 crc kubenswrapper[4975]: I0318 13:22:49.594204 4975 scope.go:117] "RemoveContainer" containerID="d826d199efabc11f7632a1d31c271bd8cd911d373d88660a1a8efb1bf8398c7e" Mar 18 13:22:50 crc kubenswrapper[4975]: I0318 13:22:50.016782 4975 scope.go:117] "RemoveContainer" containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" Mar 18 13:22:50 crc kubenswrapper[4975]: E0318 13:22:50.017057 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:22:50 crc kubenswrapper[4975]: I0318 13:22:50.525904 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-28tjn" podUID="4e1d8039-011b-4e8d-a744-dc6469618634" containerName="registry-server" containerID="cri-o://536c0a32495ebec88bdf553b75e0fd8f537e91f65d14171a073dc024f9ccec20" gracePeriod=2 Mar 18 13:22:51 crc kubenswrapper[4975]: I0318 13:22:51.012505 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-28tjn" Mar 18 13:22:51 crc kubenswrapper[4975]: I0318 13:22:51.177414 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e1d8039-011b-4e8d-a744-dc6469618634-catalog-content\") pod \"4e1d8039-011b-4e8d-a744-dc6469618634\" (UID: \"4e1d8039-011b-4e8d-a744-dc6469618634\") " Mar 18 13:22:51 crc kubenswrapper[4975]: I0318 13:22:51.177460 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zrls\" (UniqueName: \"kubernetes.io/projected/4e1d8039-011b-4e8d-a744-dc6469618634-kube-api-access-7zrls\") pod \"4e1d8039-011b-4e8d-a744-dc6469618634\" (UID: \"4e1d8039-011b-4e8d-a744-dc6469618634\") " Mar 18 13:22:51 crc kubenswrapper[4975]: I0318 13:22:51.177533 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e1d8039-011b-4e8d-a744-dc6469618634-utilities\") pod \"4e1d8039-011b-4e8d-a744-dc6469618634\" (UID: \"4e1d8039-011b-4e8d-a744-dc6469618634\") " Mar 18 13:22:51 crc kubenswrapper[4975]: I0318 13:22:51.180130 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e1d8039-011b-4e8d-a744-dc6469618634-utilities" (OuterVolumeSpecName: "utilities") pod "4e1d8039-011b-4e8d-a744-dc6469618634" (UID: "4e1d8039-011b-4e8d-a744-dc6469618634"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:22:51 crc kubenswrapper[4975]: I0318 13:22:51.186185 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e1d8039-011b-4e8d-a744-dc6469618634-kube-api-access-7zrls" (OuterVolumeSpecName: "kube-api-access-7zrls") pod "4e1d8039-011b-4e8d-a744-dc6469618634" (UID: "4e1d8039-011b-4e8d-a744-dc6469618634"). InnerVolumeSpecName "kube-api-access-7zrls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:22:51 crc kubenswrapper[4975]: I0318 13:22:51.217439 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e1d8039-011b-4e8d-a744-dc6469618634-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e1d8039-011b-4e8d-a744-dc6469618634" (UID: "4e1d8039-011b-4e8d-a744-dc6469618634"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:22:51 crc kubenswrapper[4975]: I0318 13:22:51.279937 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e1d8039-011b-4e8d-a744-dc6469618634-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:22:51 crc kubenswrapper[4975]: I0318 13:22:51.279993 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e1d8039-011b-4e8d-a744-dc6469618634-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:22:51 crc kubenswrapper[4975]: I0318 13:22:51.280018 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zrls\" (UniqueName: \"kubernetes.io/projected/4e1d8039-011b-4e8d-a744-dc6469618634-kube-api-access-7zrls\") on node \"crc\" DevicePath \"\"" Mar 18 13:22:51 crc kubenswrapper[4975]: I0318 13:22:51.535495 4975 generic.go:334] "Generic (PLEG): container finished" podID="4e1d8039-011b-4e8d-a744-dc6469618634" containerID="536c0a32495ebec88bdf553b75e0fd8f537e91f65d14171a073dc024f9ccec20" exitCode=0 Mar 18 13:22:51 crc kubenswrapper[4975]: I0318 13:22:51.535534 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28tjn" event={"ID":"4e1d8039-011b-4e8d-a744-dc6469618634","Type":"ContainerDied","Data":"536c0a32495ebec88bdf553b75e0fd8f537e91f65d14171a073dc024f9ccec20"} Mar 18 13:22:51 crc kubenswrapper[4975]: I0318 13:22:51.535560 4975 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-28tjn" event={"ID":"4e1d8039-011b-4e8d-a744-dc6469618634","Type":"ContainerDied","Data":"7c98261274e3bb9ad8e55cb5e8ae0c56da2fb16d1ca1423aa0ea0a8c0f6845c2"} Mar 18 13:22:51 crc kubenswrapper[4975]: I0318 13:22:51.535565 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-28tjn" Mar 18 13:22:51 crc kubenswrapper[4975]: I0318 13:22:51.535575 4975 scope.go:117] "RemoveContainer" containerID="536c0a32495ebec88bdf553b75e0fd8f537e91f65d14171a073dc024f9ccec20" Mar 18 13:22:51 crc kubenswrapper[4975]: I0318 13:22:51.559513 4975 scope.go:117] "RemoveContainer" containerID="45324e8caeec059ad3f0c8f521d8f3de36e96f988c56bdc3e4a4edb2bc440e7e" Mar 18 13:22:51 crc kubenswrapper[4975]: I0318 13:22:51.581263 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-28tjn"] Mar 18 13:22:51 crc kubenswrapper[4975]: I0318 13:22:51.598809 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-28tjn"] Mar 18 13:22:51 crc kubenswrapper[4975]: I0318 13:22:51.600332 4975 scope.go:117] "RemoveContainer" containerID="d8921ec5100704759d0def2d04a1344f4773412e21b3a445c39a288d6f41f50a" Mar 18 13:22:51 crc kubenswrapper[4975]: I0318 13:22:51.622531 4975 scope.go:117] "RemoveContainer" containerID="536c0a32495ebec88bdf553b75e0fd8f537e91f65d14171a073dc024f9ccec20" Mar 18 13:22:51 crc kubenswrapper[4975]: E0318 13:22:51.623088 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"536c0a32495ebec88bdf553b75e0fd8f537e91f65d14171a073dc024f9ccec20\": container with ID starting with 536c0a32495ebec88bdf553b75e0fd8f537e91f65d14171a073dc024f9ccec20 not found: ID does not exist" containerID="536c0a32495ebec88bdf553b75e0fd8f537e91f65d14171a073dc024f9ccec20" Mar 18 13:22:51 crc kubenswrapper[4975]: I0318 13:22:51.623134 4975 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"536c0a32495ebec88bdf553b75e0fd8f537e91f65d14171a073dc024f9ccec20"} err="failed to get container status \"536c0a32495ebec88bdf553b75e0fd8f537e91f65d14171a073dc024f9ccec20\": rpc error: code = NotFound desc = could not find container \"536c0a32495ebec88bdf553b75e0fd8f537e91f65d14171a073dc024f9ccec20\": container with ID starting with 536c0a32495ebec88bdf553b75e0fd8f537e91f65d14171a073dc024f9ccec20 not found: ID does not exist" Mar 18 13:22:51 crc kubenswrapper[4975]: I0318 13:22:51.623165 4975 scope.go:117] "RemoveContainer" containerID="45324e8caeec059ad3f0c8f521d8f3de36e96f988c56bdc3e4a4edb2bc440e7e" Mar 18 13:22:51 crc kubenswrapper[4975]: E0318 13:22:51.623710 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45324e8caeec059ad3f0c8f521d8f3de36e96f988c56bdc3e4a4edb2bc440e7e\": container with ID starting with 45324e8caeec059ad3f0c8f521d8f3de36e96f988c56bdc3e4a4edb2bc440e7e not found: ID does not exist" containerID="45324e8caeec059ad3f0c8f521d8f3de36e96f988c56bdc3e4a4edb2bc440e7e" Mar 18 13:22:51 crc kubenswrapper[4975]: I0318 13:22:51.623758 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45324e8caeec059ad3f0c8f521d8f3de36e96f988c56bdc3e4a4edb2bc440e7e"} err="failed to get container status \"45324e8caeec059ad3f0c8f521d8f3de36e96f988c56bdc3e4a4edb2bc440e7e\": rpc error: code = NotFound desc = could not find container \"45324e8caeec059ad3f0c8f521d8f3de36e96f988c56bdc3e4a4edb2bc440e7e\": container with ID starting with 45324e8caeec059ad3f0c8f521d8f3de36e96f988c56bdc3e4a4edb2bc440e7e not found: ID does not exist" Mar 18 13:22:51 crc kubenswrapper[4975]: I0318 13:22:51.623784 4975 scope.go:117] "RemoveContainer" containerID="d8921ec5100704759d0def2d04a1344f4773412e21b3a445c39a288d6f41f50a" Mar 18 13:22:51 crc kubenswrapper[4975]: E0318 
13:22:51.624136 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8921ec5100704759d0def2d04a1344f4773412e21b3a445c39a288d6f41f50a\": container with ID starting with d8921ec5100704759d0def2d04a1344f4773412e21b3a445c39a288d6f41f50a not found: ID does not exist" containerID="d8921ec5100704759d0def2d04a1344f4773412e21b3a445c39a288d6f41f50a" Mar 18 13:22:51 crc kubenswrapper[4975]: I0318 13:22:51.624175 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8921ec5100704759d0def2d04a1344f4773412e21b3a445c39a288d6f41f50a"} err="failed to get container status \"d8921ec5100704759d0def2d04a1344f4773412e21b3a445c39a288d6f41f50a\": rpc error: code = NotFound desc = could not find container \"d8921ec5100704759d0def2d04a1344f4773412e21b3a445c39a288d6f41f50a\": container with ID starting with d8921ec5100704759d0def2d04a1344f4773412e21b3a445c39a288d6f41f50a not found: ID does not exist" Mar 18 13:22:53 crc kubenswrapper[4975]: I0318 13:22:53.028799 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e1d8039-011b-4e8d-a744-dc6469618634" path="/var/lib/kubelet/pods/4e1d8039-011b-4e8d-a744-dc6469618634/volumes" Mar 18 13:23:01 crc kubenswrapper[4975]: I0318 13:23:01.016765 4975 scope.go:117] "RemoveContainer" containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" Mar 18 13:23:01 crc kubenswrapper[4975]: E0318 13:23:01.017445 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:23:03 crc kubenswrapper[4975]: I0318 13:23:03.247897 
4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2ccnd"] Mar 18 13:23:03 crc kubenswrapper[4975]: E0318 13:23:03.248609 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1d8039-011b-4e8d-a744-dc6469618634" containerName="extract-utilities" Mar 18 13:23:03 crc kubenswrapper[4975]: I0318 13:23:03.248622 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1d8039-011b-4e8d-a744-dc6469618634" containerName="extract-utilities" Mar 18 13:23:03 crc kubenswrapper[4975]: E0318 13:23:03.248639 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1d8039-011b-4e8d-a744-dc6469618634" containerName="extract-content" Mar 18 13:23:03 crc kubenswrapper[4975]: I0318 13:23:03.248646 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1d8039-011b-4e8d-a744-dc6469618634" containerName="extract-content" Mar 18 13:23:03 crc kubenswrapper[4975]: E0318 13:23:03.248674 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1d8039-011b-4e8d-a744-dc6469618634" containerName="registry-server" Mar 18 13:23:03 crc kubenswrapper[4975]: I0318 13:23:03.248681 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1d8039-011b-4e8d-a744-dc6469618634" containerName="registry-server" Mar 18 13:23:03 crc kubenswrapper[4975]: I0318 13:23:03.248920 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e1d8039-011b-4e8d-a744-dc6469618634" containerName="registry-server" Mar 18 13:23:03 crc kubenswrapper[4975]: I0318 13:23:03.250164 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2ccnd" Mar 18 13:23:03 crc kubenswrapper[4975]: I0318 13:23:03.259634 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2ccnd"] Mar 18 13:23:03 crc kubenswrapper[4975]: I0318 13:23:03.260629 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c24444b7-c911-43c5-ae61-44231d591491-catalog-content\") pod \"certified-operators-2ccnd\" (UID: \"c24444b7-c911-43c5-ae61-44231d591491\") " pod="openshift-marketplace/certified-operators-2ccnd" Mar 18 13:23:03 crc kubenswrapper[4975]: I0318 13:23:03.260707 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c24444b7-c911-43c5-ae61-44231d591491-utilities\") pod \"certified-operators-2ccnd\" (UID: \"c24444b7-c911-43c5-ae61-44231d591491\") " pod="openshift-marketplace/certified-operators-2ccnd" Mar 18 13:23:03 crc kubenswrapper[4975]: I0318 13:23:03.260966 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgrj6\" (UniqueName: \"kubernetes.io/projected/c24444b7-c911-43c5-ae61-44231d591491-kube-api-access-bgrj6\") pod \"certified-operators-2ccnd\" (UID: \"c24444b7-c911-43c5-ae61-44231d591491\") " pod="openshift-marketplace/certified-operators-2ccnd" Mar 18 13:23:03 crc kubenswrapper[4975]: I0318 13:23:03.363542 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgrj6\" (UniqueName: \"kubernetes.io/projected/c24444b7-c911-43c5-ae61-44231d591491-kube-api-access-bgrj6\") pod \"certified-operators-2ccnd\" (UID: \"c24444b7-c911-43c5-ae61-44231d591491\") " pod="openshift-marketplace/certified-operators-2ccnd" Mar 18 13:23:03 crc kubenswrapper[4975]: I0318 13:23:03.363651 4975 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c24444b7-c911-43c5-ae61-44231d591491-catalog-content\") pod \"certified-operators-2ccnd\" (UID: \"c24444b7-c911-43c5-ae61-44231d591491\") " pod="openshift-marketplace/certified-operators-2ccnd" Mar 18 13:23:03 crc kubenswrapper[4975]: I0318 13:23:03.363695 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c24444b7-c911-43c5-ae61-44231d591491-utilities\") pod \"certified-operators-2ccnd\" (UID: \"c24444b7-c911-43c5-ae61-44231d591491\") " pod="openshift-marketplace/certified-operators-2ccnd" Mar 18 13:23:03 crc kubenswrapper[4975]: I0318 13:23:03.365332 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c24444b7-c911-43c5-ae61-44231d591491-utilities\") pod \"certified-operators-2ccnd\" (UID: \"c24444b7-c911-43c5-ae61-44231d591491\") " pod="openshift-marketplace/certified-operators-2ccnd" Mar 18 13:23:03 crc kubenswrapper[4975]: I0318 13:23:03.365919 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c24444b7-c911-43c5-ae61-44231d591491-catalog-content\") pod \"certified-operators-2ccnd\" (UID: \"c24444b7-c911-43c5-ae61-44231d591491\") " pod="openshift-marketplace/certified-operators-2ccnd" Mar 18 13:23:03 crc kubenswrapper[4975]: I0318 13:23:03.385742 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgrj6\" (UniqueName: \"kubernetes.io/projected/c24444b7-c911-43c5-ae61-44231d591491-kube-api-access-bgrj6\") pod \"certified-operators-2ccnd\" (UID: \"c24444b7-c911-43c5-ae61-44231d591491\") " pod="openshift-marketplace/certified-operators-2ccnd" Mar 18 13:23:03 crc kubenswrapper[4975]: I0318 13:23:03.572704 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2ccnd" Mar 18 13:23:04 crc kubenswrapper[4975]: I0318 13:23:04.024784 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2ccnd"] Mar 18 13:23:04 crc kubenswrapper[4975]: I0318 13:23:04.699469 4975 generic.go:334] "Generic (PLEG): container finished" podID="c24444b7-c911-43c5-ae61-44231d591491" containerID="430d267d3f1991c17c1c5009b77743b1d35887923d56c9a03b8395f9eaece68b" exitCode=0 Mar 18 13:23:04 crc kubenswrapper[4975]: I0318 13:23:04.699514 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ccnd" event={"ID":"c24444b7-c911-43c5-ae61-44231d591491","Type":"ContainerDied","Data":"430d267d3f1991c17c1c5009b77743b1d35887923d56c9a03b8395f9eaece68b"} Mar 18 13:23:04 crc kubenswrapper[4975]: I0318 13:23:04.699540 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ccnd" event={"ID":"c24444b7-c911-43c5-ae61-44231d591491","Type":"ContainerStarted","Data":"e5b32be5047984820683484e0c6b5e766d2d0f2521050547f83a13d7f60bbef5"} Mar 18 13:23:06 crc kubenswrapper[4975]: I0318 13:23:06.717751 4975 generic.go:334] "Generic (PLEG): container finished" podID="c24444b7-c911-43c5-ae61-44231d591491" containerID="02a682339ea4b2bc6001e90ca82be2374aa71f37b20be214cb6b30fdfa168158" exitCode=0 Mar 18 13:23:06 crc kubenswrapper[4975]: I0318 13:23:06.717927 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ccnd" event={"ID":"c24444b7-c911-43c5-ae61-44231d591491","Type":"ContainerDied","Data":"02a682339ea4b2bc6001e90ca82be2374aa71f37b20be214cb6b30fdfa168158"} Mar 18 13:23:08 crc kubenswrapper[4975]: I0318 13:23:08.735083 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ccnd" 
event={"ID":"c24444b7-c911-43c5-ae61-44231d591491","Type":"ContainerStarted","Data":"d2cc115cc51105d468010777b6b1ca33d189a6c7e30e1d0ea3ff46686cdeacae"} Mar 18 13:23:08 crc kubenswrapper[4975]: I0318 13:23:08.757007 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2ccnd" podStartSLOduration=2.287417825 podStartE2EDuration="5.756982951s" podCreationTimestamp="2026-03-18 13:23:03 +0000 UTC" firstStartedPulling="2026-03-18 13:23:04.701233994 +0000 UTC m=+4370.415634573" lastFinishedPulling="2026-03-18 13:23:08.1707991 +0000 UTC m=+4373.885199699" observedRunningTime="2026-03-18 13:23:08.752411477 +0000 UTC m=+4374.466812086" watchObservedRunningTime="2026-03-18 13:23:08.756982951 +0000 UTC m=+4374.471383550" Mar 18 13:23:12 crc kubenswrapper[4975]: I0318 13:23:12.017534 4975 scope.go:117] "RemoveContainer" containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" Mar 18 13:23:12 crc kubenswrapper[4975]: E0318 13:23:12.018182 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:23:13 crc kubenswrapper[4975]: I0318 13:23:13.573723 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2ccnd" Mar 18 13:23:13 crc kubenswrapper[4975]: I0318 13:23:13.574102 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2ccnd" Mar 18 13:23:13 crc kubenswrapper[4975]: I0318 13:23:13.616638 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-2ccnd" Mar 18 13:23:13 crc kubenswrapper[4975]: I0318 13:23:13.841308 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2ccnd" Mar 18 13:23:13 crc kubenswrapper[4975]: I0318 13:23:13.887044 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2ccnd"] Mar 18 13:23:15 crc kubenswrapper[4975]: I0318 13:23:15.811568 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2ccnd" podUID="c24444b7-c911-43c5-ae61-44231d591491" containerName="registry-server" containerID="cri-o://d2cc115cc51105d468010777b6b1ca33d189a6c7e30e1d0ea3ff46686cdeacae" gracePeriod=2 Mar 18 13:23:16 crc kubenswrapper[4975]: I0318 13:23:16.286062 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2ccnd" Mar 18 13:23:16 crc kubenswrapper[4975]: I0318 13:23:16.431530 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgrj6\" (UniqueName: \"kubernetes.io/projected/c24444b7-c911-43c5-ae61-44231d591491-kube-api-access-bgrj6\") pod \"c24444b7-c911-43c5-ae61-44231d591491\" (UID: \"c24444b7-c911-43c5-ae61-44231d591491\") " Mar 18 13:23:16 crc kubenswrapper[4975]: I0318 13:23:16.432010 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c24444b7-c911-43c5-ae61-44231d591491-utilities\") pod \"c24444b7-c911-43c5-ae61-44231d591491\" (UID: \"c24444b7-c911-43c5-ae61-44231d591491\") " Mar 18 13:23:16 crc kubenswrapper[4975]: I0318 13:23:16.432366 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c24444b7-c911-43c5-ae61-44231d591491-catalog-content\") pod 
\"c24444b7-c911-43c5-ae61-44231d591491\" (UID: \"c24444b7-c911-43c5-ae61-44231d591491\") " Mar 18 13:23:16 crc kubenswrapper[4975]: I0318 13:23:16.433681 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c24444b7-c911-43c5-ae61-44231d591491-utilities" (OuterVolumeSpecName: "utilities") pod "c24444b7-c911-43c5-ae61-44231d591491" (UID: "c24444b7-c911-43c5-ae61-44231d591491"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:23:16 crc kubenswrapper[4975]: I0318 13:23:16.442368 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c24444b7-c911-43c5-ae61-44231d591491-kube-api-access-bgrj6" (OuterVolumeSpecName: "kube-api-access-bgrj6") pod "c24444b7-c911-43c5-ae61-44231d591491" (UID: "c24444b7-c911-43c5-ae61-44231d591491"). InnerVolumeSpecName "kube-api-access-bgrj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:23:16 crc kubenswrapper[4975]: I0318 13:23:16.486212 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c24444b7-c911-43c5-ae61-44231d591491-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c24444b7-c911-43c5-ae61-44231d591491" (UID: "c24444b7-c911-43c5-ae61-44231d591491"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:23:16 crc kubenswrapper[4975]: I0318 13:23:16.535103 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c24444b7-c911-43c5-ae61-44231d591491-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:23:16 crc kubenswrapper[4975]: I0318 13:23:16.535173 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgrj6\" (UniqueName: \"kubernetes.io/projected/c24444b7-c911-43c5-ae61-44231d591491-kube-api-access-bgrj6\") on node \"crc\" DevicePath \"\"" Mar 18 13:23:16 crc kubenswrapper[4975]: I0318 13:23:16.535187 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c24444b7-c911-43c5-ae61-44231d591491-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:23:16 crc kubenswrapper[4975]: I0318 13:23:16.821982 4975 generic.go:334] "Generic (PLEG): container finished" podID="c24444b7-c911-43c5-ae61-44231d591491" containerID="d2cc115cc51105d468010777b6b1ca33d189a6c7e30e1d0ea3ff46686cdeacae" exitCode=0 Mar 18 13:23:16 crc kubenswrapper[4975]: I0318 13:23:16.822057 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ccnd" event={"ID":"c24444b7-c911-43c5-ae61-44231d591491","Type":"ContainerDied","Data":"d2cc115cc51105d468010777b6b1ca33d189a6c7e30e1d0ea3ff46686cdeacae"} Mar 18 13:23:16 crc kubenswrapper[4975]: I0318 13:23:16.822092 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ccnd" event={"ID":"c24444b7-c911-43c5-ae61-44231d591491","Type":"ContainerDied","Data":"e5b32be5047984820683484e0c6b5e766d2d0f2521050547f83a13d7f60bbef5"} Mar 18 13:23:16 crc kubenswrapper[4975]: I0318 13:23:16.822125 4975 scope.go:117] "RemoveContainer" containerID="d2cc115cc51105d468010777b6b1ca33d189a6c7e30e1d0ea3ff46686cdeacae" Mar 18 13:23:16 crc kubenswrapper[4975]: I0318 
13:23:16.822291 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2ccnd" Mar 18 13:23:16 crc kubenswrapper[4975]: I0318 13:23:16.842680 4975 scope.go:117] "RemoveContainer" containerID="02a682339ea4b2bc6001e90ca82be2374aa71f37b20be214cb6b30fdfa168158" Mar 18 13:23:16 crc kubenswrapper[4975]: I0318 13:23:16.946933 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2ccnd"] Mar 18 13:23:16 crc kubenswrapper[4975]: I0318 13:23:16.952524 4975 scope.go:117] "RemoveContainer" containerID="430d267d3f1991c17c1c5009b77743b1d35887923d56c9a03b8395f9eaece68b" Mar 18 13:23:16 crc kubenswrapper[4975]: I0318 13:23:16.957252 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2ccnd"] Mar 18 13:23:16 crc kubenswrapper[4975]: I0318 13:23:16.982980 4975 scope.go:117] "RemoveContainer" containerID="d2cc115cc51105d468010777b6b1ca33d189a6c7e30e1d0ea3ff46686cdeacae" Mar 18 13:23:16 crc kubenswrapper[4975]: E0318 13:23:16.983441 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2cc115cc51105d468010777b6b1ca33d189a6c7e30e1d0ea3ff46686cdeacae\": container with ID starting with d2cc115cc51105d468010777b6b1ca33d189a6c7e30e1d0ea3ff46686cdeacae not found: ID does not exist" containerID="d2cc115cc51105d468010777b6b1ca33d189a6c7e30e1d0ea3ff46686cdeacae" Mar 18 13:23:16 crc kubenswrapper[4975]: I0318 13:23:16.983511 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2cc115cc51105d468010777b6b1ca33d189a6c7e30e1d0ea3ff46686cdeacae"} err="failed to get container status \"d2cc115cc51105d468010777b6b1ca33d189a6c7e30e1d0ea3ff46686cdeacae\": rpc error: code = NotFound desc = could not find container \"d2cc115cc51105d468010777b6b1ca33d189a6c7e30e1d0ea3ff46686cdeacae\": container with ID starting with 
d2cc115cc51105d468010777b6b1ca33d189a6c7e30e1d0ea3ff46686cdeacae not found: ID does not exist" Mar 18 13:23:16 crc kubenswrapper[4975]: I0318 13:23:16.983543 4975 scope.go:117] "RemoveContainer" containerID="02a682339ea4b2bc6001e90ca82be2374aa71f37b20be214cb6b30fdfa168158" Mar 18 13:23:16 crc kubenswrapper[4975]: E0318 13:23:16.983941 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02a682339ea4b2bc6001e90ca82be2374aa71f37b20be214cb6b30fdfa168158\": container with ID starting with 02a682339ea4b2bc6001e90ca82be2374aa71f37b20be214cb6b30fdfa168158 not found: ID does not exist" containerID="02a682339ea4b2bc6001e90ca82be2374aa71f37b20be214cb6b30fdfa168158" Mar 18 13:23:16 crc kubenswrapper[4975]: I0318 13:23:16.984004 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02a682339ea4b2bc6001e90ca82be2374aa71f37b20be214cb6b30fdfa168158"} err="failed to get container status \"02a682339ea4b2bc6001e90ca82be2374aa71f37b20be214cb6b30fdfa168158\": rpc error: code = NotFound desc = could not find container \"02a682339ea4b2bc6001e90ca82be2374aa71f37b20be214cb6b30fdfa168158\": container with ID starting with 02a682339ea4b2bc6001e90ca82be2374aa71f37b20be214cb6b30fdfa168158 not found: ID does not exist" Mar 18 13:23:16 crc kubenswrapper[4975]: I0318 13:23:16.984036 4975 scope.go:117] "RemoveContainer" containerID="430d267d3f1991c17c1c5009b77743b1d35887923d56c9a03b8395f9eaece68b" Mar 18 13:23:16 crc kubenswrapper[4975]: E0318 13:23:16.984425 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"430d267d3f1991c17c1c5009b77743b1d35887923d56c9a03b8395f9eaece68b\": container with ID starting with 430d267d3f1991c17c1c5009b77743b1d35887923d56c9a03b8395f9eaece68b not found: ID does not exist" containerID="430d267d3f1991c17c1c5009b77743b1d35887923d56c9a03b8395f9eaece68b" Mar 18 13:23:16 crc 
kubenswrapper[4975]: I0318 13:23:16.984600 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"430d267d3f1991c17c1c5009b77743b1d35887923d56c9a03b8395f9eaece68b"} err="failed to get container status \"430d267d3f1991c17c1c5009b77743b1d35887923d56c9a03b8395f9eaece68b\": rpc error: code = NotFound desc = could not find container \"430d267d3f1991c17c1c5009b77743b1d35887923d56c9a03b8395f9eaece68b\": container with ID starting with 430d267d3f1991c17c1c5009b77743b1d35887923d56c9a03b8395f9eaece68b not found: ID does not exist" Mar 18 13:23:17 crc kubenswrapper[4975]: I0318 13:23:17.029646 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c24444b7-c911-43c5-ae61-44231d591491" path="/var/lib/kubelet/pods/c24444b7-c911-43c5-ae61-44231d591491/volumes" Mar 18 13:23:26 crc kubenswrapper[4975]: I0318 13:23:26.017831 4975 scope.go:117] "RemoveContainer" containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" Mar 18 13:23:26 crc kubenswrapper[4975]: E0318 13:23:26.018671 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:23:40 crc kubenswrapper[4975]: I0318 13:23:40.017052 4975 scope.go:117] "RemoveContainer" containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" Mar 18 13:23:40 crc kubenswrapper[4975]: E0318 13:23:40.017997 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:23:44 crc kubenswrapper[4975]: I0318 13:23:44.180474 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2qnpd"] Mar 18 13:23:44 crc kubenswrapper[4975]: E0318 13:23:44.180964 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24444b7-c911-43c5-ae61-44231d591491" containerName="extract-content" Mar 18 13:23:44 crc kubenswrapper[4975]: I0318 13:23:44.180980 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24444b7-c911-43c5-ae61-44231d591491" containerName="extract-content" Mar 18 13:23:44 crc kubenswrapper[4975]: E0318 13:23:44.181016 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24444b7-c911-43c5-ae61-44231d591491" containerName="extract-utilities" Mar 18 13:23:44 crc kubenswrapper[4975]: I0318 13:23:44.181022 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24444b7-c911-43c5-ae61-44231d591491" containerName="extract-utilities" Mar 18 13:23:44 crc kubenswrapper[4975]: E0318 13:23:44.181043 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24444b7-c911-43c5-ae61-44231d591491" containerName="registry-server" Mar 18 13:23:44 crc kubenswrapper[4975]: I0318 13:23:44.181049 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24444b7-c911-43c5-ae61-44231d591491" containerName="registry-server" Mar 18 13:23:44 crc kubenswrapper[4975]: I0318 13:23:44.181259 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="c24444b7-c911-43c5-ae61-44231d591491" containerName="registry-server" Mar 18 13:23:44 crc kubenswrapper[4975]: I0318 13:23:44.182849 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2qnpd" Mar 18 13:23:44 crc kubenswrapper[4975]: I0318 13:23:44.210957 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2qnpd"] Mar 18 13:23:44 crc kubenswrapper[4975]: I0318 13:23:44.368888 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e305dff4-0550-4137-8fab-c84f5f0b59db-catalog-content\") pod \"redhat-operators-2qnpd\" (UID: \"e305dff4-0550-4137-8fab-c84f5f0b59db\") " pod="openshift-marketplace/redhat-operators-2qnpd" Mar 18 13:23:44 crc kubenswrapper[4975]: I0318 13:23:44.369078 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e305dff4-0550-4137-8fab-c84f5f0b59db-utilities\") pod \"redhat-operators-2qnpd\" (UID: \"e305dff4-0550-4137-8fab-c84f5f0b59db\") " pod="openshift-marketplace/redhat-operators-2qnpd" Mar 18 13:23:44 crc kubenswrapper[4975]: I0318 13:23:44.369124 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vnvk\" (UniqueName: \"kubernetes.io/projected/e305dff4-0550-4137-8fab-c84f5f0b59db-kube-api-access-4vnvk\") pod \"redhat-operators-2qnpd\" (UID: \"e305dff4-0550-4137-8fab-c84f5f0b59db\") " pod="openshift-marketplace/redhat-operators-2qnpd" Mar 18 13:23:44 crc kubenswrapper[4975]: I0318 13:23:44.471319 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e305dff4-0550-4137-8fab-c84f5f0b59db-catalog-content\") pod \"redhat-operators-2qnpd\" (UID: \"e305dff4-0550-4137-8fab-c84f5f0b59db\") " pod="openshift-marketplace/redhat-operators-2qnpd" Mar 18 13:23:44 crc kubenswrapper[4975]: I0318 13:23:44.471444 4975 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e305dff4-0550-4137-8fab-c84f5f0b59db-utilities\") pod \"redhat-operators-2qnpd\" (UID: \"e305dff4-0550-4137-8fab-c84f5f0b59db\") " pod="openshift-marketplace/redhat-operators-2qnpd" Mar 18 13:23:44 crc kubenswrapper[4975]: I0318 13:23:44.471469 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vnvk\" (UniqueName: \"kubernetes.io/projected/e305dff4-0550-4137-8fab-c84f5f0b59db-kube-api-access-4vnvk\") pod \"redhat-operators-2qnpd\" (UID: \"e305dff4-0550-4137-8fab-c84f5f0b59db\") " pod="openshift-marketplace/redhat-operators-2qnpd" Mar 18 13:23:44 crc kubenswrapper[4975]: I0318 13:23:44.472339 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e305dff4-0550-4137-8fab-c84f5f0b59db-catalog-content\") pod \"redhat-operators-2qnpd\" (UID: \"e305dff4-0550-4137-8fab-c84f5f0b59db\") " pod="openshift-marketplace/redhat-operators-2qnpd" Mar 18 13:23:44 crc kubenswrapper[4975]: I0318 13:23:44.472388 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e305dff4-0550-4137-8fab-c84f5f0b59db-utilities\") pod \"redhat-operators-2qnpd\" (UID: \"e305dff4-0550-4137-8fab-c84f5f0b59db\") " pod="openshift-marketplace/redhat-operators-2qnpd" Mar 18 13:23:44 crc kubenswrapper[4975]: I0318 13:23:44.492188 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vnvk\" (UniqueName: \"kubernetes.io/projected/e305dff4-0550-4137-8fab-c84f5f0b59db-kube-api-access-4vnvk\") pod \"redhat-operators-2qnpd\" (UID: \"e305dff4-0550-4137-8fab-c84f5f0b59db\") " pod="openshift-marketplace/redhat-operators-2qnpd" Mar 18 13:23:44 crc kubenswrapper[4975]: I0318 13:23:44.510181 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2qnpd" Mar 18 13:23:44 crc kubenswrapper[4975]: I0318 13:23:44.972427 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2qnpd"] Mar 18 13:23:46 crc kubenswrapper[4975]: I0318 13:23:46.099034 4975 generic.go:334] "Generic (PLEG): container finished" podID="e305dff4-0550-4137-8fab-c84f5f0b59db" containerID="6c0eecbc4324c34619fb2a28497a83a707504af6150cf1d263169102ae15469d" exitCode=0 Mar 18 13:23:46 crc kubenswrapper[4975]: I0318 13:23:46.099117 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qnpd" event={"ID":"e305dff4-0550-4137-8fab-c84f5f0b59db","Type":"ContainerDied","Data":"6c0eecbc4324c34619fb2a28497a83a707504af6150cf1d263169102ae15469d"} Mar 18 13:23:46 crc kubenswrapper[4975]: I0318 13:23:46.099315 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qnpd" event={"ID":"e305dff4-0550-4137-8fab-c84f5f0b59db","Type":"ContainerStarted","Data":"6d196fbcbcdbd434ad0a00e8af26339f09c975f41a260146fa6c4bf8d16939ae"} Mar 18 13:23:49 crc kubenswrapper[4975]: I0318 13:23:49.129078 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qnpd" event={"ID":"e305dff4-0550-4137-8fab-c84f5f0b59db","Type":"ContainerStarted","Data":"37ef2cd3b5d9a8fb8b3213c488cc405b2ff6084e22107f31ead3cffebffa8a03"} Mar 18 13:23:50 crc kubenswrapper[4975]: I0318 13:23:50.147898 4975 generic.go:334] "Generic (PLEG): container finished" podID="e305dff4-0550-4137-8fab-c84f5f0b59db" containerID="37ef2cd3b5d9a8fb8b3213c488cc405b2ff6084e22107f31ead3cffebffa8a03" exitCode=0 Mar 18 13:23:50 crc kubenswrapper[4975]: I0318 13:23:50.148040 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qnpd" 
event={"ID":"e305dff4-0550-4137-8fab-c84f5f0b59db","Type":"ContainerDied","Data":"37ef2cd3b5d9a8fb8b3213c488cc405b2ff6084e22107f31ead3cffebffa8a03"} Mar 18 13:23:51 crc kubenswrapper[4975]: I0318 13:23:51.161979 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qnpd" event={"ID":"e305dff4-0550-4137-8fab-c84f5f0b59db","Type":"ContainerStarted","Data":"d1ca5a81345f0ffdb13fee45afd4f2bbfa67178927250c3c885b10a2117ac15e"} Mar 18 13:23:51 crc kubenswrapper[4975]: I0318 13:23:51.186188 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2qnpd" podStartSLOduration=2.58137696 podStartE2EDuration="7.186139988s" podCreationTimestamp="2026-03-18 13:23:44 +0000 UTC" firstStartedPulling="2026-03-18 13:23:46.103098842 +0000 UTC m=+4411.817499441" lastFinishedPulling="2026-03-18 13:23:50.70786188 +0000 UTC m=+4416.422262469" observedRunningTime="2026-03-18 13:23:51.17917911 +0000 UTC m=+4416.893579699" watchObservedRunningTime="2026-03-18 13:23:51.186139988 +0000 UTC m=+4416.900540587" Mar 18 13:23:53 crc kubenswrapper[4975]: I0318 13:23:53.016613 4975 scope.go:117] "RemoveContainer" containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" Mar 18 13:23:53 crc kubenswrapper[4975]: E0318 13:23:53.017168 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:23:54 crc kubenswrapper[4975]: I0318 13:23:54.510698 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2qnpd" Mar 18 13:23:54 crc kubenswrapper[4975]: 
I0318 13:23:54.510814 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2qnpd" Mar 18 13:23:55 crc kubenswrapper[4975]: I0318 13:23:55.562745 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2qnpd" podUID="e305dff4-0550-4137-8fab-c84f5f0b59db" containerName="registry-server" probeResult="failure" output=< Mar 18 13:23:55 crc kubenswrapper[4975]: timeout: failed to connect service ":50051" within 1s Mar 18 13:23:55 crc kubenswrapper[4975]: > Mar 18 13:24:00 crc kubenswrapper[4975]: I0318 13:24:00.147889 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564004-xkcwz"] Mar 18 13:24:00 crc kubenswrapper[4975]: I0318 13:24:00.149496 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564004-xkcwz" Mar 18 13:24:00 crc kubenswrapper[4975]: I0318 13:24:00.153610 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:24:00 crc kubenswrapper[4975]: I0318 13:24:00.153610 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:24:00 crc kubenswrapper[4975]: I0318 13:24:00.154116 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 13:24:00 crc kubenswrapper[4975]: I0318 13:24:00.162470 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564004-xkcwz"] Mar 18 13:24:00 crc kubenswrapper[4975]: I0318 13:24:00.288839 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4jws\" (UniqueName: \"kubernetes.io/projected/731b5381-5b69-411d-accf-8b82fc97a709-kube-api-access-s4jws\") pod \"auto-csr-approver-29564004-xkcwz\" (UID: 
\"731b5381-5b69-411d-accf-8b82fc97a709\") " pod="openshift-infra/auto-csr-approver-29564004-xkcwz" Mar 18 13:24:00 crc kubenswrapper[4975]: I0318 13:24:00.391228 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4jws\" (UniqueName: \"kubernetes.io/projected/731b5381-5b69-411d-accf-8b82fc97a709-kube-api-access-s4jws\") pod \"auto-csr-approver-29564004-xkcwz\" (UID: \"731b5381-5b69-411d-accf-8b82fc97a709\") " pod="openshift-infra/auto-csr-approver-29564004-xkcwz" Mar 18 13:24:00 crc kubenswrapper[4975]: I0318 13:24:00.415745 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4jws\" (UniqueName: \"kubernetes.io/projected/731b5381-5b69-411d-accf-8b82fc97a709-kube-api-access-s4jws\") pod \"auto-csr-approver-29564004-xkcwz\" (UID: \"731b5381-5b69-411d-accf-8b82fc97a709\") " pod="openshift-infra/auto-csr-approver-29564004-xkcwz" Mar 18 13:24:00 crc kubenswrapper[4975]: I0318 13:24:00.469217 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564004-xkcwz" Mar 18 13:24:00 crc kubenswrapper[4975]: I0318 13:24:00.929916 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564004-xkcwz"] Mar 18 13:24:01 crc kubenswrapper[4975]: I0318 13:24:01.282611 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564004-xkcwz" event={"ID":"731b5381-5b69-411d-accf-8b82fc97a709","Type":"ContainerStarted","Data":"95f3cd3999de5dacd1bc92fcee2dd996b79fcbb865e83c4507c8e751663dabd3"} Mar 18 13:24:02 crc kubenswrapper[4975]: I0318 13:24:02.294333 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564004-xkcwz" event={"ID":"731b5381-5b69-411d-accf-8b82fc97a709","Type":"ContainerStarted","Data":"d1c5b3388001ed75011d071d72480c6eef23a9a6d93e1d8d5e34686a5f9c141b"} Mar 18 13:24:02 crc kubenswrapper[4975]: I0318 13:24:02.313359 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564004-xkcwz" podStartSLOduration=1.2148250919999999 podStartE2EDuration="2.313341804s" podCreationTimestamp="2026-03-18 13:24:00 +0000 UTC" firstStartedPulling="2026-03-18 13:24:00.937437732 +0000 UTC m=+4426.651838321" lastFinishedPulling="2026-03-18 13:24:02.035954454 +0000 UTC m=+4427.750355033" observedRunningTime="2026-03-18 13:24:02.308578215 +0000 UTC m=+4428.022978794" watchObservedRunningTime="2026-03-18 13:24:02.313341804 +0000 UTC m=+4428.027742383" Mar 18 13:24:03 crc kubenswrapper[4975]: I0318 13:24:03.306164 4975 generic.go:334] "Generic (PLEG): container finished" podID="731b5381-5b69-411d-accf-8b82fc97a709" containerID="d1c5b3388001ed75011d071d72480c6eef23a9a6d93e1d8d5e34686a5f9c141b" exitCode=0 Mar 18 13:24:03 crc kubenswrapper[4975]: I0318 13:24:03.306245 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564004-xkcwz" 
event={"ID":"731b5381-5b69-411d-accf-8b82fc97a709","Type":"ContainerDied","Data":"d1c5b3388001ed75011d071d72480c6eef23a9a6d93e1d8d5e34686a5f9c141b"} Mar 18 13:24:04 crc kubenswrapper[4975]: I0318 13:24:04.726394 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2qnpd" Mar 18 13:24:04 crc kubenswrapper[4975]: I0318 13:24:04.775757 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564004-xkcwz" Mar 18 13:24:04 crc kubenswrapper[4975]: I0318 13:24:04.779641 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4jws\" (UniqueName: \"kubernetes.io/projected/731b5381-5b69-411d-accf-8b82fc97a709-kube-api-access-s4jws\") pod \"731b5381-5b69-411d-accf-8b82fc97a709\" (UID: \"731b5381-5b69-411d-accf-8b82fc97a709\") " Mar 18 13:24:04 crc kubenswrapper[4975]: I0318 13:24:04.785977 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2qnpd" Mar 18 13:24:04 crc kubenswrapper[4975]: I0318 13:24:04.786643 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/731b5381-5b69-411d-accf-8b82fc97a709-kube-api-access-s4jws" (OuterVolumeSpecName: "kube-api-access-s4jws") pod "731b5381-5b69-411d-accf-8b82fc97a709" (UID: "731b5381-5b69-411d-accf-8b82fc97a709"). InnerVolumeSpecName "kube-api-access-s4jws". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:24:04 crc kubenswrapper[4975]: I0318 13:24:04.882329 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4jws\" (UniqueName: \"kubernetes.io/projected/731b5381-5b69-411d-accf-8b82fc97a709-kube-api-access-s4jws\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:04 crc kubenswrapper[4975]: I0318 13:24:04.964385 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2qnpd"] Mar 18 13:24:05 crc kubenswrapper[4975]: I0318 13:24:05.025066 4975 scope.go:117] "RemoveContainer" containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" Mar 18 13:24:05 crc kubenswrapper[4975]: E0318 13:24:05.025373 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:24:05 crc kubenswrapper[4975]: I0318 13:24:05.324118 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564004-xkcwz" event={"ID":"731b5381-5b69-411d-accf-8b82fc97a709","Type":"ContainerDied","Data":"95f3cd3999de5dacd1bc92fcee2dd996b79fcbb865e83c4507c8e751663dabd3"} Mar 18 13:24:05 crc kubenswrapper[4975]: I0318 13:24:05.324536 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95f3cd3999de5dacd1bc92fcee2dd996b79fcbb865e83c4507c8e751663dabd3" Mar 18 13:24:05 crc kubenswrapper[4975]: I0318 13:24:05.324177 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564004-xkcwz" Mar 18 13:24:05 crc kubenswrapper[4975]: I0318 13:24:05.391222 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563998-ngkj4"] Mar 18 13:24:05 crc kubenswrapper[4975]: I0318 13:24:05.400545 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563998-ngkj4"] Mar 18 13:24:06 crc kubenswrapper[4975]: I0318 13:24:06.332061 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2qnpd" podUID="e305dff4-0550-4137-8fab-c84f5f0b59db" containerName="registry-server" containerID="cri-o://d1ca5a81345f0ffdb13fee45afd4f2bbfa67178927250c3c885b10a2117ac15e" gracePeriod=2 Mar 18 13:24:06 crc kubenswrapper[4975]: I0318 13:24:06.894629 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2qnpd" Mar 18 13:24:07 crc kubenswrapper[4975]: I0318 13:24:07.022400 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e305dff4-0550-4137-8fab-c84f5f0b59db-utilities\") pod \"e305dff4-0550-4137-8fab-c84f5f0b59db\" (UID: \"e305dff4-0550-4137-8fab-c84f5f0b59db\") " Mar 18 13:24:07 crc kubenswrapper[4975]: I0318 13:24:07.022529 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e305dff4-0550-4137-8fab-c84f5f0b59db-catalog-content\") pod \"e305dff4-0550-4137-8fab-c84f5f0b59db\" (UID: \"e305dff4-0550-4137-8fab-c84f5f0b59db\") " Mar 18 13:24:07 crc kubenswrapper[4975]: I0318 13:24:07.022768 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vnvk\" (UniqueName: \"kubernetes.io/projected/e305dff4-0550-4137-8fab-c84f5f0b59db-kube-api-access-4vnvk\") pod 
\"e305dff4-0550-4137-8fab-c84f5f0b59db\" (UID: \"e305dff4-0550-4137-8fab-c84f5f0b59db\") " Mar 18 13:24:07 crc kubenswrapper[4975]: I0318 13:24:07.023403 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e305dff4-0550-4137-8fab-c84f5f0b59db-utilities" (OuterVolumeSpecName: "utilities") pod "e305dff4-0550-4137-8fab-c84f5f0b59db" (UID: "e305dff4-0550-4137-8fab-c84f5f0b59db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:24:07 crc kubenswrapper[4975]: I0318 13:24:07.028817 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad5383b1-c31d-4a26-b102-9a808885fb6a" path="/var/lib/kubelet/pods/ad5383b1-c31d-4a26-b102-9a808885fb6a/volumes" Mar 18 13:24:07 crc kubenswrapper[4975]: I0318 13:24:07.062069 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e305dff4-0550-4137-8fab-c84f5f0b59db-kube-api-access-4vnvk" (OuterVolumeSpecName: "kube-api-access-4vnvk") pod "e305dff4-0550-4137-8fab-c84f5f0b59db" (UID: "e305dff4-0550-4137-8fab-c84f5f0b59db"). InnerVolumeSpecName "kube-api-access-4vnvk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:24:07 crc kubenswrapper[4975]: I0318 13:24:07.125044 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vnvk\" (UniqueName: \"kubernetes.io/projected/e305dff4-0550-4137-8fab-c84f5f0b59db-kube-api-access-4vnvk\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:07 crc kubenswrapper[4975]: I0318 13:24:07.125102 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e305dff4-0550-4137-8fab-c84f5f0b59db-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:07 crc kubenswrapper[4975]: I0318 13:24:07.167062 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e305dff4-0550-4137-8fab-c84f5f0b59db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e305dff4-0550-4137-8fab-c84f5f0b59db" (UID: "e305dff4-0550-4137-8fab-c84f5f0b59db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:24:07 crc kubenswrapper[4975]: I0318 13:24:07.227777 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e305dff4-0550-4137-8fab-c84f5f0b59db-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:24:07 crc kubenswrapper[4975]: I0318 13:24:07.347065 4975 generic.go:334] "Generic (PLEG): container finished" podID="e305dff4-0550-4137-8fab-c84f5f0b59db" containerID="d1ca5a81345f0ffdb13fee45afd4f2bbfa67178927250c3c885b10a2117ac15e" exitCode=0 Mar 18 13:24:07 crc kubenswrapper[4975]: I0318 13:24:07.347130 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2qnpd" Mar 18 13:24:07 crc kubenswrapper[4975]: I0318 13:24:07.347128 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qnpd" event={"ID":"e305dff4-0550-4137-8fab-c84f5f0b59db","Type":"ContainerDied","Data":"d1ca5a81345f0ffdb13fee45afd4f2bbfa67178927250c3c885b10a2117ac15e"} Mar 18 13:24:07 crc kubenswrapper[4975]: I0318 13:24:07.347265 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qnpd" event={"ID":"e305dff4-0550-4137-8fab-c84f5f0b59db","Type":"ContainerDied","Data":"6d196fbcbcdbd434ad0a00e8af26339f09c975f41a260146fa6c4bf8d16939ae"} Mar 18 13:24:07 crc kubenswrapper[4975]: I0318 13:24:07.347287 4975 scope.go:117] "RemoveContainer" containerID="d1ca5a81345f0ffdb13fee45afd4f2bbfa67178927250c3c885b10a2117ac15e" Mar 18 13:24:07 crc kubenswrapper[4975]: I0318 13:24:07.375200 4975 scope.go:117] "RemoveContainer" containerID="37ef2cd3b5d9a8fb8b3213c488cc405b2ff6084e22107f31ead3cffebffa8a03" Mar 18 13:24:07 crc kubenswrapper[4975]: I0318 13:24:07.383991 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2qnpd"] Mar 18 13:24:07 crc kubenswrapper[4975]: I0318 13:24:07.394470 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2qnpd"] Mar 18 13:24:07 crc kubenswrapper[4975]: I0318 13:24:07.402270 4975 scope.go:117] "RemoveContainer" containerID="6c0eecbc4324c34619fb2a28497a83a707504af6150cf1d263169102ae15469d" Mar 18 13:24:07 crc kubenswrapper[4975]: I0318 13:24:07.439314 4975 scope.go:117] "RemoveContainer" containerID="d1ca5a81345f0ffdb13fee45afd4f2bbfa67178927250c3c885b10a2117ac15e" Mar 18 13:24:07 crc kubenswrapper[4975]: E0318 13:24:07.442360 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d1ca5a81345f0ffdb13fee45afd4f2bbfa67178927250c3c885b10a2117ac15e\": container with ID starting with d1ca5a81345f0ffdb13fee45afd4f2bbfa67178927250c3c885b10a2117ac15e not found: ID does not exist" containerID="d1ca5a81345f0ffdb13fee45afd4f2bbfa67178927250c3c885b10a2117ac15e" Mar 18 13:24:07 crc kubenswrapper[4975]: I0318 13:24:07.442409 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1ca5a81345f0ffdb13fee45afd4f2bbfa67178927250c3c885b10a2117ac15e"} err="failed to get container status \"d1ca5a81345f0ffdb13fee45afd4f2bbfa67178927250c3c885b10a2117ac15e\": rpc error: code = NotFound desc = could not find container \"d1ca5a81345f0ffdb13fee45afd4f2bbfa67178927250c3c885b10a2117ac15e\": container with ID starting with d1ca5a81345f0ffdb13fee45afd4f2bbfa67178927250c3c885b10a2117ac15e not found: ID does not exist" Mar 18 13:24:07 crc kubenswrapper[4975]: I0318 13:24:07.442439 4975 scope.go:117] "RemoveContainer" containerID="37ef2cd3b5d9a8fb8b3213c488cc405b2ff6084e22107f31ead3cffebffa8a03" Mar 18 13:24:07 crc kubenswrapper[4975]: E0318 13:24:07.443227 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37ef2cd3b5d9a8fb8b3213c488cc405b2ff6084e22107f31ead3cffebffa8a03\": container with ID starting with 37ef2cd3b5d9a8fb8b3213c488cc405b2ff6084e22107f31ead3cffebffa8a03 not found: ID does not exist" containerID="37ef2cd3b5d9a8fb8b3213c488cc405b2ff6084e22107f31ead3cffebffa8a03" Mar 18 13:24:07 crc kubenswrapper[4975]: I0318 13:24:07.443270 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37ef2cd3b5d9a8fb8b3213c488cc405b2ff6084e22107f31ead3cffebffa8a03"} err="failed to get container status \"37ef2cd3b5d9a8fb8b3213c488cc405b2ff6084e22107f31ead3cffebffa8a03\": rpc error: code = NotFound desc = could not find container \"37ef2cd3b5d9a8fb8b3213c488cc405b2ff6084e22107f31ead3cffebffa8a03\": container with ID 
starting with 37ef2cd3b5d9a8fb8b3213c488cc405b2ff6084e22107f31ead3cffebffa8a03 not found: ID does not exist" Mar 18 13:24:07 crc kubenswrapper[4975]: I0318 13:24:07.443298 4975 scope.go:117] "RemoveContainer" containerID="6c0eecbc4324c34619fb2a28497a83a707504af6150cf1d263169102ae15469d" Mar 18 13:24:07 crc kubenswrapper[4975]: E0318 13:24:07.443565 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c0eecbc4324c34619fb2a28497a83a707504af6150cf1d263169102ae15469d\": container with ID starting with 6c0eecbc4324c34619fb2a28497a83a707504af6150cf1d263169102ae15469d not found: ID does not exist" containerID="6c0eecbc4324c34619fb2a28497a83a707504af6150cf1d263169102ae15469d" Mar 18 13:24:07 crc kubenswrapper[4975]: I0318 13:24:07.443585 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c0eecbc4324c34619fb2a28497a83a707504af6150cf1d263169102ae15469d"} err="failed to get container status \"6c0eecbc4324c34619fb2a28497a83a707504af6150cf1d263169102ae15469d\": rpc error: code = NotFound desc = could not find container \"6c0eecbc4324c34619fb2a28497a83a707504af6150cf1d263169102ae15469d\": container with ID starting with 6c0eecbc4324c34619fb2a28497a83a707504af6150cf1d263169102ae15469d not found: ID does not exist" Mar 18 13:24:09 crc kubenswrapper[4975]: I0318 13:24:09.027494 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e305dff4-0550-4137-8fab-c84f5f0b59db" path="/var/lib/kubelet/pods/e305dff4-0550-4137-8fab-c84f5f0b59db/volumes" Mar 18 13:24:19 crc kubenswrapper[4975]: I0318 13:24:19.018453 4975 scope.go:117] "RemoveContainer" containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" Mar 18 13:24:19 crc kubenswrapper[4975]: E0318 13:24:19.019230 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:24:32 crc kubenswrapper[4975]: I0318 13:24:32.016696 4975 scope.go:117] "RemoveContainer" containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" Mar 18 13:24:32 crc kubenswrapper[4975]: E0318 13:24:32.017458 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:24:45 crc kubenswrapper[4975]: I0318 13:24:45.022958 4975 scope.go:117] "RemoveContainer" containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" Mar 18 13:24:45 crc kubenswrapper[4975]: E0318 13:24:45.023597 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:24:49 crc kubenswrapper[4975]: I0318 13:24:49.739393 4975 scope.go:117] "RemoveContainer" containerID="96695f0b407dc7d00c6afe4eb65c3545954c7adc260d04a852b221fab8a2a2c9" Mar 18 13:24:56 crc kubenswrapper[4975]: I0318 13:24:56.016823 4975 scope.go:117] "RemoveContainer" containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" Mar 18 13:24:56 crc 
kubenswrapper[4975]: E0318 13:24:56.017742 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:25:08 crc kubenswrapper[4975]: I0318 13:25:08.016701 4975 scope.go:117] "RemoveContainer" containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" Mar 18 13:25:08 crc kubenswrapper[4975]: E0318 13:25:08.017562 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:25:23 crc kubenswrapper[4975]: I0318 13:25:23.016511 4975 scope.go:117] "RemoveContainer" containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" Mar 18 13:25:23 crc kubenswrapper[4975]: E0318 13:25:23.017479 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:25:36 crc kubenswrapper[4975]: I0318 13:25:36.016103 4975 scope.go:117] "RemoveContainer" containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" Mar 
18 13:25:36 crc kubenswrapper[4975]: E0318 13:25:36.016825 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:25:51 crc kubenswrapper[4975]: I0318 13:25:51.017364 4975 scope.go:117] "RemoveContainer" containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" Mar 18 13:25:51 crc kubenswrapper[4975]: E0318 13:25:51.018549 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:26:00 crc kubenswrapper[4975]: I0318 13:26:00.167018 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564006-r9qgq"] Mar 18 13:26:00 crc kubenswrapper[4975]: E0318 13:26:00.168039 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e305dff4-0550-4137-8fab-c84f5f0b59db" containerName="registry-server" Mar 18 13:26:00 crc kubenswrapper[4975]: I0318 13:26:00.168055 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="e305dff4-0550-4137-8fab-c84f5f0b59db" containerName="registry-server" Mar 18 13:26:00 crc kubenswrapper[4975]: E0318 13:26:00.168076 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e305dff4-0550-4137-8fab-c84f5f0b59db" containerName="extract-utilities" Mar 18 13:26:00 crc kubenswrapper[4975]: I0318 13:26:00.168083 
4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="e305dff4-0550-4137-8fab-c84f5f0b59db" containerName="extract-utilities" Mar 18 13:26:00 crc kubenswrapper[4975]: E0318 13:26:00.168100 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e305dff4-0550-4137-8fab-c84f5f0b59db" containerName="extract-content" Mar 18 13:26:00 crc kubenswrapper[4975]: I0318 13:26:00.168106 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="e305dff4-0550-4137-8fab-c84f5f0b59db" containerName="extract-content" Mar 18 13:26:00 crc kubenswrapper[4975]: E0318 13:26:00.168118 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="731b5381-5b69-411d-accf-8b82fc97a709" containerName="oc" Mar 18 13:26:00 crc kubenswrapper[4975]: I0318 13:26:00.168123 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="731b5381-5b69-411d-accf-8b82fc97a709" containerName="oc" Mar 18 13:26:00 crc kubenswrapper[4975]: I0318 13:26:00.168327 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="731b5381-5b69-411d-accf-8b82fc97a709" containerName="oc" Mar 18 13:26:00 crc kubenswrapper[4975]: I0318 13:26:00.168343 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="e305dff4-0550-4137-8fab-c84f5f0b59db" containerName="registry-server" Mar 18 13:26:00 crc kubenswrapper[4975]: I0318 13:26:00.168994 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564006-r9qgq" Mar 18 13:26:00 crc kubenswrapper[4975]: I0318 13:26:00.171970 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:26:00 crc kubenswrapper[4975]: I0318 13:26:00.172306 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 13:26:00 crc kubenswrapper[4975]: I0318 13:26:00.172449 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:26:00 crc kubenswrapper[4975]: I0318 13:26:00.174787 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564006-r9qgq"] Mar 18 13:26:00 crc kubenswrapper[4975]: I0318 13:26:00.309704 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5g72\" (UniqueName: \"kubernetes.io/projected/5095bc61-2289-4721-b5cc-318e897da6e0-kube-api-access-g5g72\") pod \"auto-csr-approver-29564006-r9qgq\" (UID: \"5095bc61-2289-4721-b5cc-318e897da6e0\") " pod="openshift-infra/auto-csr-approver-29564006-r9qgq" Mar 18 13:26:00 crc kubenswrapper[4975]: I0318 13:26:00.411266 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5g72\" (UniqueName: \"kubernetes.io/projected/5095bc61-2289-4721-b5cc-318e897da6e0-kube-api-access-g5g72\") pod \"auto-csr-approver-29564006-r9qgq\" (UID: \"5095bc61-2289-4721-b5cc-318e897da6e0\") " pod="openshift-infra/auto-csr-approver-29564006-r9qgq" Mar 18 13:26:00 crc kubenswrapper[4975]: I0318 13:26:00.432561 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5g72\" (UniqueName: \"kubernetes.io/projected/5095bc61-2289-4721-b5cc-318e897da6e0-kube-api-access-g5g72\") pod \"auto-csr-approver-29564006-r9qgq\" (UID: \"5095bc61-2289-4721-b5cc-318e897da6e0\") " 
pod="openshift-infra/auto-csr-approver-29564006-r9qgq" Mar 18 13:26:00 crc kubenswrapper[4975]: I0318 13:26:00.488126 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564006-r9qgq" Mar 18 13:26:00 crc kubenswrapper[4975]: I0318 13:26:00.941409 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564006-r9qgq"] Mar 18 13:26:00 crc kubenswrapper[4975]: I0318 13:26:00.948487 4975 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:26:01 crc kubenswrapper[4975]: I0318 13:26:01.355211 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564006-r9qgq" event={"ID":"5095bc61-2289-4721-b5cc-318e897da6e0","Type":"ContainerStarted","Data":"a11aed9108bb4a55a8add19be0e51265c212a5e12215a71e7ee9fc8d76271147"} Mar 18 13:26:02 crc kubenswrapper[4975]: I0318 13:26:02.017404 4975 scope.go:117] "RemoveContainer" containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" Mar 18 13:26:02 crc kubenswrapper[4975]: E0318 13:26:02.017645 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:26:03 crc kubenswrapper[4975]: I0318 13:26:03.373780 4975 generic.go:334] "Generic (PLEG): container finished" podID="5095bc61-2289-4721-b5cc-318e897da6e0" containerID="3ae85f7d5defcd8a370bbf0501aca571f98959bc9b03147e8294c152e556eab4" exitCode=0 Mar 18 13:26:03 crc kubenswrapper[4975]: I0318 13:26:03.373898 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29564006-r9qgq" event={"ID":"5095bc61-2289-4721-b5cc-318e897da6e0","Type":"ContainerDied","Data":"3ae85f7d5defcd8a370bbf0501aca571f98959bc9b03147e8294c152e556eab4"} Mar 18 13:26:04 crc kubenswrapper[4975]: I0318 13:26:04.718112 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564006-r9qgq" Mar 18 13:26:04 crc kubenswrapper[4975]: I0318 13:26:04.806446 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5g72\" (UniqueName: \"kubernetes.io/projected/5095bc61-2289-4721-b5cc-318e897da6e0-kube-api-access-g5g72\") pod \"5095bc61-2289-4721-b5cc-318e897da6e0\" (UID: \"5095bc61-2289-4721-b5cc-318e897da6e0\") " Mar 18 13:26:04 crc kubenswrapper[4975]: I0318 13:26:04.811457 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5095bc61-2289-4721-b5cc-318e897da6e0-kube-api-access-g5g72" (OuterVolumeSpecName: "kube-api-access-g5g72") pod "5095bc61-2289-4721-b5cc-318e897da6e0" (UID: "5095bc61-2289-4721-b5cc-318e897da6e0"). InnerVolumeSpecName "kube-api-access-g5g72". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:26:04 crc kubenswrapper[4975]: I0318 13:26:04.908740 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5g72\" (UniqueName: \"kubernetes.io/projected/5095bc61-2289-4721-b5cc-318e897da6e0-kube-api-access-g5g72\") on node \"crc\" DevicePath \"\"" Mar 18 13:26:05 crc kubenswrapper[4975]: I0318 13:26:05.393316 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564006-r9qgq" event={"ID":"5095bc61-2289-4721-b5cc-318e897da6e0","Type":"ContainerDied","Data":"a11aed9108bb4a55a8add19be0e51265c212a5e12215a71e7ee9fc8d76271147"} Mar 18 13:26:05 crc kubenswrapper[4975]: I0318 13:26:05.393715 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a11aed9108bb4a55a8add19be0e51265c212a5e12215a71e7ee9fc8d76271147" Mar 18 13:26:05 crc kubenswrapper[4975]: I0318 13:26:05.393447 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564006-r9qgq" Mar 18 13:26:05 crc kubenswrapper[4975]: I0318 13:26:05.796379 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564000-pcfw8"] Mar 18 13:26:05 crc kubenswrapper[4975]: I0318 13:26:05.807968 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564000-pcfw8"] Mar 18 13:26:07 crc kubenswrapper[4975]: I0318 13:26:07.028436 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86b9680b-512d-497d-9b53-366daa112aa3" path="/var/lib/kubelet/pods/86b9680b-512d-497d-9b53-366daa112aa3/volumes" Mar 18 13:26:16 crc kubenswrapper[4975]: I0318 13:26:16.016801 4975 scope.go:117] "RemoveContainer" containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" Mar 18 13:26:16 crc kubenswrapper[4975]: E0318 13:26:16.017613 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:26:31 crc kubenswrapper[4975]: I0318 13:26:31.017501 4975 scope.go:117] "RemoveContainer" containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" Mar 18 13:26:31 crc kubenswrapper[4975]: E0318 13:26:31.018186 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:26:43 crc kubenswrapper[4975]: I0318 13:26:43.018195 4975 scope.go:117] "RemoveContainer" containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" Mar 18 13:26:43 crc kubenswrapper[4975]: E0318 13:26:43.019710 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:26:49 crc kubenswrapper[4975]: I0318 13:26:49.847027 4975 scope.go:117] "RemoveContainer" containerID="005f42abe3c2d6c8cea7de7fe74c6adf6de64e00397b46f167d2ee4f01d7cd68" Mar 18 13:26:58 crc kubenswrapper[4975]: I0318 13:26:58.016886 4975 scope.go:117] "RemoveContainer" 
containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" Mar 18 13:26:58 crc kubenswrapper[4975]: E0318 13:26:58.017615 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:27:09 crc kubenswrapper[4975]: I0318 13:27:09.016464 4975 scope.go:117] "RemoveContainer" containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" Mar 18 13:27:09 crc kubenswrapper[4975]: E0318 13:27:09.017318 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:27:23 crc kubenswrapper[4975]: I0318 13:27:23.016483 4975 scope.go:117] "RemoveContainer" containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" Mar 18 13:27:23 crc kubenswrapper[4975]: E0318 13:27:23.017239 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:27:34 crc kubenswrapper[4975]: I0318 13:27:34.016712 4975 scope.go:117] 
"RemoveContainer" containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" Mar 18 13:27:34 crc kubenswrapper[4975]: I0318 13:27:34.251278 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerStarted","Data":"2cf67ccc567d76aff7b36e5a10d5c9daae556eef49e5ae8a0c6bb48c935569fb"} Mar 18 13:28:00 crc kubenswrapper[4975]: I0318 13:28:00.153757 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564008-svwx5"] Mar 18 13:28:00 crc kubenswrapper[4975]: E0318 13:28:00.156795 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5095bc61-2289-4721-b5cc-318e897da6e0" containerName="oc" Mar 18 13:28:00 crc kubenswrapper[4975]: I0318 13:28:00.156831 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="5095bc61-2289-4721-b5cc-318e897da6e0" containerName="oc" Mar 18 13:28:00 crc kubenswrapper[4975]: I0318 13:28:00.157126 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="5095bc61-2289-4721-b5cc-318e897da6e0" containerName="oc" Mar 18 13:28:00 crc kubenswrapper[4975]: I0318 13:28:00.158393 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564008-svwx5" Mar 18 13:28:00 crc kubenswrapper[4975]: I0318 13:28:00.162274 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:28:00 crc kubenswrapper[4975]: I0318 13:28:00.162325 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 13:28:00 crc kubenswrapper[4975]: I0318 13:28:00.162433 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:28:00 crc kubenswrapper[4975]: I0318 13:28:00.166005 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564008-svwx5"] Mar 18 13:28:00 crc kubenswrapper[4975]: I0318 13:28:00.238252 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4xwm\" (UniqueName: \"kubernetes.io/projected/cf5c26ed-e766-4702-9958-69b7d29af32f-kube-api-access-d4xwm\") pod \"auto-csr-approver-29564008-svwx5\" (UID: \"cf5c26ed-e766-4702-9958-69b7d29af32f\") " pod="openshift-infra/auto-csr-approver-29564008-svwx5" Mar 18 13:28:00 crc kubenswrapper[4975]: I0318 13:28:00.341347 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4xwm\" (UniqueName: \"kubernetes.io/projected/cf5c26ed-e766-4702-9958-69b7d29af32f-kube-api-access-d4xwm\") pod \"auto-csr-approver-29564008-svwx5\" (UID: \"cf5c26ed-e766-4702-9958-69b7d29af32f\") " pod="openshift-infra/auto-csr-approver-29564008-svwx5" Mar 18 13:28:00 crc kubenswrapper[4975]: I0318 13:28:00.366481 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4xwm\" (UniqueName: \"kubernetes.io/projected/cf5c26ed-e766-4702-9958-69b7d29af32f-kube-api-access-d4xwm\") pod \"auto-csr-approver-29564008-svwx5\" (UID: \"cf5c26ed-e766-4702-9958-69b7d29af32f\") " 
pod="openshift-infra/auto-csr-approver-29564008-svwx5" Mar 18 13:28:00 crc kubenswrapper[4975]: I0318 13:28:00.482995 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564008-svwx5" Mar 18 13:28:00 crc kubenswrapper[4975]: I0318 13:28:00.958323 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564008-svwx5"] Mar 18 13:28:01 crc kubenswrapper[4975]: I0318 13:28:01.500992 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564008-svwx5" event={"ID":"cf5c26ed-e766-4702-9958-69b7d29af32f","Type":"ContainerStarted","Data":"7f0c766189e721c9ee1c2d1a59734d153f5ce6c505fb92f43fd78bce71532686"} Mar 18 13:28:03 crc kubenswrapper[4975]: I0318 13:28:03.520165 4975 generic.go:334] "Generic (PLEG): container finished" podID="cf5c26ed-e766-4702-9958-69b7d29af32f" containerID="cd987e02a90893894eaeb1af0cfcc7a0c8f674153c1d3e5a131ef8ceb75215b7" exitCode=0 Mar 18 13:28:03 crc kubenswrapper[4975]: I0318 13:28:03.520672 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564008-svwx5" event={"ID":"cf5c26ed-e766-4702-9958-69b7d29af32f","Type":"ContainerDied","Data":"cd987e02a90893894eaeb1af0cfcc7a0c8f674153c1d3e5a131ef8ceb75215b7"} Mar 18 13:28:04 crc kubenswrapper[4975]: I0318 13:28:04.963452 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564008-svwx5" Mar 18 13:28:05 crc kubenswrapper[4975]: I0318 13:28:05.140530 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4xwm\" (UniqueName: \"kubernetes.io/projected/cf5c26ed-e766-4702-9958-69b7d29af32f-kube-api-access-d4xwm\") pod \"cf5c26ed-e766-4702-9958-69b7d29af32f\" (UID: \"cf5c26ed-e766-4702-9958-69b7d29af32f\") " Mar 18 13:28:05 crc kubenswrapper[4975]: I0318 13:28:05.151056 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf5c26ed-e766-4702-9958-69b7d29af32f-kube-api-access-d4xwm" (OuterVolumeSpecName: "kube-api-access-d4xwm") pod "cf5c26ed-e766-4702-9958-69b7d29af32f" (UID: "cf5c26ed-e766-4702-9958-69b7d29af32f"). InnerVolumeSpecName "kube-api-access-d4xwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:28:05 crc kubenswrapper[4975]: I0318 13:28:05.243389 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4xwm\" (UniqueName: \"kubernetes.io/projected/cf5c26ed-e766-4702-9958-69b7d29af32f-kube-api-access-d4xwm\") on node \"crc\" DevicePath \"\"" Mar 18 13:28:05 crc kubenswrapper[4975]: I0318 13:28:05.540987 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564008-svwx5" Mar 18 13:28:05 crc kubenswrapper[4975]: I0318 13:28:05.541466 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564008-svwx5" event={"ID":"cf5c26ed-e766-4702-9958-69b7d29af32f","Type":"ContainerDied","Data":"7f0c766189e721c9ee1c2d1a59734d153f5ce6c505fb92f43fd78bce71532686"} Mar 18 13:28:05 crc kubenswrapper[4975]: I0318 13:28:05.541551 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f0c766189e721c9ee1c2d1a59734d153f5ce6c505fb92f43fd78bce71532686" Mar 18 13:28:06 crc kubenswrapper[4975]: I0318 13:28:06.032711 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564002-7k22c"] Mar 18 13:28:06 crc kubenswrapper[4975]: I0318 13:28:06.041825 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564002-7k22c"] Mar 18 13:28:07 crc kubenswrapper[4975]: I0318 13:28:07.029693 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5e8a646-a265-44f2-80b2-1d37edcb50b7" path="/var/lib/kubelet/pods/a5e8a646-a265-44f2-80b2-1d37edcb50b7/volumes" Mar 18 13:28:49 crc kubenswrapper[4975]: I0318 13:28:49.938970 4975 scope.go:117] "RemoveContainer" containerID="5fac66a63aefb4e57083d7aa4ad774593d6976eec2976ab2804bec85e12e0bf9" Mar 18 13:29:55 crc kubenswrapper[4975]: I0318 13:29:55.539315 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:29:55 crc kubenswrapper[4975]: I0318 13:29:55.540121 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.152145 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564010-2q2xg"] Mar 18 13:30:00 crc kubenswrapper[4975]: E0318 13:30:00.152819 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf5c26ed-e766-4702-9958-69b7d29af32f" containerName="oc" Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.152832 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf5c26ed-e766-4702-9958-69b7d29af32f" containerName="oc" Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.153070 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf5c26ed-e766-4702-9958-69b7d29af32f" containerName="oc" Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.153753 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564010-2q2xg" Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.156753 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.156803 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.157069 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.165261 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564010-2k2wv"] Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.166569 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-2k2wv" Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.168496 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.170212 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.177278 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564010-2k2wv"] Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.188028 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564010-2q2xg"] Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.342003 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5b37595-b208-45ee-a78c-be56a833e5f8-secret-volume\") pod \"collect-profiles-29564010-2k2wv\" (UID: \"f5b37595-b208-45ee-a78c-be56a833e5f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-2k2wv" Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.342065 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x5k7\" (UniqueName: \"kubernetes.io/projected/f5b37595-b208-45ee-a78c-be56a833e5f8-kube-api-access-7x5k7\") pod \"collect-profiles-29564010-2k2wv\" (UID: \"f5b37595-b208-45ee-a78c-be56a833e5f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-2k2wv" Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.342940 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfhbf\" (UniqueName: 
\"kubernetes.io/projected/da209d13-af65-4c07-8823-fcce63dc995a-kube-api-access-xfhbf\") pod \"auto-csr-approver-29564010-2q2xg\" (UID: \"da209d13-af65-4c07-8823-fcce63dc995a\") " pod="openshift-infra/auto-csr-approver-29564010-2q2xg" Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.343077 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5b37595-b208-45ee-a78c-be56a833e5f8-config-volume\") pod \"collect-profiles-29564010-2k2wv\" (UID: \"f5b37595-b208-45ee-a78c-be56a833e5f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-2k2wv" Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.444468 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfhbf\" (UniqueName: \"kubernetes.io/projected/da209d13-af65-4c07-8823-fcce63dc995a-kube-api-access-xfhbf\") pod \"auto-csr-approver-29564010-2q2xg\" (UID: \"da209d13-af65-4c07-8823-fcce63dc995a\") " pod="openshift-infra/auto-csr-approver-29564010-2q2xg" Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.444526 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5b37595-b208-45ee-a78c-be56a833e5f8-config-volume\") pod \"collect-profiles-29564010-2k2wv\" (UID: \"f5b37595-b208-45ee-a78c-be56a833e5f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-2k2wv" Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.444571 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5b37595-b208-45ee-a78c-be56a833e5f8-secret-volume\") pod \"collect-profiles-29564010-2k2wv\" (UID: \"f5b37595-b208-45ee-a78c-be56a833e5f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-2k2wv" Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.444603 4975 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x5k7\" (UniqueName: \"kubernetes.io/projected/f5b37595-b208-45ee-a78c-be56a833e5f8-kube-api-access-7x5k7\") pod \"collect-profiles-29564010-2k2wv\" (UID: \"f5b37595-b208-45ee-a78c-be56a833e5f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-2k2wv" Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.445851 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5b37595-b208-45ee-a78c-be56a833e5f8-config-volume\") pod \"collect-profiles-29564010-2k2wv\" (UID: \"f5b37595-b208-45ee-a78c-be56a833e5f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-2k2wv" Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.451622 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5b37595-b208-45ee-a78c-be56a833e5f8-secret-volume\") pod \"collect-profiles-29564010-2k2wv\" (UID: \"f5b37595-b208-45ee-a78c-be56a833e5f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-2k2wv" Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.462391 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfhbf\" (UniqueName: \"kubernetes.io/projected/da209d13-af65-4c07-8823-fcce63dc995a-kube-api-access-xfhbf\") pod \"auto-csr-approver-29564010-2q2xg\" (UID: \"da209d13-af65-4c07-8823-fcce63dc995a\") " pod="openshift-infra/auto-csr-approver-29564010-2q2xg" Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.463249 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x5k7\" (UniqueName: \"kubernetes.io/projected/f5b37595-b208-45ee-a78c-be56a833e5f8-kube-api-access-7x5k7\") pod \"collect-profiles-29564010-2k2wv\" (UID: \"f5b37595-b208-45ee-a78c-be56a833e5f8\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-2k2wv" Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.474143 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564010-2q2xg" Mar 18 13:30:00 crc kubenswrapper[4975]: I0318 13:30:00.491272 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-2k2wv" Mar 18 13:30:01 crc kubenswrapper[4975]: I0318 13:30:01.089376 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564010-2q2xg"] Mar 18 13:30:01 crc kubenswrapper[4975]: I0318 13:30:01.196398 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564010-2k2wv"] Mar 18 13:30:01 crc kubenswrapper[4975]: W0318 13:30:01.201630 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5b37595_b208_45ee_a78c_be56a833e5f8.slice/crio-1ae5b3b2c6cb7834097371fbc02c5184f125c26a2283973bd6d6780da035e58c WatchSource:0}: Error finding container 1ae5b3b2c6cb7834097371fbc02c5184f125c26a2283973bd6d6780da035e58c: Status 404 returned error can't find the container with id 1ae5b3b2c6cb7834097371fbc02c5184f125c26a2283973bd6d6780da035e58c Mar 18 13:30:01 crc kubenswrapper[4975]: I0318 13:30:01.587080 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564010-2q2xg" event={"ID":"da209d13-af65-4c07-8823-fcce63dc995a","Type":"ContainerStarted","Data":"93437449d6aa317f69002c1a5187784e61766227db3a16ee2377329103b23462"} Mar 18 13:30:01 crc kubenswrapper[4975]: I0318 13:30:01.589063 4975 generic.go:334] "Generic (PLEG): container finished" podID="f5b37595-b208-45ee-a78c-be56a833e5f8" containerID="5c5adb5306591922fe5bf36d83fd2609e79cbf4754efe777d6ba3e58e5c1adc0" exitCode=0 Mar 18 13:30:01 crc 
kubenswrapper[4975]: I0318 13:30:01.589088 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-2k2wv" event={"ID":"f5b37595-b208-45ee-a78c-be56a833e5f8","Type":"ContainerDied","Data":"5c5adb5306591922fe5bf36d83fd2609e79cbf4754efe777d6ba3e58e5c1adc0"} Mar 18 13:30:01 crc kubenswrapper[4975]: I0318 13:30:01.589102 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-2k2wv" event={"ID":"f5b37595-b208-45ee-a78c-be56a833e5f8","Type":"ContainerStarted","Data":"1ae5b3b2c6cb7834097371fbc02c5184f125c26a2283973bd6d6780da035e58c"} Mar 18 13:30:02 crc kubenswrapper[4975]: I0318 13:30:02.923690 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-2k2wv" Mar 18 13:30:03 crc kubenswrapper[4975]: I0318 13:30:03.088511 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5b37595-b208-45ee-a78c-be56a833e5f8-secret-volume\") pod \"f5b37595-b208-45ee-a78c-be56a833e5f8\" (UID: \"f5b37595-b208-45ee-a78c-be56a833e5f8\") " Mar 18 13:30:03 crc kubenswrapper[4975]: I0318 13:30:03.088636 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x5k7\" (UniqueName: \"kubernetes.io/projected/f5b37595-b208-45ee-a78c-be56a833e5f8-kube-api-access-7x5k7\") pod \"f5b37595-b208-45ee-a78c-be56a833e5f8\" (UID: \"f5b37595-b208-45ee-a78c-be56a833e5f8\") " Mar 18 13:30:03 crc kubenswrapper[4975]: I0318 13:30:03.088785 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5b37595-b208-45ee-a78c-be56a833e5f8-config-volume\") pod \"f5b37595-b208-45ee-a78c-be56a833e5f8\" (UID: \"f5b37595-b208-45ee-a78c-be56a833e5f8\") " Mar 18 13:30:03 crc kubenswrapper[4975]: 
I0318 13:30:03.089902 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5b37595-b208-45ee-a78c-be56a833e5f8-config-volume" (OuterVolumeSpecName: "config-volume") pod "f5b37595-b208-45ee-a78c-be56a833e5f8" (UID: "f5b37595-b208-45ee-a78c-be56a833e5f8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:30:03 crc kubenswrapper[4975]: I0318 13:30:03.096125 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5b37595-b208-45ee-a78c-be56a833e5f8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f5b37595-b208-45ee-a78c-be56a833e5f8" (UID: "f5b37595-b208-45ee-a78c-be56a833e5f8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:30:03 crc kubenswrapper[4975]: I0318 13:30:03.101167 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5b37595-b208-45ee-a78c-be56a833e5f8-kube-api-access-7x5k7" (OuterVolumeSpecName: "kube-api-access-7x5k7") pod "f5b37595-b208-45ee-a78c-be56a833e5f8" (UID: "f5b37595-b208-45ee-a78c-be56a833e5f8"). InnerVolumeSpecName "kube-api-access-7x5k7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:30:03 crc kubenswrapper[4975]: I0318 13:30:03.192329 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x5k7\" (UniqueName: \"kubernetes.io/projected/f5b37595-b208-45ee-a78c-be56a833e5f8-kube-api-access-7x5k7\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:03 crc kubenswrapper[4975]: I0318 13:30:03.192369 4975 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5b37595-b208-45ee-a78c-be56a833e5f8-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:03 crc kubenswrapper[4975]: I0318 13:30:03.192382 4975 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5b37595-b208-45ee-a78c-be56a833e5f8-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:03 crc kubenswrapper[4975]: I0318 13:30:03.606697 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-2k2wv" event={"ID":"f5b37595-b208-45ee-a78c-be56a833e5f8","Type":"ContainerDied","Data":"1ae5b3b2c6cb7834097371fbc02c5184f125c26a2283973bd6d6780da035e58c"} Mar 18 13:30:03 crc kubenswrapper[4975]: I0318 13:30:03.606770 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564010-2k2wv" Mar 18 13:30:03 crc kubenswrapper[4975]: I0318 13:30:03.606811 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ae5b3b2c6cb7834097371fbc02c5184f125c26a2283973bd6d6780da035e58c" Mar 18 13:30:04 crc kubenswrapper[4975]: I0318 13:30:04.005289 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563965-qlfwz"] Mar 18 13:30:04 crc kubenswrapper[4975]: I0318 13:30:04.014228 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563965-qlfwz"] Mar 18 13:30:05 crc kubenswrapper[4975]: I0318 13:30:05.029189 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3c91b58-13e4-4d0a-93e8-0ec185187b3a" path="/var/lib/kubelet/pods/f3c91b58-13e4-4d0a-93e8-0ec185187b3a/volumes" Mar 18 13:30:05 crc kubenswrapper[4975]: I0318 13:30:05.623959 4975 generic.go:334] "Generic (PLEG): container finished" podID="da209d13-af65-4c07-8823-fcce63dc995a" containerID="9c8f6fe8823b8122357b312374376a51162d499f12728901d20dcef9bc9e3690" exitCode=0 Mar 18 13:30:05 crc kubenswrapper[4975]: I0318 13:30:05.624024 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564010-2q2xg" event={"ID":"da209d13-af65-4c07-8823-fcce63dc995a","Type":"ContainerDied","Data":"9c8f6fe8823b8122357b312374376a51162d499f12728901d20dcef9bc9e3690"} Mar 18 13:30:06 crc kubenswrapper[4975]: I0318 13:30:06.951967 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564010-2q2xg" Mar 18 13:30:07 crc kubenswrapper[4975]: I0318 13:30:07.058542 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfhbf\" (UniqueName: \"kubernetes.io/projected/da209d13-af65-4c07-8823-fcce63dc995a-kube-api-access-xfhbf\") pod \"da209d13-af65-4c07-8823-fcce63dc995a\" (UID: \"da209d13-af65-4c07-8823-fcce63dc995a\") " Mar 18 13:30:07 crc kubenswrapper[4975]: I0318 13:30:07.064209 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da209d13-af65-4c07-8823-fcce63dc995a-kube-api-access-xfhbf" (OuterVolumeSpecName: "kube-api-access-xfhbf") pod "da209d13-af65-4c07-8823-fcce63dc995a" (UID: "da209d13-af65-4c07-8823-fcce63dc995a"). InnerVolumeSpecName "kube-api-access-xfhbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:30:07 crc kubenswrapper[4975]: I0318 13:30:07.160569 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfhbf\" (UniqueName: \"kubernetes.io/projected/da209d13-af65-4c07-8823-fcce63dc995a-kube-api-access-xfhbf\") on node \"crc\" DevicePath \"\"" Mar 18 13:30:07 crc kubenswrapper[4975]: I0318 13:30:07.640288 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564010-2q2xg" event={"ID":"da209d13-af65-4c07-8823-fcce63dc995a","Type":"ContainerDied","Data":"93437449d6aa317f69002c1a5187784e61766227db3a16ee2377329103b23462"} Mar 18 13:30:07 crc kubenswrapper[4975]: I0318 13:30:07.640641 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93437449d6aa317f69002c1a5187784e61766227db3a16ee2377329103b23462" Mar 18 13:30:07 crc kubenswrapper[4975]: I0318 13:30:07.640341 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564010-2q2xg" Mar 18 13:30:08 crc kubenswrapper[4975]: I0318 13:30:08.008783 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564004-xkcwz"] Mar 18 13:30:08 crc kubenswrapper[4975]: I0318 13:30:08.018098 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564004-xkcwz"] Mar 18 13:30:09 crc kubenswrapper[4975]: I0318 13:30:09.030521 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="731b5381-5b69-411d-accf-8b82fc97a709" path="/var/lib/kubelet/pods/731b5381-5b69-411d-accf-8b82fc97a709/volumes" Mar 18 13:30:25 crc kubenswrapper[4975]: I0318 13:30:25.538945 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:30:25 crc kubenswrapper[4975]: I0318 13:30:25.539727 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:30:50 crc kubenswrapper[4975]: I0318 13:30:50.027229 4975 scope.go:117] "RemoveContainer" containerID="d1c5b3388001ed75011d071d72480c6eef23a9a6d93e1d8d5e34686a5f9c141b" Mar 18 13:30:50 crc kubenswrapper[4975]: I0318 13:30:50.069173 4975 scope.go:117] "RemoveContainer" containerID="39714d1784f3a47619d89f33efc5052d817d40dc0e83dfc0694859180108accd" Mar 18 13:30:55 crc kubenswrapper[4975]: I0318 13:30:55.538674 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:30:55 crc kubenswrapper[4975]: I0318 13:30:55.539109 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:30:55 crc kubenswrapper[4975]: I0318 13:30:55.539165 4975 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 13:30:55 crc kubenswrapper[4975]: I0318 13:30:55.539979 4975 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2cf67ccc567d76aff7b36e5a10d5c9daae556eef49e5ae8a0c6bb48c935569fb"} pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:30:55 crc kubenswrapper[4975]: I0318 13:30:55.540039 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" containerID="cri-o://2cf67ccc567d76aff7b36e5a10d5c9daae556eef49e5ae8a0c6bb48c935569fb" gracePeriod=600 Mar 18 13:30:56 crc kubenswrapper[4975]: I0318 13:30:56.126016 4975 generic.go:334] "Generic (PLEG): container finished" podID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerID="2cf67ccc567d76aff7b36e5a10d5c9daae556eef49e5ae8a0c6bb48c935569fb" exitCode=0 Mar 18 13:30:56 crc kubenswrapper[4975]: I0318 13:30:56.126085 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerDied","Data":"2cf67ccc567d76aff7b36e5a10d5c9daae556eef49e5ae8a0c6bb48c935569fb"} Mar 18 13:30:56 crc kubenswrapper[4975]: I0318 13:30:56.126724 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerStarted","Data":"e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519"} Mar 18 13:30:56 crc kubenswrapper[4975]: I0318 13:30:56.126748 4975 scope.go:117] "RemoveContainer" containerID="d7b1b3a9b86c245c320d7dc73884ffacb0c8809cbd06da0c5d1d70043ed0eaea" Mar 18 13:32:00 crc kubenswrapper[4975]: I0318 13:32:00.143792 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564012-czjk6"] Mar 18 13:32:00 crc kubenswrapper[4975]: E0318 13:32:00.144657 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5b37595-b208-45ee-a78c-be56a833e5f8" containerName="collect-profiles" Mar 18 13:32:00 crc kubenswrapper[4975]: I0318 13:32:00.144669 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5b37595-b208-45ee-a78c-be56a833e5f8" containerName="collect-profiles" Mar 18 13:32:00 crc kubenswrapper[4975]: E0318 13:32:00.144694 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da209d13-af65-4c07-8823-fcce63dc995a" containerName="oc" Mar 18 13:32:00 crc kubenswrapper[4975]: I0318 13:32:00.144700 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="da209d13-af65-4c07-8823-fcce63dc995a" containerName="oc" Mar 18 13:32:00 crc kubenswrapper[4975]: I0318 13:32:00.144917 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="da209d13-af65-4c07-8823-fcce63dc995a" containerName="oc" Mar 18 13:32:00 crc kubenswrapper[4975]: I0318 13:32:00.144945 4975 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f5b37595-b208-45ee-a78c-be56a833e5f8" containerName="collect-profiles" Mar 18 13:32:00 crc kubenswrapper[4975]: I0318 13:32:00.145646 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564012-czjk6" Mar 18 13:32:00 crc kubenswrapper[4975]: I0318 13:32:00.152028 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:32:00 crc kubenswrapper[4975]: I0318 13:32:00.152248 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 13:32:00 crc kubenswrapper[4975]: I0318 13:32:00.152549 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:32:00 crc kubenswrapper[4975]: I0318 13:32:00.162170 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564012-czjk6"] Mar 18 13:32:00 crc kubenswrapper[4975]: I0318 13:32:00.332057 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qq6z\" (UniqueName: \"kubernetes.io/projected/e17f683f-43ee-45bf-9e94-5f550c2d8449-kube-api-access-4qq6z\") pod \"auto-csr-approver-29564012-czjk6\" (UID: \"e17f683f-43ee-45bf-9e94-5f550c2d8449\") " pod="openshift-infra/auto-csr-approver-29564012-czjk6" Mar 18 13:32:00 crc kubenswrapper[4975]: I0318 13:32:00.434202 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qq6z\" (UniqueName: \"kubernetes.io/projected/e17f683f-43ee-45bf-9e94-5f550c2d8449-kube-api-access-4qq6z\") pod \"auto-csr-approver-29564012-czjk6\" (UID: \"e17f683f-43ee-45bf-9e94-5f550c2d8449\") " pod="openshift-infra/auto-csr-approver-29564012-czjk6" Mar 18 13:32:00 crc kubenswrapper[4975]: I0318 13:32:00.454198 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qq6z\" 
(UniqueName: \"kubernetes.io/projected/e17f683f-43ee-45bf-9e94-5f550c2d8449-kube-api-access-4qq6z\") pod \"auto-csr-approver-29564012-czjk6\" (UID: \"e17f683f-43ee-45bf-9e94-5f550c2d8449\") " pod="openshift-infra/auto-csr-approver-29564012-czjk6" Mar 18 13:32:00 crc kubenswrapper[4975]: I0318 13:32:00.467313 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564012-czjk6" Mar 18 13:32:00 crc kubenswrapper[4975]: I0318 13:32:00.946210 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564012-czjk6"] Mar 18 13:32:00 crc kubenswrapper[4975]: I0318 13:32:00.956624 4975 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:32:01 crc kubenswrapper[4975]: I0318 13:32:01.676064 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564012-czjk6" event={"ID":"e17f683f-43ee-45bf-9e94-5f550c2d8449","Type":"ContainerStarted","Data":"6ab3f689186c8a2369e474df701e1c0f34391f0338efa903415bb9fc9af6e012"} Mar 18 13:32:02 crc kubenswrapper[4975]: I0318 13:32:02.688051 4975 generic.go:334] "Generic (PLEG): container finished" podID="e17f683f-43ee-45bf-9e94-5f550c2d8449" containerID="0c0e58f0585a4ab9a189c65a28ffd44118e9fd1bd2ba9494af1f4b003c11cdca" exitCode=0 Mar 18 13:32:02 crc kubenswrapper[4975]: I0318 13:32:02.688114 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564012-czjk6" event={"ID":"e17f683f-43ee-45bf-9e94-5f550c2d8449","Type":"ContainerDied","Data":"0c0e58f0585a4ab9a189c65a28ffd44118e9fd1bd2ba9494af1f4b003c11cdca"} Mar 18 13:32:04 crc kubenswrapper[4975]: I0318 13:32:04.159759 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564012-czjk6" Mar 18 13:32:04 crc kubenswrapper[4975]: I0318 13:32:04.313022 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qq6z\" (UniqueName: \"kubernetes.io/projected/e17f683f-43ee-45bf-9e94-5f550c2d8449-kube-api-access-4qq6z\") pod \"e17f683f-43ee-45bf-9e94-5f550c2d8449\" (UID: \"e17f683f-43ee-45bf-9e94-5f550c2d8449\") " Mar 18 13:32:04 crc kubenswrapper[4975]: I0318 13:32:04.318796 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e17f683f-43ee-45bf-9e94-5f550c2d8449-kube-api-access-4qq6z" (OuterVolumeSpecName: "kube-api-access-4qq6z") pod "e17f683f-43ee-45bf-9e94-5f550c2d8449" (UID: "e17f683f-43ee-45bf-9e94-5f550c2d8449"). InnerVolumeSpecName "kube-api-access-4qq6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:32:04 crc kubenswrapper[4975]: I0318 13:32:04.415133 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qq6z\" (UniqueName: \"kubernetes.io/projected/e17f683f-43ee-45bf-9e94-5f550c2d8449-kube-api-access-4qq6z\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:04 crc kubenswrapper[4975]: I0318 13:32:04.707583 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564012-czjk6" event={"ID":"e17f683f-43ee-45bf-9e94-5f550c2d8449","Type":"ContainerDied","Data":"6ab3f689186c8a2369e474df701e1c0f34391f0338efa903415bb9fc9af6e012"} Mar 18 13:32:04 crc kubenswrapper[4975]: I0318 13:32:04.707880 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ab3f689186c8a2369e474df701e1c0f34391f0338efa903415bb9fc9af6e012" Mar 18 13:32:04 crc kubenswrapper[4975]: I0318 13:32:04.707677 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564012-czjk6" Mar 18 13:32:05 crc kubenswrapper[4975]: I0318 13:32:05.238204 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564006-r9qgq"] Mar 18 13:32:05 crc kubenswrapper[4975]: I0318 13:32:05.246011 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564006-r9qgq"] Mar 18 13:32:07 crc kubenswrapper[4975]: I0318 13:32:07.028581 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5095bc61-2289-4721-b5cc-318e897da6e0" path="/var/lib/kubelet/pods/5095bc61-2289-4721-b5cc-318e897da6e0/volumes" Mar 18 13:32:22 crc kubenswrapper[4975]: I0318 13:32:22.520090 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v2kpb"] Mar 18 13:32:22 crc kubenswrapper[4975]: E0318 13:32:22.521128 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e17f683f-43ee-45bf-9e94-5f550c2d8449" containerName="oc" Mar 18 13:32:22 crc kubenswrapper[4975]: I0318 13:32:22.521146 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17f683f-43ee-45bf-9e94-5f550c2d8449" containerName="oc" Mar 18 13:32:22 crc kubenswrapper[4975]: I0318 13:32:22.521386 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="e17f683f-43ee-45bf-9e94-5f550c2d8449" containerName="oc" Mar 18 13:32:22 crc kubenswrapper[4975]: I0318 13:32:22.523007 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v2kpb" Mar 18 13:32:22 crc kubenswrapper[4975]: I0318 13:32:22.533189 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v2kpb"] Mar 18 13:32:22 crc kubenswrapper[4975]: I0318 13:32:22.645994 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26441577-4d0c-4124-a970-72aa6ae0648a-catalog-content\") pod \"community-operators-v2kpb\" (UID: \"26441577-4d0c-4124-a970-72aa6ae0648a\") " pod="openshift-marketplace/community-operators-v2kpb" Mar 18 13:32:22 crc kubenswrapper[4975]: I0318 13:32:22.646450 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26441577-4d0c-4124-a970-72aa6ae0648a-utilities\") pod \"community-operators-v2kpb\" (UID: \"26441577-4d0c-4124-a970-72aa6ae0648a\") " pod="openshift-marketplace/community-operators-v2kpb" Mar 18 13:32:22 crc kubenswrapper[4975]: I0318 13:32:22.646477 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxr2s\" (UniqueName: \"kubernetes.io/projected/26441577-4d0c-4124-a970-72aa6ae0648a-kube-api-access-kxr2s\") pod \"community-operators-v2kpb\" (UID: \"26441577-4d0c-4124-a970-72aa6ae0648a\") " pod="openshift-marketplace/community-operators-v2kpb" Mar 18 13:32:22 crc kubenswrapper[4975]: I0318 13:32:22.749253 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26441577-4d0c-4124-a970-72aa6ae0648a-catalog-content\") pod \"community-operators-v2kpb\" (UID: \"26441577-4d0c-4124-a970-72aa6ae0648a\") " pod="openshift-marketplace/community-operators-v2kpb" Mar 18 13:32:22 crc kubenswrapper[4975]: I0318 13:32:22.749360 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26441577-4d0c-4124-a970-72aa6ae0648a-utilities\") pod \"community-operators-v2kpb\" (UID: \"26441577-4d0c-4124-a970-72aa6ae0648a\") " pod="openshift-marketplace/community-operators-v2kpb" Mar 18 13:32:22 crc kubenswrapper[4975]: I0318 13:32:22.749388 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxr2s\" (UniqueName: \"kubernetes.io/projected/26441577-4d0c-4124-a970-72aa6ae0648a-kube-api-access-kxr2s\") pod \"community-operators-v2kpb\" (UID: \"26441577-4d0c-4124-a970-72aa6ae0648a\") " pod="openshift-marketplace/community-operators-v2kpb" Mar 18 13:32:22 crc kubenswrapper[4975]: I0318 13:32:22.749787 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26441577-4d0c-4124-a970-72aa6ae0648a-catalog-content\") pod \"community-operators-v2kpb\" (UID: \"26441577-4d0c-4124-a970-72aa6ae0648a\") " pod="openshift-marketplace/community-operators-v2kpb" Mar 18 13:32:22 crc kubenswrapper[4975]: I0318 13:32:22.750125 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26441577-4d0c-4124-a970-72aa6ae0648a-utilities\") pod \"community-operators-v2kpb\" (UID: \"26441577-4d0c-4124-a970-72aa6ae0648a\") " pod="openshift-marketplace/community-operators-v2kpb" Mar 18 13:32:23 crc kubenswrapper[4975]: I0318 13:32:23.175347 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxr2s\" (UniqueName: \"kubernetes.io/projected/26441577-4d0c-4124-a970-72aa6ae0648a-kube-api-access-kxr2s\") pod \"community-operators-v2kpb\" (UID: \"26441577-4d0c-4124-a970-72aa6ae0648a\") " pod="openshift-marketplace/community-operators-v2kpb" Mar 18 13:32:23 crc kubenswrapper[4975]: I0318 13:32:23.454445 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v2kpb" Mar 18 13:32:23 crc kubenswrapper[4975]: I0318 13:32:23.951424 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v2kpb"] Mar 18 13:32:24 crc kubenswrapper[4975]: I0318 13:32:24.894755 4975 generic.go:334] "Generic (PLEG): container finished" podID="26441577-4d0c-4124-a970-72aa6ae0648a" containerID="9d810eeda237c15e6097a2b2d0b4f6cb6a7dd541153f8ca6a551729cac8002bf" exitCode=0 Mar 18 13:32:24 crc kubenswrapper[4975]: I0318 13:32:24.894793 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v2kpb" event={"ID":"26441577-4d0c-4124-a970-72aa6ae0648a","Type":"ContainerDied","Data":"9d810eeda237c15e6097a2b2d0b4f6cb6a7dd541153f8ca6a551729cac8002bf"} Mar 18 13:32:24 crc kubenswrapper[4975]: I0318 13:32:24.895413 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v2kpb" event={"ID":"26441577-4d0c-4124-a970-72aa6ae0648a","Type":"ContainerStarted","Data":"9839cab60d70a4299a6b2e6a4f21902de677c6713a15ea436cac015cb0580f08"} Mar 18 13:32:25 crc kubenswrapper[4975]: I0318 13:32:25.905213 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v2kpb" event={"ID":"26441577-4d0c-4124-a970-72aa6ae0648a","Type":"ContainerStarted","Data":"71872f45f12a5fe7a154abe7e3595ba5a41276824effc1074b86c7dd41740761"} Mar 18 13:32:26 crc kubenswrapper[4975]: I0318 13:32:26.914839 4975 generic.go:334] "Generic (PLEG): container finished" podID="26441577-4d0c-4124-a970-72aa6ae0648a" containerID="71872f45f12a5fe7a154abe7e3595ba5a41276824effc1074b86c7dd41740761" exitCode=0 Mar 18 13:32:26 crc kubenswrapper[4975]: I0318 13:32:26.914902 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v2kpb" 
event={"ID":"26441577-4d0c-4124-a970-72aa6ae0648a","Type":"ContainerDied","Data":"71872f45f12a5fe7a154abe7e3595ba5a41276824effc1074b86c7dd41740761"} Mar 18 13:32:27 crc kubenswrapper[4975]: I0318 13:32:27.951924 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v2kpb" event={"ID":"26441577-4d0c-4124-a970-72aa6ae0648a","Type":"ContainerStarted","Data":"047da252aedb2be24732f640dcb610e5fb765903079031e8f9ff660302810715"} Mar 18 13:32:27 crc kubenswrapper[4975]: I0318 13:32:27.979475 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v2kpb" podStartSLOduration=3.440688663 podStartE2EDuration="5.979423276s" podCreationTimestamp="2026-03-18 13:32:22 +0000 UTC" firstStartedPulling="2026-03-18 13:32:24.896622863 +0000 UTC m=+4930.611023442" lastFinishedPulling="2026-03-18 13:32:27.435357426 +0000 UTC m=+4933.149758055" observedRunningTime="2026-03-18 13:32:27.971379248 +0000 UTC m=+4933.685779827" watchObservedRunningTime="2026-03-18 13:32:27.979423276 +0000 UTC m=+4933.693823865" Mar 18 13:32:33 crc kubenswrapper[4975]: I0318 13:32:33.454775 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v2kpb" Mar 18 13:32:33 crc kubenswrapper[4975]: I0318 13:32:33.455310 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v2kpb" Mar 18 13:32:33 crc kubenswrapper[4975]: I0318 13:32:33.496333 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v2kpb" Mar 18 13:32:34 crc kubenswrapper[4975]: I0318 13:32:34.301110 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v2kpb" Mar 18 13:32:34 crc kubenswrapper[4975]: I0318 13:32:34.345269 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-v2kpb"] Mar 18 13:32:36 crc kubenswrapper[4975]: I0318 13:32:36.013271 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v2kpb" podUID="26441577-4d0c-4124-a970-72aa6ae0648a" containerName="registry-server" containerID="cri-o://047da252aedb2be24732f640dcb610e5fb765903079031e8f9ff660302810715" gracePeriod=2 Mar 18 13:32:36 crc kubenswrapper[4975]: I0318 13:32:36.500793 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v2kpb" Mar 18 13:32:36 crc kubenswrapper[4975]: I0318 13:32:36.607820 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26441577-4d0c-4124-a970-72aa6ae0648a-catalog-content\") pod \"26441577-4d0c-4124-a970-72aa6ae0648a\" (UID: \"26441577-4d0c-4124-a970-72aa6ae0648a\") " Mar 18 13:32:36 crc kubenswrapper[4975]: I0318 13:32:36.607900 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26441577-4d0c-4124-a970-72aa6ae0648a-utilities\") pod \"26441577-4d0c-4124-a970-72aa6ae0648a\" (UID: \"26441577-4d0c-4124-a970-72aa6ae0648a\") " Mar 18 13:32:36 crc kubenswrapper[4975]: I0318 13:32:36.608098 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxr2s\" (UniqueName: \"kubernetes.io/projected/26441577-4d0c-4124-a970-72aa6ae0648a-kube-api-access-kxr2s\") pod \"26441577-4d0c-4124-a970-72aa6ae0648a\" (UID: \"26441577-4d0c-4124-a970-72aa6ae0648a\") " Mar 18 13:32:36 crc kubenswrapper[4975]: I0318 13:32:36.609196 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26441577-4d0c-4124-a970-72aa6ae0648a-utilities" (OuterVolumeSpecName: "utilities") pod "26441577-4d0c-4124-a970-72aa6ae0648a" (UID: 
"26441577-4d0c-4124-a970-72aa6ae0648a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:32:36 crc kubenswrapper[4975]: I0318 13:32:36.614023 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26441577-4d0c-4124-a970-72aa6ae0648a-kube-api-access-kxr2s" (OuterVolumeSpecName: "kube-api-access-kxr2s") pod "26441577-4d0c-4124-a970-72aa6ae0648a" (UID: "26441577-4d0c-4124-a970-72aa6ae0648a"). InnerVolumeSpecName "kube-api-access-kxr2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:32:36 crc kubenswrapper[4975]: I0318 13:32:36.710469 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxr2s\" (UniqueName: \"kubernetes.io/projected/26441577-4d0c-4124-a970-72aa6ae0648a-kube-api-access-kxr2s\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:36 crc kubenswrapper[4975]: I0318 13:32:36.710507 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26441577-4d0c-4124-a970-72aa6ae0648a-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:36 crc kubenswrapper[4975]: I0318 13:32:36.778336 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26441577-4d0c-4124-a970-72aa6ae0648a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26441577-4d0c-4124-a970-72aa6ae0648a" (UID: "26441577-4d0c-4124-a970-72aa6ae0648a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:32:36 crc kubenswrapper[4975]: I0318 13:32:36.812299 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26441577-4d0c-4124-a970-72aa6ae0648a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:32:37 crc kubenswrapper[4975]: I0318 13:32:37.023134 4975 generic.go:334] "Generic (PLEG): container finished" podID="26441577-4d0c-4124-a970-72aa6ae0648a" containerID="047da252aedb2be24732f640dcb610e5fb765903079031e8f9ff660302810715" exitCode=0 Mar 18 13:32:37 crc kubenswrapper[4975]: I0318 13:32:37.023216 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v2kpb" Mar 18 13:32:37 crc kubenswrapper[4975]: I0318 13:32:37.026219 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v2kpb" event={"ID":"26441577-4d0c-4124-a970-72aa6ae0648a","Type":"ContainerDied","Data":"047da252aedb2be24732f640dcb610e5fb765903079031e8f9ff660302810715"} Mar 18 13:32:37 crc kubenswrapper[4975]: I0318 13:32:37.026269 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v2kpb" event={"ID":"26441577-4d0c-4124-a970-72aa6ae0648a","Type":"ContainerDied","Data":"9839cab60d70a4299a6b2e6a4f21902de677c6713a15ea436cac015cb0580f08"} Mar 18 13:32:37 crc kubenswrapper[4975]: I0318 13:32:37.026320 4975 scope.go:117] "RemoveContainer" containerID="047da252aedb2be24732f640dcb610e5fb765903079031e8f9ff660302810715" Mar 18 13:32:37 crc kubenswrapper[4975]: I0318 13:32:37.054254 4975 scope.go:117] "RemoveContainer" containerID="71872f45f12a5fe7a154abe7e3595ba5a41276824effc1074b86c7dd41740761" Mar 18 13:32:37 crc kubenswrapper[4975]: I0318 13:32:37.067198 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v2kpb"] Mar 18 13:32:37 crc kubenswrapper[4975]: 
I0318 13:32:37.085760 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v2kpb"] Mar 18 13:32:37 crc kubenswrapper[4975]: I0318 13:32:37.085960 4975 scope.go:117] "RemoveContainer" containerID="9d810eeda237c15e6097a2b2d0b4f6cb6a7dd541153f8ca6a551729cac8002bf" Mar 18 13:32:37 crc kubenswrapper[4975]: I0318 13:32:37.126603 4975 scope.go:117] "RemoveContainer" containerID="047da252aedb2be24732f640dcb610e5fb765903079031e8f9ff660302810715" Mar 18 13:32:37 crc kubenswrapper[4975]: E0318 13:32:37.127174 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"047da252aedb2be24732f640dcb610e5fb765903079031e8f9ff660302810715\": container with ID starting with 047da252aedb2be24732f640dcb610e5fb765903079031e8f9ff660302810715 not found: ID does not exist" containerID="047da252aedb2be24732f640dcb610e5fb765903079031e8f9ff660302810715" Mar 18 13:32:37 crc kubenswrapper[4975]: I0318 13:32:37.127218 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"047da252aedb2be24732f640dcb610e5fb765903079031e8f9ff660302810715"} err="failed to get container status \"047da252aedb2be24732f640dcb610e5fb765903079031e8f9ff660302810715\": rpc error: code = NotFound desc = could not find container \"047da252aedb2be24732f640dcb610e5fb765903079031e8f9ff660302810715\": container with ID starting with 047da252aedb2be24732f640dcb610e5fb765903079031e8f9ff660302810715 not found: ID does not exist" Mar 18 13:32:37 crc kubenswrapper[4975]: I0318 13:32:37.127246 4975 scope.go:117] "RemoveContainer" containerID="71872f45f12a5fe7a154abe7e3595ba5a41276824effc1074b86c7dd41740761" Mar 18 13:32:37 crc kubenswrapper[4975]: E0318 13:32:37.127620 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71872f45f12a5fe7a154abe7e3595ba5a41276824effc1074b86c7dd41740761\": container 
with ID starting with 71872f45f12a5fe7a154abe7e3595ba5a41276824effc1074b86c7dd41740761 not found: ID does not exist" containerID="71872f45f12a5fe7a154abe7e3595ba5a41276824effc1074b86c7dd41740761" Mar 18 13:32:37 crc kubenswrapper[4975]: I0318 13:32:37.127675 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71872f45f12a5fe7a154abe7e3595ba5a41276824effc1074b86c7dd41740761"} err="failed to get container status \"71872f45f12a5fe7a154abe7e3595ba5a41276824effc1074b86c7dd41740761\": rpc error: code = NotFound desc = could not find container \"71872f45f12a5fe7a154abe7e3595ba5a41276824effc1074b86c7dd41740761\": container with ID starting with 71872f45f12a5fe7a154abe7e3595ba5a41276824effc1074b86c7dd41740761 not found: ID does not exist" Mar 18 13:32:37 crc kubenswrapper[4975]: I0318 13:32:37.127692 4975 scope.go:117] "RemoveContainer" containerID="9d810eeda237c15e6097a2b2d0b4f6cb6a7dd541153f8ca6a551729cac8002bf" Mar 18 13:32:37 crc kubenswrapper[4975]: E0318 13:32:37.128078 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d810eeda237c15e6097a2b2d0b4f6cb6a7dd541153f8ca6a551729cac8002bf\": container with ID starting with 9d810eeda237c15e6097a2b2d0b4f6cb6a7dd541153f8ca6a551729cac8002bf not found: ID does not exist" containerID="9d810eeda237c15e6097a2b2d0b4f6cb6a7dd541153f8ca6a551729cac8002bf" Mar 18 13:32:37 crc kubenswrapper[4975]: I0318 13:32:37.128100 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d810eeda237c15e6097a2b2d0b4f6cb6a7dd541153f8ca6a551729cac8002bf"} err="failed to get container status \"9d810eeda237c15e6097a2b2d0b4f6cb6a7dd541153f8ca6a551729cac8002bf\": rpc error: code = NotFound desc = could not find container \"9d810eeda237c15e6097a2b2d0b4f6cb6a7dd541153f8ca6a551729cac8002bf\": container with ID starting with 9d810eeda237c15e6097a2b2d0b4f6cb6a7dd541153f8ca6a551729cac8002bf not 
found: ID does not exist" Mar 18 13:32:39 crc kubenswrapper[4975]: I0318 13:32:39.027556 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26441577-4d0c-4124-a970-72aa6ae0648a" path="/var/lib/kubelet/pods/26441577-4d0c-4124-a970-72aa6ae0648a/volumes" Mar 18 13:32:50 crc kubenswrapper[4975]: I0318 13:32:50.168493 4975 scope.go:117] "RemoveContainer" containerID="3ae85f7d5defcd8a370bbf0501aca571f98959bc9b03147e8294c152e556eab4" Mar 18 13:32:55 crc kubenswrapper[4975]: I0318 13:32:55.539213 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:32:55 crc kubenswrapper[4975]: I0318 13:32:55.539806 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:33:25 crc kubenswrapper[4975]: I0318 13:33:25.539053 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:33:25 crc kubenswrapper[4975]: I0318 13:33:25.539708 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:33:44 crc kubenswrapper[4975]: I0318 
13:33:44.319530 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8x2z5"] Mar 18 13:33:44 crc kubenswrapper[4975]: E0318 13:33:44.320553 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26441577-4d0c-4124-a970-72aa6ae0648a" containerName="extract-content" Mar 18 13:33:44 crc kubenswrapper[4975]: I0318 13:33:44.320567 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="26441577-4d0c-4124-a970-72aa6ae0648a" containerName="extract-content" Mar 18 13:33:44 crc kubenswrapper[4975]: E0318 13:33:44.320577 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26441577-4d0c-4124-a970-72aa6ae0648a" containerName="registry-server" Mar 18 13:33:44 crc kubenswrapper[4975]: I0318 13:33:44.320583 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="26441577-4d0c-4124-a970-72aa6ae0648a" containerName="registry-server" Mar 18 13:33:44 crc kubenswrapper[4975]: E0318 13:33:44.320601 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26441577-4d0c-4124-a970-72aa6ae0648a" containerName="extract-utilities" Mar 18 13:33:44 crc kubenswrapper[4975]: I0318 13:33:44.320607 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="26441577-4d0c-4124-a970-72aa6ae0648a" containerName="extract-utilities" Mar 18 13:33:44 crc kubenswrapper[4975]: I0318 13:33:44.320803 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="26441577-4d0c-4124-a970-72aa6ae0648a" containerName="registry-server" Mar 18 13:33:44 crc kubenswrapper[4975]: I0318 13:33:44.322162 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8x2z5" Mar 18 13:33:44 crc kubenswrapper[4975]: I0318 13:33:44.328114 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8x2z5"] Mar 18 13:33:44 crc kubenswrapper[4975]: I0318 13:33:44.478587 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/925f6a74-0f27-4788-8650-18514e9dec0f-catalog-content\") pod \"redhat-marketplace-8x2z5\" (UID: \"925f6a74-0f27-4788-8650-18514e9dec0f\") " pod="openshift-marketplace/redhat-marketplace-8x2z5" Mar 18 13:33:44 crc kubenswrapper[4975]: I0318 13:33:44.478802 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/925f6a74-0f27-4788-8650-18514e9dec0f-utilities\") pod \"redhat-marketplace-8x2z5\" (UID: \"925f6a74-0f27-4788-8650-18514e9dec0f\") " pod="openshift-marketplace/redhat-marketplace-8x2z5" Mar 18 13:33:44 crc kubenswrapper[4975]: I0318 13:33:44.479175 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvlzv\" (UniqueName: \"kubernetes.io/projected/925f6a74-0f27-4788-8650-18514e9dec0f-kube-api-access-gvlzv\") pod \"redhat-marketplace-8x2z5\" (UID: \"925f6a74-0f27-4788-8650-18514e9dec0f\") " pod="openshift-marketplace/redhat-marketplace-8x2z5" Mar 18 13:33:44 crc kubenswrapper[4975]: I0318 13:33:44.581234 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/925f6a74-0f27-4788-8650-18514e9dec0f-catalog-content\") pod \"redhat-marketplace-8x2z5\" (UID: \"925f6a74-0f27-4788-8650-18514e9dec0f\") " pod="openshift-marketplace/redhat-marketplace-8x2z5" Mar 18 13:33:44 crc kubenswrapper[4975]: I0318 13:33:44.581350 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/925f6a74-0f27-4788-8650-18514e9dec0f-utilities\") pod \"redhat-marketplace-8x2z5\" (UID: \"925f6a74-0f27-4788-8650-18514e9dec0f\") " pod="openshift-marketplace/redhat-marketplace-8x2z5" Mar 18 13:33:44 crc kubenswrapper[4975]: I0318 13:33:44.581430 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvlzv\" (UniqueName: \"kubernetes.io/projected/925f6a74-0f27-4788-8650-18514e9dec0f-kube-api-access-gvlzv\") pod \"redhat-marketplace-8x2z5\" (UID: \"925f6a74-0f27-4788-8650-18514e9dec0f\") " pod="openshift-marketplace/redhat-marketplace-8x2z5" Mar 18 13:33:44 crc kubenswrapper[4975]: I0318 13:33:44.582065 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/925f6a74-0f27-4788-8650-18514e9dec0f-catalog-content\") pod \"redhat-marketplace-8x2z5\" (UID: \"925f6a74-0f27-4788-8650-18514e9dec0f\") " pod="openshift-marketplace/redhat-marketplace-8x2z5" Mar 18 13:33:44 crc kubenswrapper[4975]: I0318 13:33:44.582155 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/925f6a74-0f27-4788-8650-18514e9dec0f-utilities\") pod \"redhat-marketplace-8x2z5\" (UID: \"925f6a74-0f27-4788-8650-18514e9dec0f\") " pod="openshift-marketplace/redhat-marketplace-8x2z5" Mar 18 13:33:44 crc kubenswrapper[4975]: I0318 13:33:44.865339 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvlzv\" (UniqueName: \"kubernetes.io/projected/925f6a74-0f27-4788-8650-18514e9dec0f-kube-api-access-gvlzv\") pod \"redhat-marketplace-8x2z5\" (UID: \"925f6a74-0f27-4788-8650-18514e9dec0f\") " pod="openshift-marketplace/redhat-marketplace-8x2z5" Mar 18 13:33:44 crc kubenswrapper[4975]: I0318 13:33:44.952206 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8x2z5" Mar 18 13:33:45 crc kubenswrapper[4975]: I0318 13:33:45.404901 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8x2z5"] Mar 18 13:33:45 crc kubenswrapper[4975]: I0318 13:33:45.657641 4975 generic.go:334] "Generic (PLEG): container finished" podID="925f6a74-0f27-4788-8650-18514e9dec0f" containerID="ab0afaf2afcd30efdc72d7f6fca01f14545fe87e6643780150424f81de69db2f" exitCode=0 Mar 18 13:33:45 crc kubenswrapper[4975]: I0318 13:33:45.657711 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8x2z5" event={"ID":"925f6a74-0f27-4788-8650-18514e9dec0f","Type":"ContainerDied","Data":"ab0afaf2afcd30efdc72d7f6fca01f14545fe87e6643780150424f81de69db2f"} Mar 18 13:33:45 crc kubenswrapper[4975]: I0318 13:33:45.657768 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8x2z5" event={"ID":"925f6a74-0f27-4788-8650-18514e9dec0f","Type":"ContainerStarted","Data":"92e337fa0da3e211460fa07cb459dd0619b23305f0f827a908787193615b6eaa"} Mar 18 13:33:47 crc kubenswrapper[4975]: I0318 13:33:47.675164 4975 generic.go:334] "Generic (PLEG): container finished" podID="925f6a74-0f27-4788-8650-18514e9dec0f" containerID="cf0c03e8ac40b0814c14cb4df4c11c57f49f059588b87732208ce2112b1a420b" exitCode=0 Mar 18 13:33:47 crc kubenswrapper[4975]: I0318 13:33:47.675222 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8x2z5" event={"ID":"925f6a74-0f27-4788-8650-18514e9dec0f","Type":"ContainerDied","Data":"cf0c03e8ac40b0814c14cb4df4c11c57f49f059588b87732208ce2112b1a420b"} Mar 18 13:33:48 crc kubenswrapper[4975]: I0318 13:33:48.693367 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8x2z5" 
event={"ID":"925f6a74-0f27-4788-8650-18514e9dec0f","Type":"ContainerStarted","Data":"b68a14017484cf56da7815781e73276c00e65ab3af6ed4e462e5f1646f895db4"} Mar 18 13:33:48 crc kubenswrapper[4975]: I0318 13:33:48.719610 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8x2z5" podStartSLOduration=2.236553692 podStartE2EDuration="4.719533458s" podCreationTimestamp="2026-03-18 13:33:44 +0000 UTC" firstStartedPulling="2026-03-18 13:33:45.659435834 +0000 UTC m=+5011.373836413" lastFinishedPulling="2026-03-18 13:33:48.14241559 +0000 UTC m=+5013.856816179" observedRunningTime="2026-03-18 13:33:48.714238245 +0000 UTC m=+5014.428638814" watchObservedRunningTime="2026-03-18 13:33:48.719533458 +0000 UTC m=+5014.433934037" Mar 18 13:33:54 crc kubenswrapper[4975]: I0318 13:33:54.953467 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8x2z5" Mar 18 13:33:54 crc kubenswrapper[4975]: I0318 13:33:54.953776 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8x2z5" Mar 18 13:33:55 crc kubenswrapper[4975]: I0318 13:33:55.002257 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8x2z5" Mar 18 13:33:55 crc kubenswrapper[4975]: I0318 13:33:55.539439 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:33:55 crc kubenswrapper[4975]: I0318 13:33:55.539519 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:33:55 crc kubenswrapper[4975]: I0318 13:33:55.539568 4975 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 13:33:55 crc kubenswrapper[4975]: I0318 13:33:55.540761 4975 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519"} pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:33:55 crc kubenswrapper[4975]: I0318 13:33:55.540891 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" containerID="cri-o://e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" gracePeriod=600 Mar 18 13:33:55 crc kubenswrapper[4975]: E0318 13:33:55.672486 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:33:55 crc kubenswrapper[4975]: I0318 13:33:55.761962 4975 generic.go:334] "Generic (PLEG): container finished" podID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" exitCode=0 Mar 18 13:33:55 crc kubenswrapper[4975]: I0318 13:33:55.762033 4975 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerDied","Data":"e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519"} Mar 18 13:33:55 crc kubenswrapper[4975]: I0318 13:33:55.762110 4975 scope.go:117] "RemoveContainer" containerID="2cf67ccc567d76aff7b36e5a10d5c9daae556eef49e5ae8a0c6bb48c935569fb" Mar 18 13:33:55 crc kubenswrapper[4975]: I0318 13:33:55.762719 4975 scope.go:117] "RemoveContainer" containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" Mar 18 13:33:55 crc kubenswrapper[4975]: E0318 13:33:55.763006 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:33:55 crc kubenswrapper[4975]: I0318 13:33:55.825851 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8x2z5" Mar 18 13:33:55 crc kubenswrapper[4975]: I0318 13:33:55.880705 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8x2z5"] Mar 18 13:33:57 crc kubenswrapper[4975]: I0318 13:33:57.785906 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8x2z5" podUID="925f6a74-0f27-4788-8650-18514e9dec0f" containerName="registry-server" containerID="cri-o://b68a14017484cf56da7815781e73276c00e65ab3af6ed4e462e5f1646f895db4" gracePeriod=2 Mar 18 13:33:58 crc kubenswrapper[4975]: I0318 13:33:58.230928 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8x2z5" Mar 18 13:33:58 crc kubenswrapper[4975]: I0318 13:33:58.352207 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/925f6a74-0f27-4788-8650-18514e9dec0f-utilities\") pod \"925f6a74-0f27-4788-8650-18514e9dec0f\" (UID: \"925f6a74-0f27-4788-8650-18514e9dec0f\") " Mar 18 13:33:58 crc kubenswrapper[4975]: I0318 13:33:58.352304 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/925f6a74-0f27-4788-8650-18514e9dec0f-catalog-content\") pod \"925f6a74-0f27-4788-8650-18514e9dec0f\" (UID: \"925f6a74-0f27-4788-8650-18514e9dec0f\") " Mar 18 13:33:58 crc kubenswrapper[4975]: I0318 13:33:58.352350 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvlzv\" (UniqueName: \"kubernetes.io/projected/925f6a74-0f27-4788-8650-18514e9dec0f-kube-api-access-gvlzv\") pod \"925f6a74-0f27-4788-8650-18514e9dec0f\" (UID: \"925f6a74-0f27-4788-8650-18514e9dec0f\") " Mar 18 13:33:58 crc kubenswrapper[4975]: I0318 13:33:58.353617 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/925f6a74-0f27-4788-8650-18514e9dec0f-utilities" (OuterVolumeSpecName: "utilities") pod "925f6a74-0f27-4788-8650-18514e9dec0f" (UID: "925f6a74-0f27-4788-8650-18514e9dec0f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:33:58 crc kubenswrapper[4975]: I0318 13:33:58.361413 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f6a74-0f27-4788-8650-18514e9dec0f-kube-api-access-gvlzv" (OuterVolumeSpecName: "kube-api-access-gvlzv") pod "925f6a74-0f27-4788-8650-18514e9dec0f" (UID: "925f6a74-0f27-4788-8650-18514e9dec0f"). InnerVolumeSpecName "kube-api-access-gvlzv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:33:58 crc kubenswrapper[4975]: I0318 13:33:58.388352 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/925f6a74-0f27-4788-8650-18514e9dec0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "925f6a74-0f27-4788-8650-18514e9dec0f" (UID: "925f6a74-0f27-4788-8650-18514e9dec0f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:33:58 crc kubenswrapper[4975]: I0318 13:33:58.454489 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/925f6a74-0f27-4788-8650-18514e9dec0f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:58 crc kubenswrapper[4975]: I0318 13:33:58.454528 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/925f6a74-0f27-4788-8650-18514e9dec0f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:58 crc kubenswrapper[4975]: I0318 13:33:58.454543 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvlzv\" (UniqueName: \"kubernetes.io/projected/925f6a74-0f27-4788-8650-18514e9dec0f-kube-api-access-gvlzv\") on node \"crc\" DevicePath \"\"" Mar 18 13:33:58 crc kubenswrapper[4975]: I0318 13:33:58.794512 4975 generic.go:334] "Generic (PLEG): container finished" podID="925f6a74-0f27-4788-8650-18514e9dec0f" containerID="b68a14017484cf56da7815781e73276c00e65ab3af6ed4e462e5f1646f895db4" exitCode=0 Mar 18 13:33:58 crc kubenswrapper[4975]: I0318 13:33:58.794580 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8x2z5" event={"ID":"925f6a74-0f27-4788-8650-18514e9dec0f","Type":"ContainerDied","Data":"b68a14017484cf56da7815781e73276c00e65ab3af6ed4e462e5f1646f895db4"} Mar 18 13:33:58 crc kubenswrapper[4975]: I0318 13:33:58.794657 4975 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-8x2z5" event={"ID":"925f6a74-0f27-4788-8650-18514e9dec0f","Type":"ContainerDied","Data":"92e337fa0da3e211460fa07cb459dd0619b23305f0f827a908787193615b6eaa"} Mar 18 13:33:58 crc kubenswrapper[4975]: I0318 13:33:58.794677 4975 scope.go:117] "RemoveContainer" containerID="b68a14017484cf56da7815781e73276c00e65ab3af6ed4e462e5f1646f895db4" Mar 18 13:33:58 crc kubenswrapper[4975]: I0318 13:33:58.794606 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8x2z5" Mar 18 13:33:58 crc kubenswrapper[4975]: I0318 13:33:58.822577 4975 scope.go:117] "RemoveContainer" containerID="cf0c03e8ac40b0814c14cb4df4c11c57f49f059588b87732208ce2112b1a420b" Mar 18 13:33:58 crc kubenswrapper[4975]: I0318 13:33:58.829971 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8x2z5"] Mar 18 13:33:58 crc kubenswrapper[4975]: I0318 13:33:58.838148 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8x2z5"] Mar 18 13:33:58 crc kubenswrapper[4975]: I0318 13:33:58.854364 4975 scope.go:117] "RemoveContainer" containerID="ab0afaf2afcd30efdc72d7f6fca01f14545fe87e6643780150424f81de69db2f" Mar 18 13:33:58 crc kubenswrapper[4975]: I0318 13:33:58.887891 4975 scope.go:117] "RemoveContainer" containerID="b68a14017484cf56da7815781e73276c00e65ab3af6ed4e462e5f1646f895db4" Mar 18 13:33:58 crc kubenswrapper[4975]: E0318 13:33:58.888391 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b68a14017484cf56da7815781e73276c00e65ab3af6ed4e462e5f1646f895db4\": container with ID starting with b68a14017484cf56da7815781e73276c00e65ab3af6ed4e462e5f1646f895db4 not found: ID does not exist" containerID="b68a14017484cf56da7815781e73276c00e65ab3af6ed4e462e5f1646f895db4" Mar 18 13:33:58 crc kubenswrapper[4975]: I0318 13:33:58.888438 4975 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b68a14017484cf56da7815781e73276c00e65ab3af6ed4e462e5f1646f895db4"} err="failed to get container status \"b68a14017484cf56da7815781e73276c00e65ab3af6ed4e462e5f1646f895db4\": rpc error: code = NotFound desc = could not find container \"b68a14017484cf56da7815781e73276c00e65ab3af6ed4e462e5f1646f895db4\": container with ID starting with b68a14017484cf56da7815781e73276c00e65ab3af6ed4e462e5f1646f895db4 not found: ID does not exist" Mar 18 13:33:58 crc kubenswrapper[4975]: I0318 13:33:58.888464 4975 scope.go:117] "RemoveContainer" containerID="cf0c03e8ac40b0814c14cb4df4c11c57f49f059588b87732208ce2112b1a420b" Mar 18 13:33:58 crc kubenswrapper[4975]: E0318 13:33:58.888796 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf0c03e8ac40b0814c14cb4df4c11c57f49f059588b87732208ce2112b1a420b\": container with ID starting with cf0c03e8ac40b0814c14cb4df4c11c57f49f059588b87732208ce2112b1a420b not found: ID does not exist" containerID="cf0c03e8ac40b0814c14cb4df4c11c57f49f059588b87732208ce2112b1a420b" Mar 18 13:33:58 crc kubenswrapper[4975]: I0318 13:33:58.888824 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf0c03e8ac40b0814c14cb4df4c11c57f49f059588b87732208ce2112b1a420b"} err="failed to get container status \"cf0c03e8ac40b0814c14cb4df4c11c57f49f059588b87732208ce2112b1a420b\": rpc error: code = NotFound desc = could not find container \"cf0c03e8ac40b0814c14cb4df4c11c57f49f059588b87732208ce2112b1a420b\": container with ID starting with cf0c03e8ac40b0814c14cb4df4c11c57f49f059588b87732208ce2112b1a420b not found: ID does not exist" Mar 18 13:33:58 crc kubenswrapper[4975]: I0318 13:33:58.888844 4975 scope.go:117] "RemoveContainer" containerID="ab0afaf2afcd30efdc72d7f6fca01f14545fe87e6643780150424f81de69db2f" Mar 18 13:33:58 crc kubenswrapper[4975]: E0318 
13:33:58.889167 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab0afaf2afcd30efdc72d7f6fca01f14545fe87e6643780150424f81de69db2f\": container with ID starting with ab0afaf2afcd30efdc72d7f6fca01f14545fe87e6643780150424f81de69db2f not found: ID does not exist" containerID="ab0afaf2afcd30efdc72d7f6fca01f14545fe87e6643780150424f81de69db2f" Mar 18 13:33:58 crc kubenswrapper[4975]: I0318 13:33:58.889193 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab0afaf2afcd30efdc72d7f6fca01f14545fe87e6643780150424f81de69db2f"} err="failed to get container status \"ab0afaf2afcd30efdc72d7f6fca01f14545fe87e6643780150424f81de69db2f\": rpc error: code = NotFound desc = could not find container \"ab0afaf2afcd30efdc72d7f6fca01f14545fe87e6643780150424f81de69db2f\": container with ID starting with ab0afaf2afcd30efdc72d7f6fca01f14545fe87e6643780150424f81de69db2f not found: ID does not exist" Mar 18 13:33:59 crc kubenswrapper[4975]: I0318 13:33:59.029159 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f6a74-0f27-4788-8650-18514e9dec0f" path="/var/lib/kubelet/pods/925f6a74-0f27-4788-8650-18514e9dec0f/volumes" Mar 18 13:34:00 crc kubenswrapper[4975]: I0318 13:34:00.147619 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564014-d8wzl"] Mar 18 13:34:00 crc kubenswrapper[4975]: E0318 13:34:00.148452 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="925f6a74-0f27-4788-8650-18514e9dec0f" containerName="extract-content" Mar 18 13:34:00 crc kubenswrapper[4975]: I0318 13:34:00.148468 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="925f6a74-0f27-4788-8650-18514e9dec0f" containerName="extract-content" Mar 18 13:34:00 crc kubenswrapper[4975]: E0318 13:34:00.148486 4975 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="925f6a74-0f27-4788-8650-18514e9dec0f" containerName="registry-server" Mar 18 13:34:00 crc kubenswrapper[4975]: I0318 13:34:00.148491 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="925f6a74-0f27-4788-8650-18514e9dec0f" containerName="registry-server" Mar 18 13:34:00 crc kubenswrapper[4975]: E0318 13:34:00.148508 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="925f6a74-0f27-4788-8650-18514e9dec0f" containerName="extract-utilities" Mar 18 13:34:00 crc kubenswrapper[4975]: I0318 13:34:00.148515 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="925f6a74-0f27-4788-8650-18514e9dec0f" containerName="extract-utilities" Mar 18 13:34:00 crc kubenswrapper[4975]: I0318 13:34:00.148708 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="925f6a74-0f27-4788-8650-18514e9dec0f" containerName="registry-server" Mar 18 13:34:00 crc kubenswrapper[4975]: I0318 13:34:00.149451 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564014-d8wzl" Mar 18 13:34:00 crc kubenswrapper[4975]: I0318 13:34:00.153680 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:34:00 crc kubenswrapper[4975]: I0318 13:34:00.156248 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:34:00 crc kubenswrapper[4975]: I0318 13:34:00.156326 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 13:34:00 crc kubenswrapper[4975]: I0318 13:34:00.160110 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564014-d8wzl"] Mar 18 13:34:00 crc kubenswrapper[4975]: I0318 13:34:00.288595 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnnfk\" (UniqueName: 
\"kubernetes.io/projected/39d1114b-18c7-4f16-b805-e96bdb1adbc5-kube-api-access-fnnfk\") pod \"auto-csr-approver-29564014-d8wzl\" (UID: \"39d1114b-18c7-4f16-b805-e96bdb1adbc5\") " pod="openshift-infra/auto-csr-approver-29564014-d8wzl" Mar 18 13:34:00 crc kubenswrapper[4975]: I0318 13:34:00.390060 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnnfk\" (UniqueName: \"kubernetes.io/projected/39d1114b-18c7-4f16-b805-e96bdb1adbc5-kube-api-access-fnnfk\") pod \"auto-csr-approver-29564014-d8wzl\" (UID: \"39d1114b-18c7-4f16-b805-e96bdb1adbc5\") " pod="openshift-infra/auto-csr-approver-29564014-d8wzl" Mar 18 13:34:00 crc kubenswrapper[4975]: I0318 13:34:00.408059 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnnfk\" (UniqueName: \"kubernetes.io/projected/39d1114b-18c7-4f16-b805-e96bdb1adbc5-kube-api-access-fnnfk\") pod \"auto-csr-approver-29564014-d8wzl\" (UID: \"39d1114b-18c7-4f16-b805-e96bdb1adbc5\") " pod="openshift-infra/auto-csr-approver-29564014-d8wzl" Mar 18 13:34:00 crc kubenswrapper[4975]: I0318 13:34:00.469231 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564014-d8wzl" Mar 18 13:34:00 crc kubenswrapper[4975]: I0318 13:34:00.910995 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564014-d8wzl"] Mar 18 13:34:00 crc kubenswrapper[4975]: W0318 13:34:00.914684 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39d1114b_18c7_4f16_b805_e96bdb1adbc5.slice/crio-2d16549a67fe8fbff305ec9f156e2274e13887be8629c782fe8ad353231c5462 WatchSource:0}: Error finding container 2d16549a67fe8fbff305ec9f156e2274e13887be8629c782fe8ad353231c5462: Status 404 returned error can't find the container with id 2d16549a67fe8fbff305ec9f156e2274e13887be8629c782fe8ad353231c5462 Mar 18 13:34:01 crc kubenswrapper[4975]: I0318 13:34:01.827129 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564014-d8wzl" event={"ID":"39d1114b-18c7-4f16-b805-e96bdb1adbc5","Type":"ContainerStarted","Data":"2d16549a67fe8fbff305ec9f156e2274e13887be8629c782fe8ad353231c5462"} Mar 18 13:34:02 crc kubenswrapper[4975]: I0318 13:34:02.204087 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2qb4w"] Mar 18 13:34:02 crc kubenswrapper[4975]: I0318 13:34:02.206904 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2qb4w" Mar 18 13:34:02 crc kubenswrapper[4975]: I0318 13:34:02.214306 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2qb4w"] Mar 18 13:34:02 crc kubenswrapper[4975]: I0318 13:34:02.235895 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a6d27eb-fc34-44f5-bd82-87852e47ec71-utilities\") pod \"certified-operators-2qb4w\" (UID: \"8a6d27eb-fc34-44f5-bd82-87852e47ec71\") " pod="openshift-marketplace/certified-operators-2qb4w" Mar 18 13:34:02 crc kubenswrapper[4975]: I0318 13:34:02.236093 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a6d27eb-fc34-44f5-bd82-87852e47ec71-catalog-content\") pod \"certified-operators-2qb4w\" (UID: \"8a6d27eb-fc34-44f5-bd82-87852e47ec71\") " pod="openshift-marketplace/certified-operators-2qb4w" Mar 18 13:34:02 crc kubenswrapper[4975]: I0318 13:34:02.237354 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w68zq\" (UniqueName: \"kubernetes.io/projected/8a6d27eb-fc34-44f5-bd82-87852e47ec71-kube-api-access-w68zq\") pod \"certified-operators-2qb4w\" (UID: \"8a6d27eb-fc34-44f5-bd82-87852e47ec71\") " pod="openshift-marketplace/certified-operators-2qb4w" Mar 18 13:34:02 crc kubenswrapper[4975]: I0318 13:34:02.340708 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w68zq\" (UniqueName: \"kubernetes.io/projected/8a6d27eb-fc34-44f5-bd82-87852e47ec71-kube-api-access-w68zq\") pod \"certified-operators-2qb4w\" (UID: \"8a6d27eb-fc34-44f5-bd82-87852e47ec71\") " pod="openshift-marketplace/certified-operators-2qb4w" Mar 18 13:34:02 crc kubenswrapper[4975]: I0318 13:34:02.340782 4975 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a6d27eb-fc34-44f5-bd82-87852e47ec71-utilities\") pod \"certified-operators-2qb4w\" (UID: \"8a6d27eb-fc34-44f5-bd82-87852e47ec71\") " pod="openshift-marketplace/certified-operators-2qb4w" Mar 18 13:34:02 crc kubenswrapper[4975]: I0318 13:34:02.340830 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a6d27eb-fc34-44f5-bd82-87852e47ec71-catalog-content\") pod \"certified-operators-2qb4w\" (UID: \"8a6d27eb-fc34-44f5-bd82-87852e47ec71\") " pod="openshift-marketplace/certified-operators-2qb4w" Mar 18 13:34:02 crc kubenswrapper[4975]: I0318 13:34:02.341255 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a6d27eb-fc34-44f5-bd82-87852e47ec71-catalog-content\") pod \"certified-operators-2qb4w\" (UID: \"8a6d27eb-fc34-44f5-bd82-87852e47ec71\") " pod="openshift-marketplace/certified-operators-2qb4w" Mar 18 13:34:02 crc kubenswrapper[4975]: I0318 13:34:02.341836 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a6d27eb-fc34-44f5-bd82-87852e47ec71-utilities\") pod \"certified-operators-2qb4w\" (UID: \"8a6d27eb-fc34-44f5-bd82-87852e47ec71\") " pod="openshift-marketplace/certified-operators-2qb4w" Mar 18 13:34:02 crc kubenswrapper[4975]: I0318 13:34:02.359812 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w68zq\" (UniqueName: \"kubernetes.io/projected/8a6d27eb-fc34-44f5-bd82-87852e47ec71-kube-api-access-w68zq\") pod \"certified-operators-2qb4w\" (UID: \"8a6d27eb-fc34-44f5-bd82-87852e47ec71\") " pod="openshift-marketplace/certified-operators-2qb4w" Mar 18 13:34:02 crc kubenswrapper[4975]: I0318 13:34:02.534906 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2qb4w" Mar 18 13:34:02 crc kubenswrapper[4975]: I0318 13:34:02.851665 4975 generic.go:334] "Generic (PLEG): container finished" podID="39d1114b-18c7-4f16-b805-e96bdb1adbc5" containerID="eeea74c5fc61293f5ef084cceb2be71eaafac98c81ddaa9ccd5d6db70744671c" exitCode=0 Mar 18 13:34:02 crc kubenswrapper[4975]: I0318 13:34:02.851979 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564014-d8wzl" event={"ID":"39d1114b-18c7-4f16-b805-e96bdb1adbc5","Type":"ContainerDied","Data":"eeea74c5fc61293f5ef084cceb2be71eaafac98c81ddaa9ccd5d6db70744671c"} Mar 18 13:34:03 crc kubenswrapper[4975]: I0318 13:34:03.161729 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2qb4w"] Mar 18 13:34:03 crc kubenswrapper[4975]: I0318 13:34:03.863610 4975 generic.go:334] "Generic (PLEG): container finished" podID="8a6d27eb-fc34-44f5-bd82-87852e47ec71" containerID="0cad9fadbed3309d51e88ce1cede60759fe83409262b539dbf9106a646ae6a3b" exitCode=0 Mar 18 13:34:03 crc kubenswrapper[4975]: I0318 13:34:03.863722 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qb4w" event={"ID":"8a6d27eb-fc34-44f5-bd82-87852e47ec71","Type":"ContainerDied","Data":"0cad9fadbed3309d51e88ce1cede60759fe83409262b539dbf9106a646ae6a3b"} Mar 18 13:34:03 crc kubenswrapper[4975]: I0318 13:34:03.863977 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qb4w" event={"ID":"8a6d27eb-fc34-44f5-bd82-87852e47ec71","Type":"ContainerStarted","Data":"b4c4c2aa2f949c4f5852ae289e528445b2ee8187364b65b6bd2b73db7c2d399a"} Mar 18 13:34:04 crc kubenswrapper[4975]: I0318 13:34:04.195417 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564014-d8wzl" Mar 18 13:34:04 crc kubenswrapper[4975]: I0318 13:34:04.384565 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnnfk\" (UniqueName: \"kubernetes.io/projected/39d1114b-18c7-4f16-b805-e96bdb1adbc5-kube-api-access-fnnfk\") pod \"39d1114b-18c7-4f16-b805-e96bdb1adbc5\" (UID: \"39d1114b-18c7-4f16-b805-e96bdb1adbc5\") " Mar 18 13:34:04 crc kubenswrapper[4975]: I0318 13:34:04.390234 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39d1114b-18c7-4f16-b805-e96bdb1adbc5-kube-api-access-fnnfk" (OuterVolumeSpecName: "kube-api-access-fnnfk") pod "39d1114b-18c7-4f16-b805-e96bdb1adbc5" (UID: "39d1114b-18c7-4f16-b805-e96bdb1adbc5"). InnerVolumeSpecName "kube-api-access-fnnfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:34:04 crc kubenswrapper[4975]: I0318 13:34:04.486505 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnnfk\" (UniqueName: \"kubernetes.io/projected/39d1114b-18c7-4f16-b805-e96bdb1adbc5-kube-api-access-fnnfk\") on node \"crc\" DevicePath \"\"" Mar 18 13:34:04 crc kubenswrapper[4975]: I0318 13:34:04.878954 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564014-d8wzl" event={"ID":"39d1114b-18c7-4f16-b805-e96bdb1adbc5","Type":"ContainerDied","Data":"2d16549a67fe8fbff305ec9f156e2274e13887be8629c782fe8ad353231c5462"} Mar 18 13:34:04 crc kubenswrapper[4975]: I0318 13:34:04.879011 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d16549a67fe8fbff305ec9f156e2274e13887be8629c782fe8ad353231c5462" Mar 18 13:34:04 crc kubenswrapper[4975]: I0318 13:34:04.879815 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564014-d8wzl" Mar 18 13:34:04 crc kubenswrapper[4975]: I0318 13:34:04.881049 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qb4w" event={"ID":"8a6d27eb-fc34-44f5-bd82-87852e47ec71","Type":"ContainerStarted","Data":"3501889b99580fca390a56d83ad95dc1aaa1d81d3d2cfbd6596bc80dfbb49283"} Mar 18 13:34:05 crc kubenswrapper[4975]: I0318 13:34:05.266341 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564008-svwx5"] Mar 18 13:34:05 crc kubenswrapper[4975]: I0318 13:34:05.275010 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564008-svwx5"] Mar 18 13:34:05 crc kubenswrapper[4975]: I0318 13:34:05.891559 4975 generic.go:334] "Generic (PLEG): container finished" podID="8a6d27eb-fc34-44f5-bd82-87852e47ec71" containerID="3501889b99580fca390a56d83ad95dc1aaa1d81d3d2cfbd6596bc80dfbb49283" exitCode=0 Mar 18 13:34:05 crc kubenswrapper[4975]: I0318 13:34:05.891604 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qb4w" event={"ID":"8a6d27eb-fc34-44f5-bd82-87852e47ec71","Type":"ContainerDied","Data":"3501889b99580fca390a56d83ad95dc1aaa1d81d3d2cfbd6596bc80dfbb49283"} Mar 18 13:34:06 crc kubenswrapper[4975]: I0318 13:34:06.903069 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qb4w" event={"ID":"8a6d27eb-fc34-44f5-bd82-87852e47ec71","Type":"ContainerStarted","Data":"77e3069560518cf7547d04f5115f1ac2954fbcdfc16127f87dd2a454676fa241"} Mar 18 13:34:06 crc kubenswrapper[4975]: I0318 13:34:06.931425 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2qb4w" podStartSLOduration=2.373855014 podStartE2EDuration="4.931408256s" podCreationTimestamp="2026-03-18 13:34:02 +0000 UTC" firstStartedPulling="2026-03-18 
13:34:03.865651739 +0000 UTC m=+5029.580052318" lastFinishedPulling="2026-03-18 13:34:06.423204961 +0000 UTC m=+5032.137605560" observedRunningTime="2026-03-18 13:34:06.921617151 +0000 UTC m=+5032.636017740" watchObservedRunningTime="2026-03-18 13:34:06.931408256 +0000 UTC m=+5032.645808835" Mar 18 13:34:07 crc kubenswrapper[4975]: I0318 13:34:07.017580 4975 scope.go:117] "RemoveContainer" containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" Mar 18 13:34:07 crc kubenswrapper[4975]: E0318 13:34:07.018091 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:34:07 crc kubenswrapper[4975]: I0318 13:34:07.027079 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf5c26ed-e766-4702-9958-69b7d29af32f" path="/var/lib/kubelet/pods/cf5c26ed-e766-4702-9958-69b7d29af32f/volumes" Mar 18 13:34:12 crc kubenswrapper[4975]: I0318 13:34:12.538473 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2qb4w" Mar 18 13:34:12 crc kubenswrapper[4975]: I0318 13:34:12.539120 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2qb4w" Mar 18 13:34:12 crc kubenswrapper[4975]: I0318 13:34:12.826477 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2qb4w" Mar 18 13:34:13 crc kubenswrapper[4975]: I0318 13:34:13.001160 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2qb4w" Mar 18 13:34:13 crc 
kubenswrapper[4975]: I0318 13:34:13.058761 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2qb4w"] Mar 18 13:34:14 crc kubenswrapper[4975]: I0318 13:34:14.980072 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2qb4w" podUID="8a6d27eb-fc34-44f5-bd82-87852e47ec71" containerName="registry-server" containerID="cri-o://77e3069560518cf7547d04f5115f1ac2954fbcdfc16127f87dd2a454676fa241" gracePeriod=2 Mar 18 13:34:15 crc kubenswrapper[4975]: I0318 13:34:15.407922 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2qb4w" Mar 18 13:34:15 crc kubenswrapper[4975]: I0318 13:34:15.601352 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a6d27eb-fc34-44f5-bd82-87852e47ec71-utilities\") pod \"8a6d27eb-fc34-44f5-bd82-87852e47ec71\" (UID: \"8a6d27eb-fc34-44f5-bd82-87852e47ec71\") " Mar 18 13:34:15 crc kubenswrapper[4975]: I0318 13:34:15.601436 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w68zq\" (UniqueName: \"kubernetes.io/projected/8a6d27eb-fc34-44f5-bd82-87852e47ec71-kube-api-access-w68zq\") pod \"8a6d27eb-fc34-44f5-bd82-87852e47ec71\" (UID: \"8a6d27eb-fc34-44f5-bd82-87852e47ec71\") " Mar 18 13:34:15 crc kubenswrapper[4975]: I0318 13:34:15.601579 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a6d27eb-fc34-44f5-bd82-87852e47ec71-catalog-content\") pod \"8a6d27eb-fc34-44f5-bd82-87852e47ec71\" (UID: \"8a6d27eb-fc34-44f5-bd82-87852e47ec71\") " Mar 18 13:34:15 crc kubenswrapper[4975]: I0318 13:34:15.602397 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8a6d27eb-fc34-44f5-bd82-87852e47ec71-utilities" (OuterVolumeSpecName: "utilities") pod "8a6d27eb-fc34-44f5-bd82-87852e47ec71" (UID: "8a6d27eb-fc34-44f5-bd82-87852e47ec71"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:34:15 crc kubenswrapper[4975]: I0318 13:34:15.607959 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a6d27eb-fc34-44f5-bd82-87852e47ec71-kube-api-access-w68zq" (OuterVolumeSpecName: "kube-api-access-w68zq") pod "8a6d27eb-fc34-44f5-bd82-87852e47ec71" (UID: "8a6d27eb-fc34-44f5-bd82-87852e47ec71"). InnerVolumeSpecName "kube-api-access-w68zq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:34:15 crc kubenswrapper[4975]: I0318 13:34:15.704905 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a6d27eb-fc34-44f5-bd82-87852e47ec71-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:34:15 crc kubenswrapper[4975]: I0318 13:34:15.704983 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w68zq\" (UniqueName: \"kubernetes.io/projected/8a6d27eb-fc34-44f5-bd82-87852e47ec71-kube-api-access-w68zq\") on node \"crc\" DevicePath \"\"" Mar 18 13:34:15 crc kubenswrapper[4975]: I0318 13:34:15.920506 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a6d27eb-fc34-44f5-bd82-87852e47ec71-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a6d27eb-fc34-44f5-bd82-87852e47ec71" (UID: "8a6d27eb-fc34-44f5-bd82-87852e47ec71"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:34:15 crc kubenswrapper[4975]: I0318 13:34:15.990220 4975 generic.go:334] "Generic (PLEG): container finished" podID="8a6d27eb-fc34-44f5-bd82-87852e47ec71" containerID="77e3069560518cf7547d04f5115f1ac2954fbcdfc16127f87dd2a454676fa241" exitCode=0 Mar 18 13:34:15 crc kubenswrapper[4975]: I0318 13:34:15.990260 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qb4w" event={"ID":"8a6d27eb-fc34-44f5-bd82-87852e47ec71","Type":"ContainerDied","Data":"77e3069560518cf7547d04f5115f1ac2954fbcdfc16127f87dd2a454676fa241"} Mar 18 13:34:15 crc kubenswrapper[4975]: I0318 13:34:15.990287 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qb4w" event={"ID":"8a6d27eb-fc34-44f5-bd82-87852e47ec71","Type":"ContainerDied","Data":"b4c4c2aa2f949c4f5852ae289e528445b2ee8187364b65b6bd2b73db7c2d399a"} Mar 18 13:34:15 crc kubenswrapper[4975]: I0318 13:34:15.990305 4975 scope.go:117] "RemoveContainer" containerID="77e3069560518cf7547d04f5115f1ac2954fbcdfc16127f87dd2a454676fa241" Mar 18 13:34:15 crc kubenswrapper[4975]: I0318 13:34:15.990355 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2qb4w" Mar 18 13:34:16 crc kubenswrapper[4975]: I0318 13:34:16.011610 4975 scope.go:117] "RemoveContainer" containerID="3501889b99580fca390a56d83ad95dc1aaa1d81d3d2cfbd6596bc80dfbb49283" Mar 18 13:34:16 crc kubenswrapper[4975]: I0318 13:34:16.011976 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a6d27eb-fc34-44f5-bd82-87852e47ec71-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:34:16 crc kubenswrapper[4975]: I0318 13:34:16.041136 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2qb4w"] Mar 18 13:34:16 crc kubenswrapper[4975]: I0318 13:34:16.052774 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2qb4w"] Mar 18 13:34:16 crc kubenswrapper[4975]: I0318 13:34:16.071550 4975 scope.go:117] "RemoveContainer" containerID="0cad9fadbed3309d51e88ce1cede60759fe83409262b539dbf9106a646ae6a3b" Mar 18 13:34:16 crc kubenswrapper[4975]: I0318 13:34:16.093996 4975 scope.go:117] "RemoveContainer" containerID="77e3069560518cf7547d04f5115f1ac2954fbcdfc16127f87dd2a454676fa241" Mar 18 13:34:16 crc kubenswrapper[4975]: E0318 13:34:16.094652 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77e3069560518cf7547d04f5115f1ac2954fbcdfc16127f87dd2a454676fa241\": container with ID starting with 77e3069560518cf7547d04f5115f1ac2954fbcdfc16127f87dd2a454676fa241 not found: ID does not exist" containerID="77e3069560518cf7547d04f5115f1ac2954fbcdfc16127f87dd2a454676fa241" Mar 18 13:34:16 crc kubenswrapper[4975]: I0318 13:34:16.094703 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77e3069560518cf7547d04f5115f1ac2954fbcdfc16127f87dd2a454676fa241"} err="failed to get container status 
\"77e3069560518cf7547d04f5115f1ac2954fbcdfc16127f87dd2a454676fa241\": rpc error: code = NotFound desc = could not find container \"77e3069560518cf7547d04f5115f1ac2954fbcdfc16127f87dd2a454676fa241\": container with ID starting with 77e3069560518cf7547d04f5115f1ac2954fbcdfc16127f87dd2a454676fa241 not found: ID does not exist" Mar 18 13:34:16 crc kubenswrapper[4975]: I0318 13:34:16.094736 4975 scope.go:117] "RemoveContainer" containerID="3501889b99580fca390a56d83ad95dc1aaa1d81d3d2cfbd6596bc80dfbb49283" Mar 18 13:34:16 crc kubenswrapper[4975]: E0318 13:34:16.095182 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3501889b99580fca390a56d83ad95dc1aaa1d81d3d2cfbd6596bc80dfbb49283\": container with ID starting with 3501889b99580fca390a56d83ad95dc1aaa1d81d3d2cfbd6596bc80dfbb49283 not found: ID does not exist" containerID="3501889b99580fca390a56d83ad95dc1aaa1d81d3d2cfbd6596bc80dfbb49283" Mar 18 13:34:16 crc kubenswrapper[4975]: I0318 13:34:16.095226 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3501889b99580fca390a56d83ad95dc1aaa1d81d3d2cfbd6596bc80dfbb49283"} err="failed to get container status \"3501889b99580fca390a56d83ad95dc1aaa1d81d3d2cfbd6596bc80dfbb49283\": rpc error: code = NotFound desc = could not find container \"3501889b99580fca390a56d83ad95dc1aaa1d81d3d2cfbd6596bc80dfbb49283\": container with ID starting with 3501889b99580fca390a56d83ad95dc1aaa1d81d3d2cfbd6596bc80dfbb49283 not found: ID does not exist" Mar 18 13:34:16 crc kubenswrapper[4975]: I0318 13:34:16.095255 4975 scope.go:117] "RemoveContainer" containerID="0cad9fadbed3309d51e88ce1cede60759fe83409262b539dbf9106a646ae6a3b" Mar 18 13:34:16 crc kubenswrapper[4975]: E0318 13:34:16.095632 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0cad9fadbed3309d51e88ce1cede60759fe83409262b539dbf9106a646ae6a3b\": container with ID starting with 0cad9fadbed3309d51e88ce1cede60759fe83409262b539dbf9106a646ae6a3b not found: ID does not exist" containerID="0cad9fadbed3309d51e88ce1cede60759fe83409262b539dbf9106a646ae6a3b" Mar 18 13:34:16 crc kubenswrapper[4975]: I0318 13:34:16.095657 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cad9fadbed3309d51e88ce1cede60759fe83409262b539dbf9106a646ae6a3b"} err="failed to get container status \"0cad9fadbed3309d51e88ce1cede60759fe83409262b539dbf9106a646ae6a3b\": rpc error: code = NotFound desc = could not find container \"0cad9fadbed3309d51e88ce1cede60759fe83409262b539dbf9106a646ae6a3b\": container with ID starting with 0cad9fadbed3309d51e88ce1cede60759fe83409262b539dbf9106a646ae6a3b not found: ID does not exist" Mar 18 13:34:17 crc kubenswrapper[4975]: I0318 13:34:17.028172 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a6d27eb-fc34-44f5-bd82-87852e47ec71" path="/var/lib/kubelet/pods/8a6d27eb-fc34-44f5-bd82-87852e47ec71/volumes" Mar 18 13:34:18 crc kubenswrapper[4975]: I0318 13:34:18.016539 4975 scope.go:117] "RemoveContainer" containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" Mar 18 13:34:18 crc kubenswrapper[4975]: E0318 13:34:18.017181 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:34:29 crc kubenswrapper[4975]: I0318 13:34:29.016360 4975 scope.go:117] "RemoveContainer" containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" Mar 18 13:34:29 crc 
kubenswrapper[4975]: E0318 13:34:29.017331 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:34:41 crc kubenswrapper[4975]: I0318 13:34:41.016229 4975 scope.go:117] "RemoveContainer" containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" Mar 18 13:34:41 crc kubenswrapper[4975]: E0318 13:34:41.017124 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:34:45 crc kubenswrapper[4975]: I0318 13:34:45.872648 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6nb6f"] Mar 18 13:34:45 crc kubenswrapper[4975]: E0318 13:34:45.873509 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a6d27eb-fc34-44f5-bd82-87852e47ec71" containerName="extract-content" Mar 18 13:34:45 crc kubenswrapper[4975]: I0318 13:34:45.873522 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a6d27eb-fc34-44f5-bd82-87852e47ec71" containerName="extract-content" Mar 18 13:34:45 crc kubenswrapper[4975]: E0318 13:34:45.873538 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39d1114b-18c7-4f16-b805-e96bdb1adbc5" containerName="oc" Mar 18 13:34:45 crc kubenswrapper[4975]: I0318 13:34:45.873561 4975 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="39d1114b-18c7-4f16-b805-e96bdb1adbc5" containerName="oc" Mar 18 13:34:45 crc kubenswrapper[4975]: E0318 13:34:45.873574 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a6d27eb-fc34-44f5-bd82-87852e47ec71" containerName="registry-server" Mar 18 13:34:45 crc kubenswrapper[4975]: I0318 13:34:45.873580 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a6d27eb-fc34-44f5-bd82-87852e47ec71" containerName="registry-server" Mar 18 13:34:45 crc kubenswrapper[4975]: E0318 13:34:45.873598 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a6d27eb-fc34-44f5-bd82-87852e47ec71" containerName="extract-utilities" Mar 18 13:34:45 crc kubenswrapper[4975]: I0318 13:34:45.873604 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a6d27eb-fc34-44f5-bd82-87852e47ec71" containerName="extract-utilities" Mar 18 13:34:45 crc kubenswrapper[4975]: I0318 13:34:45.873783 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a6d27eb-fc34-44f5-bd82-87852e47ec71" containerName="registry-server" Mar 18 13:34:45 crc kubenswrapper[4975]: I0318 13:34:45.873801 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="39d1114b-18c7-4f16-b805-e96bdb1adbc5" containerName="oc" Mar 18 13:34:45 crc kubenswrapper[4975]: I0318 13:34:45.875078 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6nb6f" Mar 18 13:34:45 crc kubenswrapper[4975]: I0318 13:34:45.881237 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6nb6f"] Mar 18 13:34:45 crc kubenswrapper[4975]: I0318 13:34:45.973030 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghlxm\" (UniqueName: \"kubernetes.io/projected/962501e5-88c6-4229-bb4f-91385cb11bf8-kube-api-access-ghlxm\") pod \"redhat-operators-6nb6f\" (UID: \"962501e5-88c6-4229-bb4f-91385cb11bf8\") " pod="openshift-marketplace/redhat-operators-6nb6f" Mar 18 13:34:45 crc kubenswrapper[4975]: I0318 13:34:45.973325 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/962501e5-88c6-4229-bb4f-91385cb11bf8-catalog-content\") pod \"redhat-operators-6nb6f\" (UID: \"962501e5-88c6-4229-bb4f-91385cb11bf8\") " pod="openshift-marketplace/redhat-operators-6nb6f" Mar 18 13:34:45 crc kubenswrapper[4975]: I0318 13:34:45.973562 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/962501e5-88c6-4229-bb4f-91385cb11bf8-utilities\") pod \"redhat-operators-6nb6f\" (UID: \"962501e5-88c6-4229-bb4f-91385cb11bf8\") " pod="openshift-marketplace/redhat-operators-6nb6f" Mar 18 13:34:46 crc kubenswrapper[4975]: I0318 13:34:46.075082 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/962501e5-88c6-4229-bb4f-91385cb11bf8-utilities\") pod \"redhat-operators-6nb6f\" (UID: \"962501e5-88c6-4229-bb4f-91385cb11bf8\") " pod="openshift-marketplace/redhat-operators-6nb6f" Mar 18 13:34:46 crc kubenswrapper[4975]: I0318 13:34:46.075221 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ghlxm\" (UniqueName: \"kubernetes.io/projected/962501e5-88c6-4229-bb4f-91385cb11bf8-kube-api-access-ghlxm\") pod \"redhat-operators-6nb6f\" (UID: \"962501e5-88c6-4229-bb4f-91385cb11bf8\") " pod="openshift-marketplace/redhat-operators-6nb6f" Mar 18 13:34:46 crc kubenswrapper[4975]: I0318 13:34:46.075273 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/962501e5-88c6-4229-bb4f-91385cb11bf8-catalog-content\") pod \"redhat-operators-6nb6f\" (UID: \"962501e5-88c6-4229-bb4f-91385cb11bf8\") " pod="openshift-marketplace/redhat-operators-6nb6f" Mar 18 13:34:46 crc kubenswrapper[4975]: I0318 13:34:46.075995 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/962501e5-88c6-4229-bb4f-91385cb11bf8-catalog-content\") pod \"redhat-operators-6nb6f\" (UID: \"962501e5-88c6-4229-bb4f-91385cb11bf8\") " pod="openshift-marketplace/redhat-operators-6nb6f" Mar 18 13:34:46 crc kubenswrapper[4975]: I0318 13:34:46.075995 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/962501e5-88c6-4229-bb4f-91385cb11bf8-utilities\") pod \"redhat-operators-6nb6f\" (UID: \"962501e5-88c6-4229-bb4f-91385cb11bf8\") " pod="openshift-marketplace/redhat-operators-6nb6f" Mar 18 13:34:46 crc kubenswrapper[4975]: I0318 13:34:46.097439 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghlxm\" (UniqueName: \"kubernetes.io/projected/962501e5-88c6-4229-bb4f-91385cb11bf8-kube-api-access-ghlxm\") pod \"redhat-operators-6nb6f\" (UID: \"962501e5-88c6-4229-bb4f-91385cb11bf8\") " pod="openshift-marketplace/redhat-operators-6nb6f" Mar 18 13:34:46 crc kubenswrapper[4975]: I0318 13:34:46.220443 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6nb6f" Mar 18 13:34:46 crc kubenswrapper[4975]: I0318 13:34:46.676536 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6nb6f"] Mar 18 13:34:47 crc kubenswrapper[4975]: I0318 13:34:47.381939 4975 generic.go:334] "Generic (PLEG): container finished" podID="962501e5-88c6-4229-bb4f-91385cb11bf8" containerID="e4627fbd7b409f95f6461fda85e35b23191f88466f0df0ef3d257c9b24f363ad" exitCode=0 Mar 18 13:34:47 crc kubenswrapper[4975]: I0318 13:34:47.382053 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6nb6f" event={"ID":"962501e5-88c6-4229-bb4f-91385cb11bf8","Type":"ContainerDied","Data":"e4627fbd7b409f95f6461fda85e35b23191f88466f0df0ef3d257c9b24f363ad"} Mar 18 13:34:47 crc kubenswrapper[4975]: I0318 13:34:47.382195 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6nb6f" event={"ID":"962501e5-88c6-4229-bb4f-91385cb11bf8","Type":"ContainerStarted","Data":"9dbf28f038ea0d595efd571a87a8d9fc80af4549467c528690adc44caf75c820"} Mar 18 13:34:48 crc kubenswrapper[4975]: I0318 13:34:48.397430 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6nb6f" event={"ID":"962501e5-88c6-4229-bb4f-91385cb11bf8","Type":"ContainerStarted","Data":"9c3d2e4375e7d76f2ea5b85059eec16d73e4f88cefe60d6fabc52d3e03e05bcb"} Mar 18 13:34:49 crc kubenswrapper[4975]: I0318 13:34:49.408420 4975 generic.go:334] "Generic (PLEG): container finished" podID="962501e5-88c6-4229-bb4f-91385cb11bf8" containerID="9c3d2e4375e7d76f2ea5b85059eec16d73e4f88cefe60d6fabc52d3e03e05bcb" exitCode=0 Mar 18 13:34:49 crc kubenswrapper[4975]: I0318 13:34:49.408469 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6nb6f" 
event={"ID":"962501e5-88c6-4229-bb4f-91385cb11bf8","Type":"ContainerDied","Data":"9c3d2e4375e7d76f2ea5b85059eec16d73e4f88cefe60d6fabc52d3e03e05bcb"} Mar 18 13:34:50 crc kubenswrapper[4975]: I0318 13:34:50.276463 4975 scope.go:117] "RemoveContainer" containerID="cd987e02a90893894eaeb1af0cfcc7a0c8f674153c1d3e5a131ef8ceb75215b7" Mar 18 13:34:50 crc kubenswrapper[4975]: I0318 13:34:50.419137 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6nb6f" event={"ID":"962501e5-88c6-4229-bb4f-91385cb11bf8","Type":"ContainerStarted","Data":"c34a7813380d1fb5765459354cbc2a5e763de7c6afbed909c4904420b2ab864e"} Mar 18 13:34:50 crc kubenswrapper[4975]: I0318 13:34:50.441556 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6nb6f" podStartSLOduration=3.015350089 podStartE2EDuration="5.44153739s" podCreationTimestamp="2026-03-18 13:34:45 +0000 UTC" firstStartedPulling="2026-03-18 13:34:47.383830051 +0000 UTC m=+5073.098230630" lastFinishedPulling="2026-03-18 13:34:49.810017332 +0000 UTC m=+5075.524417931" observedRunningTime="2026-03-18 13:34:50.435418534 +0000 UTC m=+5076.149819123" watchObservedRunningTime="2026-03-18 13:34:50.44153739 +0000 UTC m=+5076.155937969" Mar 18 13:34:56 crc kubenswrapper[4975]: I0318 13:34:56.016324 4975 scope.go:117] "RemoveContainer" containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" Mar 18 13:34:56 crc kubenswrapper[4975]: E0318 13:34:56.017858 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:34:56 crc kubenswrapper[4975]: I0318 
13:34:56.221015 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6nb6f" Mar 18 13:34:56 crc kubenswrapper[4975]: I0318 13:34:56.221354 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6nb6f" Mar 18 13:34:56 crc kubenswrapper[4975]: I0318 13:34:56.714788 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6nb6f" Mar 18 13:34:56 crc kubenswrapper[4975]: I0318 13:34:56.766265 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6nb6f" Mar 18 13:34:56 crc kubenswrapper[4975]: I0318 13:34:56.957616 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6nb6f"] Mar 18 13:34:58 crc kubenswrapper[4975]: I0318 13:34:58.496568 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6nb6f" podUID="962501e5-88c6-4229-bb4f-91385cb11bf8" containerName="registry-server" containerID="cri-o://c34a7813380d1fb5765459354cbc2a5e763de7c6afbed909c4904420b2ab864e" gracePeriod=2 Mar 18 13:35:00 crc kubenswrapper[4975]: I0318 13:35:00.518420 4975 generic.go:334] "Generic (PLEG): container finished" podID="962501e5-88c6-4229-bb4f-91385cb11bf8" containerID="c34a7813380d1fb5765459354cbc2a5e763de7c6afbed909c4904420b2ab864e" exitCode=0 Mar 18 13:35:00 crc kubenswrapper[4975]: I0318 13:35:00.518617 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6nb6f" event={"ID":"962501e5-88c6-4229-bb4f-91385cb11bf8","Type":"ContainerDied","Data":"c34a7813380d1fb5765459354cbc2a5e763de7c6afbed909c4904420b2ab864e"} Mar 18 13:35:00 crc kubenswrapper[4975]: I0318 13:35:00.735953 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6nb6f" Mar 18 13:35:00 crc kubenswrapper[4975]: I0318 13:35:00.862843 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/962501e5-88c6-4229-bb4f-91385cb11bf8-catalog-content\") pod \"962501e5-88c6-4229-bb4f-91385cb11bf8\" (UID: \"962501e5-88c6-4229-bb4f-91385cb11bf8\") " Mar 18 13:35:00 crc kubenswrapper[4975]: I0318 13:35:00.863003 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghlxm\" (UniqueName: \"kubernetes.io/projected/962501e5-88c6-4229-bb4f-91385cb11bf8-kube-api-access-ghlxm\") pod \"962501e5-88c6-4229-bb4f-91385cb11bf8\" (UID: \"962501e5-88c6-4229-bb4f-91385cb11bf8\") " Mar 18 13:35:00 crc kubenswrapper[4975]: I0318 13:35:00.863045 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/962501e5-88c6-4229-bb4f-91385cb11bf8-utilities\") pod \"962501e5-88c6-4229-bb4f-91385cb11bf8\" (UID: \"962501e5-88c6-4229-bb4f-91385cb11bf8\") " Mar 18 13:35:00 crc kubenswrapper[4975]: I0318 13:35:00.864048 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/962501e5-88c6-4229-bb4f-91385cb11bf8-utilities" (OuterVolumeSpecName: "utilities") pod "962501e5-88c6-4229-bb4f-91385cb11bf8" (UID: "962501e5-88c6-4229-bb4f-91385cb11bf8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:35:00 crc kubenswrapper[4975]: I0318 13:35:00.868269 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/962501e5-88c6-4229-bb4f-91385cb11bf8-kube-api-access-ghlxm" (OuterVolumeSpecName: "kube-api-access-ghlxm") pod "962501e5-88c6-4229-bb4f-91385cb11bf8" (UID: "962501e5-88c6-4229-bb4f-91385cb11bf8"). InnerVolumeSpecName "kube-api-access-ghlxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:35:00 crc kubenswrapper[4975]: I0318 13:35:00.965722 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghlxm\" (UniqueName: \"kubernetes.io/projected/962501e5-88c6-4229-bb4f-91385cb11bf8-kube-api-access-ghlxm\") on node \"crc\" DevicePath \"\"" Mar 18 13:35:00 crc kubenswrapper[4975]: I0318 13:35:00.966077 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/962501e5-88c6-4229-bb4f-91385cb11bf8-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:35:00 crc kubenswrapper[4975]: I0318 13:35:00.987739 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/962501e5-88c6-4229-bb4f-91385cb11bf8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "962501e5-88c6-4229-bb4f-91385cb11bf8" (UID: "962501e5-88c6-4229-bb4f-91385cb11bf8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:35:01 crc kubenswrapper[4975]: I0318 13:35:01.068578 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/962501e5-88c6-4229-bb4f-91385cb11bf8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:35:01 crc kubenswrapper[4975]: I0318 13:35:01.530926 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6nb6f" event={"ID":"962501e5-88c6-4229-bb4f-91385cb11bf8","Type":"ContainerDied","Data":"9dbf28f038ea0d595efd571a87a8d9fc80af4549467c528690adc44caf75c820"} Mar 18 13:35:01 crc kubenswrapper[4975]: I0318 13:35:01.530990 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6nb6f" Mar 18 13:35:01 crc kubenswrapper[4975]: I0318 13:35:01.531011 4975 scope.go:117] "RemoveContainer" containerID="c34a7813380d1fb5765459354cbc2a5e763de7c6afbed909c4904420b2ab864e" Mar 18 13:35:01 crc kubenswrapper[4975]: I0318 13:35:01.557992 4975 scope.go:117] "RemoveContainer" containerID="9c3d2e4375e7d76f2ea5b85059eec16d73e4f88cefe60d6fabc52d3e03e05bcb" Mar 18 13:35:01 crc kubenswrapper[4975]: I0318 13:35:01.558048 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6nb6f"] Mar 18 13:35:01 crc kubenswrapper[4975]: I0318 13:35:01.567564 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6nb6f"] Mar 18 13:35:01 crc kubenswrapper[4975]: I0318 13:35:01.581987 4975 scope.go:117] "RemoveContainer" containerID="e4627fbd7b409f95f6461fda85e35b23191f88466f0df0ef3d257c9b24f363ad" Mar 18 13:35:03 crc kubenswrapper[4975]: I0318 13:35:03.029038 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="962501e5-88c6-4229-bb4f-91385cb11bf8" path="/var/lib/kubelet/pods/962501e5-88c6-4229-bb4f-91385cb11bf8/volumes" Mar 18 13:35:08 crc kubenswrapper[4975]: I0318 13:35:08.017132 4975 scope.go:117] "RemoveContainer" containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" Mar 18 13:35:08 crc kubenswrapper[4975]: E0318 13:35:08.017690 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:35:19 crc kubenswrapper[4975]: I0318 13:35:19.016773 4975 scope.go:117] "RemoveContainer" 
containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" Mar 18 13:35:19 crc kubenswrapper[4975]: E0318 13:35:19.017567 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:35:30 crc kubenswrapper[4975]: I0318 13:35:30.017135 4975 scope.go:117] "RemoveContainer" containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" Mar 18 13:35:30 crc kubenswrapper[4975]: E0318 13:35:30.017832 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:35:42 crc kubenswrapper[4975]: I0318 13:35:42.017228 4975 scope.go:117] "RemoveContainer" containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" Mar 18 13:35:42 crc kubenswrapper[4975]: E0318 13:35:42.018062 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:35:55 crc kubenswrapper[4975]: I0318 13:35:55.023677 4975 scope.go:117] 
"RemoveContainer" containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" Mar 18 13:35:55 crc kubenswrapper[4975]: E0318 13:35:55.024464 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:36:00 crc kubenswrapper[4975]: I0318 13:36:00.159967 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564016-sbwv4"] Mar 18 13:36:00 crc kubenswrapper[4975]: E0318 13:36:00.160744 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="962501e5-88c6-4229-bb4f-91385cb11bf8" containerName="extract-utilities" Mar 18 13:36:00 crc kubenswrapper[4975]: I0318 13:36:00.160756 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="962501e5-88c6-4229-bb4f-91385cb11bf8" containerName="extract-utilities" Mar 18 13:36:00 crc kubenswrapper[4975]: E0318 13:36:00.160771 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="962501e5-88c6-4229-bb4f-91385cb11bf8" containerName="registry-server" Mar 18 13:36:00 crc kubenswrapper[4975]: I0318 13:36:00.160777 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="962501e5-88c6-4229-bb4f-91385cb11bf8" containerName="registry-server" Mar 18 13:36:00 crc kubenswrapper[4975]: E0318 13:36:00.160798 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="962501e5-88c6-4229-bb4f-91385cb11bf8" containerName="extract-content" Mar 18 13:36:00 crc kubenswrapper[4975]: I0318 13:36:00.160803 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="962501e5-88c6-4229-bb4f-91385cb11bf8" containerName="extract-content" Mar 18 13:36:00 crc kubenswrapper[4975]: 
I0318 13:36:00.161108 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="962501e5-88c6-4229-bb4f-91385cb11bf8" containerName="registry-server" Mar 18 13:36:00 crc kubenswrapper[4975]: I0318 13:36:00.161715 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564016-sbwv4" Mar 18 13:36:00 crc kubenswrapper[4975]: I0318 13:36:00.164989 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 13:36:00 crc kubenswrapper[4975]: I0318 13:36:00.165096 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:36:00 crc kubenswrapper[4975]: I0318 13:36:00.168907 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:36:00 crc kubenswrapper[4975]: I0318 13:36:00.207687 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564016-sbwv4"] Mar 18 13:36:00 crc kubenswrapper[4975]: I0318 13:36:00.303892 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvlgd\" (UniqueName: \"kubernetes.io/projected/759d4116-47b2-4603-87a7-26a7cb973067-kube-api-access-qvlgd\") pod \"auto-csr-approver-29564016-sbwv4\" (UID: \"759d4116-47b2-4603-87a7-26a7cb973067\") " pod="openshift-infra/auto-csr-approver-29564016-sbwv4" Mar 18 13:36:00 crc kubenswrapper[4975]: I0318 13:36:00.404855 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvlgd\" (UniqueName: \"kubernetes.io/projected/759d4116-47b2-4603-87a7-26a7cb973067-kube-api-access-qvlgd\") pod \"auto-csr-approver-29564016-sbwv4\" (UID: \"759d4116-47b2-4603-87a7-26a7cb973067\") " pod="openshift-infra/auto-csr-approver-29564016-sbwv4" Mar 18 13:36:00 crc kubenswrapper[4975]: I0318 13:36:00.432934 4975 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvlgd\" (UniqueName: \"kubernetes.io/projected/759d4116-47b2-4603-87a7-26a7cb973067-kube-api-access-qvlgd\") pod \"auto-csr-approver-29564016-sbwv4\" (UID: \"759d4116-47b2-4603-87a7-26a7cb973067\") " pod="openshift-infra/auto-csr-approver-29564016-sbwv4" Mar 18 13:36:00 crc kubenswrapper[4975]: I0318 13:36:00.481299 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564016-sbwv4" Mar 18 13:36:00 crc kubenswrapper[4975]: I0318 13:36:00.905523 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564016-sbwv4"] Mar 18 13:36:00 crc kubenswrapper[4975]: W0318 13:36:00.911435 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod759d4116_47b2_4603_87a7_26a7cb973067.slice/crio-81675eec30cbb29ba3d44e01d731d852c45088b32e1f396288b412472a7f8906 WatchSource:0}: Error finding container 81675eec30cbb29ba3d44e01d731d852c45088b32e1f396288b412472a7f8906: Status 404 returned error can't find the container with id 81675eec30cbb29ba3d44e01d731d852c45088b32e1f396288b412472a7f8906 Mar 18 13:36:01 crc kubenswrapper[4975]: I0318 13:36:01.081368 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564016-sbwv4" event={"ID":"759d4116-47b2-4603-87a7-26a7cb973067","Type":"ContainerStarted","Data":"81675eec30cbb29ba3d44e01d731d852c45088b32e1f396288b412472a7f8906"} Mar 18 13:36:03 crc kubenswrapper[4975]: I0318 13:36:03.101462 4975 generic.go:334] "Generic (PLEG): container finished" podID="759d4116-47b2-4603-87a7-26a7cb973067" containerID="005a1e7218798b53d7ca57892234586364e80ea09dfcacd41f5b81a465fe65dd" exitCode=0 Mar 18 13:36:03 crc kubenswrapper[4975]: I0318 13:36:03.101561 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29564016-sbwv4" event={"ID":"759d4116-47b2-4603-87a7-26a7cb973067","Type":"ContainerDied","Data":"005a1e7218798b53d7ca57892234586364e80ea09dfcacd41f5b81a465fe65dd"} Mar 18 13:36:04 crc kubenswrapper[4975]: I0318 13:36:04.541844 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564016-sbwv4" Mar 18 13:36:04 crc kubenswrapper[4975]: I0318 13:36:04.585474 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvlgd\" (UniqueName: \"kubernetes.io/projected/759d4116-47b2-4603-87a7-26a7cb973067-kube-api-access-qvlgd\") pod \"759d4116-47b2-4603-87a7-26a7cb973067\" (UID: \"759d4116-47b2-4603-87a7-26a7cb973067\") " Mar 18 13:36:04 crc kubenswrapper[4975]: I0318 13:36:04.594259 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/759d4116-47b2-4603-87a7-26a7cb973067-kube-api-access-qvlgd" (OuterVolumeSpecName: "kube-api-access-qvlgd") pod "759d4116-47b2-4603-87a7-26a7cb973067" (UID: "759d4116-47b2-4603-87a7-26a7cb973067"). InnerVolumeSpecName "kube-api-access-qvlgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:36:04 crc kubenswrapper[4975]: I0318 13:36:04.687001 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvlgd\" (UniqueName: \"kubernetes.io/projected/759d4116-47b2-4603-87a7-26a7cb973067-kube-api-access-qvlgd\") on node \"crc\" DevicePath \"\"" Mar 18 13:36:05 crc kubenswrapper[4975]: I0318 13:36:05.126130 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564016-sbwv4" event={"ID":"759d4116-47b2-4603-87a7-26a7cb973067","Type":"ContainerDied","Data":"81675eec30cbb29ba3d44e01d731d852c45088b32e1f396288b412472a7f8906"} Mar 18 13:36:05 crc kubenswrapper[4975]: I0318 13:36:05.126185 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81675eec30cbb29ba3d44e01d731d852c45088b32e1f396288b412472a7f8906" Mar 18 13:36:05 crc kubenswrapper[4975]: I0318 13:36:05.126272 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564016-sbwv4" Mar 18 13:36:05 crc kubenswrapper[4975]: I0318 13:36:05.607657 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564010-2q2xg"] Mar 18 13:36:05 crc kubenswrapper[4975]: I0318 13:36:05.617334 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564010-2q2xg"] Mar 18 13:36:07 crc kubenswrapper[4975]: I0318 13:36:07.031098 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da209d13-af65-4c07-8823-fcce63dc995a" path="/var/lib/kubelet/pods/da209d13-af65-4c07-8823-fcce63dc995a/volumes" Mar 18 13:36:08 crc kubenswrapper[4975]: I0318 13:36:08.016605 4975 scope.go:117] "RemoveContainer" containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" Mar 18 13:36:08 crc kubenswrapper[4975]: E0318 13:36:08.017288 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:36:21 crc kubenswrapper[4975]: I0318 13:36:21.017285 4975 scope.go:117] "RemoveContainer" containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" Mar 18 13:36:21 crc kubenswrapper[4975]: E0318 13:36:21.019395 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:36:36 crc kubenswrapper[4975]: I0318 13:36:36.016613 4975 scope.go:117] "RemoveContainer" containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" Mar 18 13:36:36 crc kubenswrapper[4975]: E0318 13:36:36.018512 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:36:48 crc kubenswrapper[4975]: I0318 13:36:48.017194 4975 scope.go:117] "RemoveContainer" containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" Mar 18 13:36:48 crc kubenswrapper[4975]: E0318 13:36:48.019582 4975 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:36:50 crc kubenswrapper[4975]: I0318 13:36:50.400291 4975 scope.go:117] "RemoveContainer" containerID="9c8f6fe8823b8122357b312374376a51162d499f12728901d20dcef9bc9e3690" Mar 18 13:37:03 crc kubenswrapper[4975]: I0318 13:37:03.017156 4975 scope.go:117] "RemoveContainer" containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" Mar 18 13:37:03 crc kubenswrapper[4975]: E0318 13:37:03.017979 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:37:17 crc kubenswrapper[4975]: I0318 13:37:17.018270 4975 scope.go:117] "RemoveContainer" containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" Mar 18 13:37:17 crc kubenswrapper[4975]: E0318 13:37:17.019410 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:37:32 crc kubenswrapper[4975]: I0318 13:37:32.016645 4975 scope.go:117] 
"RemoveContainer" containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" Mar 18 13:37:32 crc kubenswrapper[4975]: E0318 13:37:32.017381 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:37:43 crc kubenswrapper[4975]: I0318 13:37:43.016935 4975 scope.go:117] "RemoveContainer" containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" Mar 18 13:37:43 crc kubenswrapper[4975]: E0318 13:37:43.017705 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:37:58 crc kubenswrapper[4975]: I0318 13:37:58.016649 4975 scope.go:117] "RemoveContainer" containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" Mar 18 13:37:58 crc kubenswrapper[4975]: E0318 13:37:58.017344 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:38:00 crc kubenswrapper[4975]: I0318 13:38:00.147284 
4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564018-pgqzn"] Mar 18 13:38:00 crc kubenswrapper[4975]: E0318 13:38:00.148010 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="759d4116-47b2-4603-87a7-26a7cb973067" containerName="oc" Mar 18 13:38:00 crc kubenswrapper[4975]: I0318 13:38:00.148024 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="759d4116-47b2-4603-87a7-26a7cb973067" containerName="oc" Mar 18 13:38:00 crc kubenswrapper[4975]: I0318 13:38:00.148223 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="759d4116-47b2-4603-87a7-26a7cb973067" containerName="oc" Mar 18 13:38:00 crc kubenswrapper[4975]: I0318 13:38:00.148799 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564018-pgqzn" Mar 18 13:38:00 crc kubenswrapper[4975]: I0318 13:38:00.151986 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:38:00 crc kubenswrapper[4975]: I0318 13:38:00.153082 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:38:00 crc kubenswrapper[4975]: I0318 13:38:00.153331 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 13:38:00 crc kubenswrapper[4975]: I0318 13:38:00.161625 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564018-pgqzn"] Mar 18 13:38:00 crc kubenswrapper[4975]: I0318 13:38:00.254416 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9xs5\" (UniqueName: \"kubernetes.io/projected/c8a6a6fc-a2e8-44dc-ba80-6c63e9e74ad8-kube-api-access-q9xs5\") pod \"auto-csr-approver-29564018-pgqzn\" (UID: \"c8a6a6fc-a2e8-44dc-ba80-6c63e9e74ad8\") " 
pod="openshift-infra/auto-csr-approver-29564018-pgqzn" Mar 18 13:38:00 crc kubenswrapper[4975]: I0318 13:38:00.356496 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9xs5\" (UniqueName: \"kubernetes.io/projected/c8a6a6fc-a2e8-44dc-ba80-6c63e9e74ad8-kube-api-access-q9xs5\") pod \"auto-csr-approver-29564018-pgqzn\" (UID: \"c8a6a6fc-a2e8-44dc-ba80-6c63e9e74ad8\") " pod="openshift-infra/auto-csr-approver-29564018-pgqzn" Mar 18 13:38:00 crc kubenswrapper[4975]: I0318 13:38:00.377527 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9xs5\" (UniqueName: \"kubernetes.io/projected/c8a6a6fc-a2e8-44dc-ba80-6c63e9e74ad8-kube-api-access-q9xs5\") pod \"auto-csr-approver-29564018-pgqzn\" (UID: \"c8a6a6fc-a2e8-44dc-ba80-6c63e9e74ad8\") " pod="openshift-infra/auto-csr-approver-29564018-pgqzn" Mar 18 13:38:00 crc kubenswrapper[4975]: I0318 13:38:00.471344 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564018-pgqzn" Mar 18 13:38:01 crc kubenswrapper[4975]: I0318 13:38:01.008824 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564018-pgqzn"] Mar 18 13:38:01 crc kubenswrapper[4975]: I0318 13:38:01.014775 4975 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 13:38:01 crc kubenswrapper[4975]: I0318 13:38:01.160559 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564018-pgqzn" event={"ID":"c8a6a6fc-a2e8-44dc-ba80-6c63e9e74ad8","Type":"ContainerStarted","Data":"69e13395bf022f208a2e63ff0703ef74559ffe8f79c510a064d7fdf8f5c6e8d3"} Mar 18 13:38:03 crc kubenswrapper[4975]: I0318 13:38:03.179937 4975 generic.go:334] "Generic (PLEG): container finished" podID="c8a6a6fc-a2e8-44dc-ba80-6c63e9e74ad8" containerID="f0d80b42771e33fac4c6daeb9c7eea4caf5c8d466617b289ae41f4b50eb9a5c6" exitCode=0 Mar 
18 13:38:03 crc kubenswrapper[4975]: I0318 13:38:03.180370 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564018-pgqzn" event={"ID":"c8a6a6fc-a2e8-44dc-ba80-6c63e9e74ad8","Type":"ContainerDied","Data":"f0d80b42771e33fac4c6daeb9c7eea4caf5c8d466617b289ae41f4b50eb9a5c6"} Mar 18 13:38:04 crc kubenswrapper[4975]: I0318 13:38:04.530683 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564018-pgqzn" Mar 18 13:38:04 crc kubenswrapper[4975]: I0318 13:38:04.636595 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9xs5\" (UniqueName: \"kubernetes.io/projected/c8a6a6fc-a2e8-44dc-ba80-6c63e9e74ad8-kube-api-access-q9xs5\") pod \"c8a6a6fc-a2e8-44dc-ba80-6c63e9e74ad8\" (UID: \"c8a6a6fc-a2e8-44dc-ba80-6c63e9e74ad8\") " Mar 18 13:38:04 crc kubenswrapper[4975]: I0318 13:38:04.643469 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8a6a6fc-a2e8-44dc-ba80-6c63e9e74ad8-kube-api-access-q9xs5" (OuterVolumeSpecName: "kube-api-access-q9xs5") pod "c8a6a6fc-a2e8-44dc-ba80-6c63e9e74ad8" (UID: "c8a6a6fc-a2e8-44dc-ba80-6c63e9e74ad8"). InnerVolumeSpecName "kube-api-access-q9xs5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:38:04 crc kubenswrapper[4975]: I0318 13:38:04.738395 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9xs5\" (UniqueName: \"kubernetes.io/projected/c8a6a6fc-a2e8-44dc-ba80-6c63e9e74ad8-kube-api-access-q9xs5\") on node \"crc\" DevicePath \"\"" Mar 18 13:38:05 crc kubenswrapper[4975]: I0318 13:38:05.196148 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564018-pgqzn" event={"ID":"c8a6a6fc-a2e8-44dc-ba80-6c63e9e74ad8","Type":"ContainerDied","Data":"69e13395bf022f208a2e63ff0703ef74559ffe8f79c510a064d7fdf8f5c6e8d3"} Mar 18 13:38:05 crc kubenswrapper[4975]: I0318 13:38:05.196207 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69e13395bf022f208a2e63ff0703ef74559ffe8f79c510a064d7fdf8f5c6e8d3" Mar 18 13:38:05 crc kubenswrapper[4975]: I0318 13:38:05.196285 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564018-pgqzn" Mar 18 13:38:05 crc kubenswrapper[4975]: I0318 13:38:05.625212 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564012-czjk6"] Mar 18 13:38:05 crc kubenswrapper[4975]: I0318 13:38:05.635121 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564012-czjk6"] Mar 18 13:38:07 crc kubenswrapper[4975]: I0318 13:38:07.028903 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e17f683f-43ee-45bf-9e94-5f550c2d8449" path="/var/lib/kubelet/pods/e17f683f-43ee-45bf-9e94-5f550c2d8449/volumes" Mar 18 13:38:11 crc kubenswrapper[4975]: I0318 13:38:11.016324 4975 scope.go:117] "RemoveContainer" containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" Mar 18 13:38:11 crc kubenswrapper[4975]: E0318 13:38:11.017227 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:38:23 crc kubenswrapper[4975]: I0318 13:38:23.036058 4975 scope.go:117] "RemoveContainer" containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" Mar 18 13:38:23 crc kubenswrapper[4975]: E0318 13:38:23.039124 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:38:31 crc kubenswrapper[4975]: I0318 13:38:31.306308 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zv9b4/must-gather-njdn8"] Mar 18 13:38:31 crc kubenswrapper[4975]: E0318 13:38:31.307368 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a6a6fc-a2e8-44dc-ba80-6c63e9e74ad8" containerName="oc" Mar 18 13:38:31 crc kubenswrapper[4975]: I0318 13:38:31.307386 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a6a6fc-a2e8-44dc-ba80-6c63e9e74ad8" containerName="oc" Mar 18 13:38:31 crc kubenswrapper[4975]: I0318 13:38:31.307618 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8a6a6fc-a2e8-44dc-ba80-6c63e9e74ad8" containerName="oc" Mar 18 13:38:31 crc kubenswrapper[4975]: I0318 13:38:31.308777 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zv9b4/must-gather-njdn8" Mar 18 13:38:31 crc kubenswrapper[4975]: I0318 13:38:31.312706 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-zv9b4"/"openshift-service-ca.crt" Mar 18 13:38:31 crc kubenswrapper[4975]: I0318 13:38:31.320578 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-zv9b4"/"kube-root-ca.crt" Mar 18 13:38:31 crc kubenswrapper[4975]: I0318 13:38:31.334418 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zv9b4/must-gather-njdn8"] Mar 18 13:38:31 crc kubenswrapper[4975]: I0318 13:38:31.490343 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4-must-gather-output\") pod \"must-gather-njdn8\" (UID: \"c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4\") " pod="openshift-must-gather-zv9b4/must-gather-njdn8" Mar 18 13:38:31 crc kubenswrapper[4975]: I0318 13:38:31.490414 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkwgq\" (UniqueName: \"kubernetes.io/projected/c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4-kube-api-access-fkwgq\") pod \"must-gather-njdn8\" (UID: \"c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4\") " pod="openshift-must-gather-zv9b4/must-gather-njdn8" Mar 18 13:38:31 crc kubenswrapper[4975]: I0318 13:38:31.592556 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4-must-gather-output\") pod \"must-gather-njdn8\" (UID: \"c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4\") " pod="openshift-must-gather-zv9b4/must-gather-njdn8" Mar 18 13:38:31 crc kubenswrapper[4975]: I0318 13:38:31.592633 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fkwgq\" (UniqueName: \"kubernetes.io/projected/c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4-kube-api-access-fkwgq\") pod \"must-gather-njdn8\" (UID: \"c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4\") " pod="openshift-must-gather-zv9b4/must-gather-njdn8" Mar 18 13:38:31 crc kubenswrapper[4975]: I0318 13:38:31.593357 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4-must-gather-output\") pod \"must-gather-njdn8\" (UID: \"c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4\") " pod="openshift-must-gather-zv9b4/must-gather-njdn8" Mar 18 13:38:31 crc kubenswrapper[4975]: I0318 13:38:31.962733 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkwgq\" (UniqueName: \"kubernetes.io/projected/c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4-kube-api-access-fkwgq\") pod \"must-gather-njdn8\" (UID: \"c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4\") " pod="openshift-must-gather-zv9b4/must-gather-njdn8" Mar 18 13:38:32 crc kubenswrapper[4975]: I0318 13:38:32.229337 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zv9b4/must-gather-njdn8" Mar 18 13:38:32 crc kubenswrapper[4975]: I0318 13:38:32.719978 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zv9b4/must-gather-njdn8"] Mar 18 13:38:33 crc kubenswrapper[4975]: I0318 13:38:33.454407 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zv9b4/must-gather-njdn8" event={"ID":"c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4","Type":"ContainerStarted","Data":"61c6cbb8c02a6c6c357362673030a1f20c1a7922c8b53fdc57c3a78ae10deefb"} Mar 18 13:38:35 crc kubenswrapper[4975]: I0318 13:38:35.026572 4975 scope.go:117] "RemoveContainer" containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" Mar 18 13:38:35 crc kubenswrapper[4975]: E0318 13:38:35.028109 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:38:40 crc kubenswrapper[4975]: I0318 13:38:40.554851 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zv9b4/must-gather-njdn8" event={"ID":"c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4","Type":"ContainerStarted","Data":"9e11e0919b24d2816d47f9184abb539d73c7e076ad371ef0023458a28c5769ef"} Mar 18 13:38:41 crc kubenswrapper[4975]: I0318 13:38:41.564771 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zv9b4/must-gather-njdn8" event={"ID":"c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4","Type":"ContainerStarted","Data":"100fcbe7cf8ba8b57e0f9cd5862ddae41cd84b074891bb155a1775335fbc8db8"} Mar 18 13:38:41 crc kubenswrapper[4975]: I0318 13:38:41.582025 4975 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-zv9b4/must-gather-njdn8" podStartSLOduration=3.227775543 podStartE2EDuration="10.581998344s" podCreationTimestamp="2026-03-18 13:38:31 +0000 UTC" firstStartedPulling="2026-03-18 13:38:32.701512694 +0000 UTC m=+5298.415913273" lastFinishedPulling="2026-03-18 13:38:40.055735495 +0000 UTC m=+5305.770136074" observedRunningTime="2026-03-18 13:38:41.580915885 +0000 UTC m=+5307.295316484" watchObservedRunningTime="2026-03-18 13:38:41.581998344 +0000 UTC m=+5307.296398923" Mar 18 13:38:44 crc kubenswrapper[4975]: I0318 13:38:44.947777 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zv9b4/crc-debug-x5bkc"] Mar 18 13:38:44 crc kubenswrapper[4975]: I0318 13:38:44.949707 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zv9b4/crc-debug-x5bkc" Mar 18 13:38:44 crc kubenswrapper[4975]: I0318 13:38:44.952353 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-zv9b4"/"default-dockercfg-cgpdh" Mar 18 13:38:44 crc kubenswrapper[4975]: I0318 13:38:44.992261 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcwdh\" (UniqueName: \"kubernetes.io/projected/93e2dd16-76d9-4488-a25c-04c4f5947075-kube-api-access-dcwdh\") pod \"crc-debug-x5bkc\" (UID: \"93e2dd16-76d9-4488-a25c-04c4f5947075\") " pod="openshift-must-gather-zv9b4/crc-debug-x5bkc" Mar 18 13:38:44 crc kubenswrapper[4975]: I0318 13:38:44.992370 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93e2dd16-76d9-4488-a25c-04c4f5947075-host\") pod \"crc-debug-x5bkc\" (UID: \"93e2dd16-76d9-4488-a25c-04c4f5947075\") " pod="openshift-must-gather-zv9b4/crc-debug-x5bkc" Mar 18 13:38:45 crc kubenswrapper[4975]: I0318 13:38:45.094362 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host\" (UniqueName: \"kubernetes.io/host-path/93e2dd16-76d9-4488-a25c-04c4f5947075-host\") pod \"crc-debug-x5bkc\" (UID: \"93e2dd16-76d9-4488-a25c-04c4f5947075\") " pod="openshift-must-gather-zv9b4/crc-debug-x5bkc" Mar 18 13:38:45 crc kubenswrapper[4975]: I0318 13:38:45.094595 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcwdh\" (UniqueName: \"kubernetes.io/projected/93e2dd16-76d9-4488-a25c-04c4f5947075-kube-api-access-dcwdh\") pod \"crc-debug-x5bkc\" (UID: \"93e2dd16-76d9-4488-a25c-04c4f5947075\") " pod="openshift-must-gather-zv9b4/crc-debug-x5bkc" Mar 18 13:38:45 crc kubenswrapper[4975]: I0318 13:38:45.095032 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93e2dd16-76d9-4488-a25c-04c4f5947075-host\") pod \"crc-debug-x5bkc\" (UID: \"93e2dd16-76d9-4488-a25c-04c4f5947075\") " pod="openshift-must-gather-zv9b4/crc-debug-x5bkc" Mar 18 13:38:45 crc kubenswrapper[4975]: I0318 13:38:45.124148 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcwdh\" (UniqueName: \"kubernetes.io/projected/93e2dd16-76d9-4488-a25c-04c4f5947075-kube-api-access-dcwdh\") pod \"crc-debug-x5bkc\" (UID: \"93e2dd16-76d9-4488-a25c-04c4f5947075\") " pod="openshift-must-gather-zv9b4/crc-debug-x5bkc" Mar 18 13:38:45 crc kubenswrapper[4975]: I0318 13:38:45.270153 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zv9b4/crc-debug-x5bkc" Mar 18 13:38:45 crc kubenswrapper[4975]: W0318 13:38:45.326007 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93e2dd16_76d9_4488_a25c_04c4f5947075.slice/crio-8381f06e27f15829435677f4a12d43b4f0e028c4987c9ee10671366937d88357 WatchSource:0}: Error finding container 8381f06e27f15829435677f4a12d43b4f0e028c4987c9ee10671366937d88357: Status 404 returned error can't find the container with id 8381f06e27f15829435677f4a12d43b4f0e028c4987c9ee10671366937d88357 Mar 18 13:38:45 crc kubenswrapper[4975]: I0318 13:38:45.598419 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zv9b4/crc-debug-x5bkc" event={"ID":"93e2dd16-76d9-4488-a25c-04c4f5947075","Type":"ContainerStarted","Data":"8381f06e27f15829435677f4a12d43b4f0e028c4987c9ee10671366937d88357"} Mar 18 13:38:49 crc kubenswrapper[4975]: I0318 13:38:49.016172 4975 scope.go:117] "RemoveContainer" containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" Mar 18 13:38:49 crc kubenswrapper[4975]: E0318 13:38:49.016955 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:38:50 crc kubenswrapper[4975]: I0318 13:38:50.552000 4975 scope.go:117] "RemoveContainer" containerID="0c0e58f0585a4ab9a189c65a28ffd44118e9fd1bd2ba9494af1f4b003c11cdca" Mar 18 13:38:54 crc kubenswrapper[4975]: I0318 13:38:54.691259 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zv9b4/crc-debug-x5bkc" 
event={"ID":"93e2dd16-76d9-4488-a25c-04c4f5947075","Type":"ContainerStarted","Data":"688923cd39a233f335ad5d00f585bd1c4e58d71015c82e2a7b683ac73969ffbd"} Mar 18 13:38:54 crc kubenswrapper[4975]: I0318 13:38:54.706232 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zv9b4/crc-debug-x5bkc" podStartSLOduration=1.729890969 podStartE2EDuration="10.70620981s" podCreationTimestamp="2026-03-18 13:38:44 +0000 UTC" firstStartedPulling="2026-03-18 13:38:45.33219791 +0000 UTC m=+5311.046598489" lastFinishedPulling="2026-03-18 13:38:54.308516741 +0000 UTC m=+5320.022917330" observedRunningTime="2026-03-18 13:38:54.705582053 +0000 UTC m=+5320.419982632" watchObservedRunningTime="2026-03-18 13:38:54.70620981 +0000 UTC m=+5320.420610389" Mar 18 13:39:00 crc kubenswrapper[4975]: I0318 13:39:00.016768 4975 scope.go:117] "RemoveContainer" containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" Mar 18 13:39:01 crc kubenswrapper[4975]: I0318 13:39:01.754812 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerStarted","Data":"e4090f30bdc2c45c39660d9b590c2d973aeca4e258919be7c997a1fe3dc3e461"} Mar 18 13:39:09 crc kubenswrapper[4975]: I0318 13:39:09.823659 4975 generic.go:334] "Generic (PLEG): container finished" podID="93e2dd16-76d9-4488-a25c-04c4f5947075" containerID="688923cd39a233f335ad5d00f585bd1c4e58d71015c82e2a7b683ac73969ffbd" exitCode=0 Mar 18 13:39:09 crc kubenswrapper[4975]: I0318 13:39:09.823741 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zv9b4/crc-debug-x5bkc" event={"ID":"93e2dd16-76d9-4488-a25c-04c4f5947075","Type":"ContainerDied","Data":"688923cd39a233f335ad5d00f585bd1c4e58d71015c82e2a7b683ac73969ffbd"} Mar 18 13:39:11 crc kubenswrapper[4975]: I0318 13:39:11.158955 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zv9b4/crc-debug-x5bkc" Mar 18 13:39:11 crc kubenswrapper[4975]: I0318 13:39:11.192759 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zv9b4/crc-debug-x5bkc"] Mar 18 13:39:11 crc kubenswrapper[4975]: I0318 13:39:11.200848 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zv9b4/crc-debug-x5bkc"] Mar 18 13:39:11 crc kubenswrapper[4975]: I0318 13:39:11.297381 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93e2dd16-76d9-4488-a25c-04c4f5947075-host\") pod \"93e2dd16-76d9-4488-a25c-04c4f5947075\" (UID: \"93e2dd16-76d9-4488-a25c-04c4f5947075\") " Mar 18 13:39:11 crc kubenswrapper[4975]: I0318 13:39:11.297527 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93e2dd16-76d9-4488-a25c-04c4f5947075-host" (OuterVolumeSpecName: "host") pod "93e2dd16-76d9-4488-a25c-04c4f5947075" (UID: "93e2dd16-76d9-4488-a25c-04c4f5947075"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:39:11 crc kubenswrapper[4975]: I0318 13:39:11.297799 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcwdh\" (UniqueName: \"kubernetes.io/projected/93e2dd16-76d9-4488-a25c-04c4f5947075-kube-api-access-dcwdh\") pod \"93e2dd16-76d9-4488-a25c-04c4f5947075\" (UID: \"93e2dd16-76d9-4488-a25c-04c4f5947075\") " Mar 18 13:39:11 crc kubenswrapper[4975]: I0318 13:39:11.298856 4975 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93e2dd16-76d9-4488-a25c-04c4f5947075-host\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:11 crc kubenswrapper[4975]: I0318 13:39:11.306278 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93e2dd16-76d9-4488-a25c-04c4f5947075-kube-api-access-dcwdh" (OuterVolumeSpecName: "kube-api-access-dcwdh") pod "93e2dd16-76d9-4488-a25c-04c4f5947075" (UID: "93e2dd16-76d9-4488-a25c-04c4f5947075"). InnerVolumeSpecName "kube-api-access-dcwdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:39:11 crc kubenswrapper[4975]: I0318 13:39:11.401401 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcwdh\" (UniqueName: \"kubernetes.io/projected/93e2dd16-76d9-4488-a25c-04c4f5947075-kube-api-access-dcwdh\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:11 crc kubenswrapper[4975]: I0318 13:39:11.847495 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8381f06e27f15829435677f4a12d43b4f0e028c4987c9ee10671366937d88357" Mar 18 13:39:11 crc kubenswrapper[4975]: I0318 13:39:11.847555 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zv9b4/crc-debug-x5bkc" Mar 18 13:39:12 crc kubenswrapper[4975]: I0318 13:39:12.391637 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zv9b4/crc-debug-r8ssk"] Mar 18 13:39:12 crc kubenswrapper[4975]: E0318 13:39:12.392826 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93e2dd16-76d9-4488-a25c-04c4f5947075" containerName="container-00" Mar 18 13:39:12 crc kubenswrapper[4975]: I0318 13:39:12.392849 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="93e2dd16-76d9-4488-a25c-04c4f5947075" containerName="container-00" Mar 18 13:39:12 crc kubenswrapper[4975]: I0318 13:39:12.393440 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="93e2dd16-76d9-4488-a25c-04c4f5947075" containerName="container-00" Mar 18 13:39:12 crc kubenswrapper[4975]: I0318 13:39:12.394786 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zv9b4/crc-debug-r8ssk" Mar 18 13:39:12 crc kubenswrapper[4975]: I0318 13:39:12.399570 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-zv9b4"/"default-dockercfg-cgpdh" Mar 18 13:39:12 crc kubenswrapper[4975]: I0318 13:39:12.520535 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e3edd3d8-e83a-42be-93bc-9d913cb73105-host\") pod \"crc-debug-r8ssk\" (UID: \"e3edd3d8-e83a-42be-93bc-9d913cb73105\") " pod="openshift-must-gather-zv9b4/crc-debug-r8ssk" Mar 18 13:39:12 crc kubenswrapper[4975]: I0318 13:39:12.520603 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdpsp\" (UniqueName: \"kubernetes.io/projected/e3edd3d8-e83a-42be-93bc-9d913cb73105-kube-api-access-fdpsp\") pod \"crc-debug-r8ssk\" (UID: \"e3edd3d8-e83a-42be-93bc-9d913cb73105\") " 
pod="openshift-must-gather-zv9b4/crc-debug-r8ssk" Mar 18 13:39:12 crc kubenswrapper[4975]: I0318 13:39:12.622910 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e3edd3d8-e83a-42be-93bc-9d913cb73105-host\") pod \"crc-debug-r8ssk\" (UID: \"e3edd3d8-e83a-42be-93bc-9d913cb73105\") " pod="openshift-must-gather-zv9b4/crc-debug-r8ssk" Mar 18 13:39:12 crc kubenswrapper[4975]: I0318 13:39:12.623077 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdpsp\" (UniqueName: \"kubernetes.io/projected/e3edd3d8-e83a-42be-93bc-9d913cb73105-kube-api-access-fdpsp\") pod \"crc-debug-r8ssk\" (UID: \"e3edd3d8-e83a-42be-93bc-9d913cb73105\") " pod="openshift-must-gather-zv9b4/crc-debug-r8ssk" Mar 18 13:39:12 crc kubenswrapper[4975]: I0318 13:39:12.623147 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e3edd3d8-e83a-42be-93bc-9d913cb73105-host\") pod \"crc-debug-r8ssk\" (UID: \"e3edd3d8-e83a-42be-93bc-9d913cb73105\") " pod="openshift-must-gather-zv9b4/crc-debug-r8ssk" Mar 18 13:39:12 crc kubenswrapper[4975]: I0318 13:39:12.971815 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdpsp\" (UniqueName: \"kubernetes.io/projected/e3edd3d8-e83a-42be-93bc-9d913cb73105-kube-api-access-fdpsp\") pod \"crc-debug-r8ssk\" (UID: \"e3edd3d8-e83a-42be-93bc-9d913cb73105\") " pod="openshift-must-gather-zv9b4/crc-debug-r8ssk" Mar 18 13:39:13 crc kubenswrapper[4975]: I0318 13:39:13.018632 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zv9b4/crc-debug-r8ssk" Mar 18 13:39:13 crc kubenswrapper[4975]: I0318 13:39:13.030331 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93e2dd16-76d9-4488-a25c-04c4f5947075" path="/var/lib/kubelet/pods/93e2dd16-76d9-4488-a25c-04c4f5947075/volumes" Mar 18 13:39:13 crc kubenswrapper[4975]: I0318 13:39:13.867902 4975 generic.go:334] "Generic (PLEG): container finished" podID="e3edd3d8-e83a-42be-93bc-9d913cb73105" containerID="7e428cc6662e32e41a1227e2449c8a53ce7f4bd4d6aff070f9a2ed75b92400ba" exitCode=1 Mar 18 13:39:13 crc kubenswrapper[4975]: I0318 13:39:13.867999 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zv9b4/crc-debug-r8ssk" event={"ID":"e3edd3d8-e83a-42be-93bc-9d913cb73105","Type":"ContainerDied","Data":"7e428cc6662e32e41a1227e2449c8a53ce7f4bd4d6aff070f9a2ed75b92400ba"} Mar 18 13:39:13 crc kubenswrapper[4975]: I0318 13:39:13.868392 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zv9b4/crc-debug-r8ssk" event={"ID":"e3edd3d8-e83a-42be-93bc-9d913cb73105","Type":"ContainerStarted","Data":"f3c445525840f5aa75cad2aac202736d67b4616d3119fcb32d3e98ea604e1989"} Mar 18 13:39:13 crc kubenswrapper[4975]: I0318 13:39:13.916271 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zv9b4/crc-debug-r8ssk"] Mar 18 13:39:13 crc kubenswrapper[4975]: I0318 13:39:13.928684 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zv9b4/crc-debug-r8ssk"] Mar 18 13:39:14 crc kubenswrapper[4975]: I0318 13:39:14.981306 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zv9b4/crc-debug-r8ssk" Mar 18 13:39:15 crc kubenswrapper[4975]: I0318 13:39:15.065256 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdpsp\" (UniqueName: \"kubernetes.io/projected/e3edd3d8-e83a-42be-93bc-9d913cb73105-kube-api-access-fdpsp\") pod \"e3edd3d8-e83a-42be-93bc-9d913cb73105\" (UID: \"e3edd3d8-e83a-42be-93bc-9d913cb73105\") " Mar 18 13:39:15 crc kubenswrapper[4975]: I0318 13:39:15.065321 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e3edd3d8-e83a-42be-93bc-9d913cb73105-host\") pod \"e3edd3d8-e83a-42be-93bc-9d913cb73105\" (UID: \"e3edd3d8-e83a-42be-93bc-9d913cb73105\") " Mar 18 13:39:15 crc kubenswrapper[4975]: I0318 13:39:15.065881 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3edd3d8-e83a-42be-93bc-9d913cb73105-host" (OuterVolumeSpecName: "host") pod "e3edd3d8-e83a-42be-93bc-9d913cb73105" (UID: "e3edd3d8-e83a-42be-93bc-9d913cb73105"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 13:39:15 crc kubenswrapper[4975]: I0318 13:39:15.071532 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3edd3d8-e83a-42be-93bc-9d913cb73105-kube-api-access-fdpsp" (OuterVolumeSpecName: "kube-api-access-fdpsp") pod "e3edd3d8-e83a-42be-93bc-9d913cb73105" (UID: "e3edd3d8-e83a-42be-93bc-9d913cb73105"). InnerVolumeSpecName "kube-api-access-fdpsp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:39:15 crc kubenswrapper[4975]: I0318 13:39:15.167252 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdpsp\" (UniqueName: \"kubernetes.io/projected/e3edd3d8-e83a-42be-93bc-9d913cb73105-kube-api-access-fdpsp\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:15 crc kubenswrapper[4975]: I0318 13:39:15.167924 4975 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e3edd3d8-e83a-42be-93bc-9d913cb73105-host\") on node \"crc\" DevicePath \"\"" Mar 18 13:39:15 crc kubenswrapper[4975]: I0318 13:39:15.905040 4975 scope.go:117] "RemoveContainer" containerID="7e428cc6662e32e41a1227e2449c8a53ce7f4bd4d6aff070f9a2ed75b92400ba" Mar 18 13:39:15 crc kubenswrapper[4975]: I0318 13:39:15.905079 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zv9b4/crc-debug-r8ssk" Mar 18 13:39:17 crc kubenswrapper[4975]: I0318 13:39:17.026849 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3edd3d8-e83a-42be-93bc-9d913cb73105" path="/var/lib/kubelet/pods/e3edd3d8-e83a-42be-93bc-9d913cb73105/volumes" Mar 18 13:40:00 crc kubenswrapper[4975]: I0318 13:40:00.158418 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564020-wlcfg"] Mar 18 13:40:00 crc kubenswrapper[4975]: E0318 13:40:00.159331 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3edd3d8-e83a-42be-93bc-9d913cb73105" containerName="container-00" Mar 18 13:40:00 crc kubenswrapper[4975]: I0318 13:40:00.159344 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3edd3d8-e83a-42be-93bc-9d913cb73105" containerName="container-00" Mar 18 13:40:00 crc kubenswrapper[4975]: I0318 13:40:00.159542 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3edd3d8-e83a-42be-93bc-9d913cb73105" containerName="container-00" Mar 18 13:40:00 crc 
kubenswrapper[4975]: I0318 13:40:00.160248 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564020-wlcfg" Mar 18 13:40:00 crc kubenswrapper[4975]: I0318 13:40:00.162960 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:40:00 crc kubenswrapper[4975]: I0318 13:40:00.163100 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:40:00 crc kubenswrapper[4975]: I0318 13:40:00.163126 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 13:40:00 crc kubenswrapper[4975]: I0318 13:40:00.171766 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564020-wlcfg"] Mar 18 13:40:00 crc kubenswrapper[4975]: I0318 13:40:00.286611 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qjj6\" (UniqueName: \"kubernetes.io/projected/a4b80ae8-52f0-493d-a760-3b1304bb849a-kube-api-access-9qjj6\") pod \"auto-csr-approver-29564020-wlcfg\" (UID: \"a4b80ae8-52f0-493d-a760-3b1304bb849a\") " pod="openshift-infra/auto-csr-approver-29564020-wlcfg" Mar 18 13:40:00 crc kubenswrapper[4975]: I0318 13:40:00.388915 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qjj6\" (UniqueName: \"kubernetes.io/projected/a4b80ae8-52f0-493d-a760-3b1304bb849a-kube-api-access-9qjj6\") pod \"auto-csr-approver-29564020-wlcfg\" (UID: \"a4b80ae8-52f0-493d-a760-3b1304bb849a\") " pod="openshift-infra/auto-csr-approver-29564020-wlcfg" Mar 18 13:40:00 crc kubenswrapper[4975]: I0318 13:40:00.410473 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qjj6\" (UniqueName: \"kubernetes.io/projected/a4b80ae8-52f0-493d-a760-3b1304bb849a-kube-api-access-9qjj6\") pod 
\"auto-csr-approver-29564020-wlcfg\" (UID: \"a4b80ae8-52f0-493d-a760-3b1304bb849a\") " pod="openshift-infra/auto-csr-approver-29564020-wlcfg" Mar 18 13:40:00 crc kubenswrapper[4975]: I0318 13:40:00.511178 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564020-wlcfg" Mar 18 13:40:01 crc kubenswrapper[4975]: I0318 13:40:01.002103 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564020-wlcfg"] Mar 18 13:40:01 crc kubenswrapper[4975]: I0318 13:40:01.360802 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564020-wlcfg" event={"ID":"a4b80ae8-52f0-493d-a760-3b1304bb849a","Type":"ContainerStarted","Data":"41912d982f7256e0f4a11533ce19e4b2cdb53e9e8936c567e5da7482840d23bf"} Mar 18 13:40:03 crc kubenswrapper[4975]: I0318 13:40:03.380522 4975 generic.go:334] "Generic (PLEG): container finished" podID="a4b80ae8-52f0-493d-a760-3b1304bb849a" containerID="b955af01730215f1717705878df49daa195f8bae192cc70dc8a138723ac71e69" exitCode=0 Mar 18 13:40:03 crc kubenswrapper[4975]: I0318 13:40:03.380625 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564020-wlcfg" event={"ID":"a4b80ae8-52f0-493d-a760-3b1304bb849a","Type":"ContainerDied","Data":"b955af01730215f1717705878df49daa195f8bae192cc70dc8a138723ac71e69"} Mar 18 13:40:04 crc kubenswrapper[4975]: I0318 13:40:04.725314 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564020-wlcfg" Mar 18 13:40:04 crc kubenswrapper[4975]: I0318 13:40:04.893428 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qjj6\" (UniqueName: \"kubernetes.io/projected/a4b80ae8-52f0-493d-a760-3b1304bb849a-kube-api-access-9qjj6\") pod \"a4b80ae8-52f0-493d-a760-3b1304bb849a\" (UID: \"a4b80ae8-52f0-493d-a760-3b1304bb849a\") " Mar 18 13:40:04 crc kubenswrapper[4975]: I0318 13:40:04.899770 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4b80ae8-52f0-493d-a760-3b1304bb849a-kube-api-access-9qjj6" (OuterVolumeSpecName: "kube-api-access-9qjj6") pod "a4b80ae8-52f0-493d-a760-3b1304bb849a" (UID: "a4b80ae8-52f0-493d-a760-3b1304bb849a"). InnerVolumeSpecName "kube-api-access-9qjj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:40:04 crc kubenswrapper[4975]: I0318 13:40:04.996013 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qjj6\" (UniqueName: \"kubernetes.io/projected/a4b80ae8-52f0-493d-a760-3b1304bb849a-kube-api-access-9qjj6\") on node \"crc\" DevicePath \"\"" Mar 18 13:40:05 crc kubenswrapper[4975]: I0318 13:40:05.400433 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564020-wlcfg" event={"ID":"a4b80ae8-52f0-493d-a760-3b1304bb849a","Type":"ContainerDied","Data":"41912d982f7256e0f4a11533ce19e4b2cdb53e9e8936c567e5da7482840d23bf"} Mar 18 13:40:05 crc kubenswrapper[4975]: I0318 13:40:05.400473 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41912d982f7256e0f4a11533ce19e4b2cdb53e9e8936c567e5da7482840d23bf" Mar 18 13:40:05 crc kubenswrapper[4975]: I0318 13:40:05.400555 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564020-wlcfg" Mar 18 13:40:05 crc kubenswrapper[4975]: I0318 13:40:05.795494 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564014-d8wzl"] Mar 18 13:40:05 crc kubenswrapper[4975]: I0318 13:40:05.804768 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564014-d8wzl"] Mar 18 13:40:07 crc kubenswrapper[4975]: I0318 13:40:07.029509 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39d1114b-18c7-4f16-b805-e96bdb1adbc5" path="/var/lib/kubelet/pods/39d1114b-18c7-4f16-b805-e96bdb1adbc5/volumes" Mar 18 13:40:15 crc kubenswrapper[4975]: I0318 13:40:15.209327 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b44bbdbf8-vkj8f_ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf/barbican-api/0.log" Mar 18 13:40:15 crc kubenswrapper[4975]: I0318 13:40:15.450231 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6fc58d8444-6ln2h_0d8217c7-38b9-4717-a965-4a408d31fdc6/barbican-keystone-listener/0.log" Mar 18 13:40:15 crc kubenswrapper[4975]: I0318 13:40:15.458416 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b44bbdbf8-vkj8f_ae01c9f6-20f3-4d4b-9806-0cdfd74fdbbf/barbican-api-log/0.log" Mar 18 13:40:15 crc kubenswrapper[4975]: I0318 13:40:15.526023 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6fc58d8444-6ln2h_0d8217c7-38b9-4717-a965-4a408d31fdc6/barbican-keystone-listener-log/0.log" Mar 18 13:40:15 crc kubenswrapper[4975]: I0318 13:40:15.639346 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6744587899-pzwjz_4e530724-5f58-4bbc-9b9a-624a4565ab21/barbican-worker/0.log" Mar 18 13:40:15 crc kubenswrapper[4975]: I0318 13:40:15.643668 4975 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-6744587899-pzwjz_4e530724-5f58-4bbc-9b9a-624a4565ab21/barbican-worker-log/0.log" Mar 18 13:40:15 crc kubenswrapper[4975]: I0318 13:40:15.973351 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a1ff504e-305e-4180-ae3b-f1ee98e27726/ceilometer-central-agent/0.log" Mar 18 13:40:16 crc kubenswrapper[4975]: I0318 13:40:16.042502 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a1ff504e-305e-4180-ae3b-f1ee98e27726/ceilometer-notification-agent/0.log" Mar 18 13:40:16 crc kubenswrapper[4975]: I0318 13:40:16.156349 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a1ff504e-305e-4180-ae3b-f1ee98e27726/proxy-httpd/0.log" Mar 18 13:40:16 crc kubenswrapper[4975]: I0318 13:40:16.200442 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a1ff504e-305e-4180-ae3b-f1ee98e27726/sg-core/0.log" Mar 18 13:40:16 crc kubenswrapper[4975]: I0318 13:40:16.239985 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-r7hbd_296e864d-5a95-4c7a-b9ea-3d18bb1dfdcd/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:40:16 crc kubenswrapper[4975]: I0318 13:40:16.400898 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9f32d1b3-ee20-4ad3-a943-a83c87014cd0/cinder-api-log/0.log" Mar 18 13:40:16 crc kubenswrapper[4975]: I0318 13:40:16.443812 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9f32d1b3-ee20-4ad3-a943-a83c87014cd0/cinder-api/0.log" Mar 18 13:40:16 crc kubenswrapper[4975]: I0318 13:40:16.531708 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d63c8853-95df-41fd-99e4-ff384da2ef6f/cinder-scheduler/0.log" Mar 18 13:40:16 crc kubenswrapper[4975]: I0318 13:40:16.658499 4975 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-scheduler-0_d63c8853-95df-41fd-99e4-ff384da2ef6f/probe/0.log" Mar 18 13:40:17 crc kubenswrapper[4975]: I0318 13:40:17.076394 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-wrsq2_09a27227-8777-4ddc-b4a0-ca2c7f8e66bf/init/0.log" Mar 18 13:40:17 crc kubenswrapper[4975]: I0318 13:40:17.098834 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-49rnp_ee1b246b-e80c-4d00-ab5e-013460f7e886/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:40:17 crc kubenswrapper[4975]: I0318 13:40:17.324392 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-wrsq2_09a27227-8777-4ddc-b4a0-ca2c7f8e66bf/dnsmasq-dns/0.log" Mar 18 13:40:17 crc kubenswrapper[4975]: I0318 13:40:17.324709 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-wrsq2_09a27227-8777-4ddc-b4a0-ca2c7f8e66bf/init/0.log" Mar 18 13:40:17 crc kubenswrapper[4975]: I0318 13:40:17.377923 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-f44vg_d5cb9baa-6a68-426f-88b4-4cb896f260e7/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:40:17 crc kubenswrapper[4975]: I0318 13:40:17.648002 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_87bb4a85-e699-429d-b354-f9e75d5eb9de/glance-httpd/0.log" Mar 18 13:40:17 crc kubenswrapper[4975]: I0318 13:40:17.653022 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-kckms_a56e99ba-eb18-4a6a-8347-564e7af719f7/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:40:17 crc kubenswrapper[4975]: I0318 13:40:17.656907 4975 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_87bb4a85-e699-429d-b354-f9e75d5eb9de/glance-log/0.log" Mar 18 13:40:17 crc kubenswrapper[4975]: I0318 13:40:17.847034 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_9a3dc59e-97e5-435c-b3c2-286d75774bbc/glance-log/0.log" Mar 18 13:40:17 crc kubenswrapper[4975]: I0318 13:40:17.882126 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_9a3dc59e-97e5-435c-b3c2-286d75774bbc/glance-httpd/0.log" Mar 18 13:40:18 crc kubenswrapper[4975]: I0318 13:40:18.273131 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-56bcd48494-744wv_939992e3-94eb-4c98-a493-e30321c7f81a/horizon/0.log" Mar 18 13:40:18 crc kubenswrapper[4975]: I0318 13:40:18.378151 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-p5nmt_8a6f8f80-45d5-428b-ae5f-0b770f0aefd8/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:40:18 crc kubenswrapper[4975]: I0318 13:40:18.659641 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-56bcd48494-744wv_939992e3-94eb-4c98-a493-e30321c7f81a/horizon-log/0.log" Mar 18 13:40:18 crc kubenswrapper[4975]: I0318 13:40:18.906647 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29563981-gj4mp_5b92e883-e662-4970-ab9b-df31247d4cb7/keystone-cron/0.log" Mar 18 13:40:19 crc kubenswrapper[4975]: I0318 13:40:19.162507 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-f8786c66f-vjkt4_b782f13d-9abe-4abe-a47e-9c378c9d1913/keystone-api/0.log" Mar 18 13:40:19 crc kubenswrapper[4975]: I0318 13:40:19.172061 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_6eeb6017-b985-4f5c-ace2-b24c9ad25510/kube-state-metrics/0.log" Mar 18 13:40:19 crc kubenswrapper[4975]: I0318 
13:40:19.324086 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-t9nfn_7b64debe-3d0c-4e5f-b45a-69c21b528483/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:40:21 crc kubenswrapper[4975]: I0318 13:40:21.344888 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-2sd8n_02e6a4ac-05a2-4da9-8494-3e01a9d443ff/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:40:21 crc kubenswrapper[4975]: I0318 13:40:21.478464 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-52whs_c5353d60-ff65-4a67-a566-00ef9a757cfb/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:40:21 crc kubenswrapper[4975]: I0318 13:40:21.756981 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-7zcpx_63653681-d7c8-4201-9922-32bfe87bc28f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:40:21 crc kubenswrapper[4975]: I0318 13:40:21.759815 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-k2wc9_06a6be48-3e59-499f-b3aa-f0a6f9bbe812/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:40:21 crc kubenswrapper[4975]: I0318 13:40:21.825575 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-ttwqc_b1c9f3d9-ff18-40c6-81b4-9eed81219d55/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:40:22 crc kubenswrapper[4975]: I0318 13:40:22.419010 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67fdc8889-2cm4h_b6ab8f83-7ff0-46e4-9045-611e4b3b97c0/neutron-api/0.log" Mar 18 13:40:22 crc kubenswrapper[4975]: I0318 13:40:22.532285 4975 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-67fdc8889-2cm4h_b6ab8f83-7ff0-46e4-9045-611e4b3b97c0/neutron-httpd/0.log" Mar 18 13:40:23 crc kubenswrapper[4975]: I0318 13:40:23.042215 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-f2pbl_ebc50dc0-b3e2-49ba-a6cd-958fc27292ac/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:40:23 crc kubenswrapper[4975]: I0318 13:40:23.414727 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_84723d07-c2d7-457f-8627-3420d0a1d3ae/nova-api-log/0.log" Mar 18 13:40:23 crc kubenswrapper[4975]: I0318 13:40:23.970977 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_84723d07-c2d7-457f-8627-3420d0a1d3ae/nova-api-api/0.log" Mar 18 13:40:24 crc kubenswrapper[4975]: I0318 13:40:24.021904 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-vsnfm_60d3b2e1-b9b0-4e8b-9f3a-c701bc6d9492/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:40:24 crc kubenswrapper[4975]: I0318 13:40:24.114906 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-vlk29_ec93f0f6-3753-4db2-a239-859abb202fdc/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:40:24 crc kubenswrapper[4975]: I0318 13:40:24.225787 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_1bcb7ff1-b245-4d6a-ae0b-9acc133d2b65/nova-cell0-conductor-conductor/0.log" Mar 18 13:40:24 crc kubenswrapper[4975]: I0318 13:40:24.379909 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_8ca44478-1a7f-4e96-9025-5cdac5e90dcc/nova-cell1-conductor-conductor/0.log" Mar 18 13:40:24 crc kubenswrapper[4975]: I0318 13:40:24.771339 4975 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_f5c52261-89b7-4457-a0b6-4380a99ffa2a/nova-cell1-novncproxy-novncproxy/0.log" Mar 18 13:40:24 crc kubenswrapper[4975]: I0318 13:40:24.900195 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_764dae90-1710-4d6b-bee7-67a26f5133a7/nova-metadata-log/0.log" Mar 18 13:40:25 crc kubenswrapper[4975]: I0318 13:40:25.144520 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_abf116ab-237e-4381-849a-ce0619c3ee09/nova-scheduler-scheduler/0.log" Mar 18 13:40:25 crc kubenswrapper[4975]: I0318 13:40:25.255688 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_52cabd28-7357-4e96-b812-637660ce5cd1/mysql-bootstrap/0.log" Mar 18 13:40:25 crc kubenswrapper[4975]: I0318 13:40:25.400089 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_764dae90-1710-4d6b-bee7-67a26f5133a7/nova-metadata-metadata/0.log" Mar 18 13:40:25 crc kubenswrapper[4975]: I0318 13:40:25.486944 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_52cabd28-7357-4e96-b812-637660ce5cd1/galera/0.log" Mar 18 13:40:25 crc kubenswrapper[4975]: I0318 13:40:25.489613 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_52cabd28-7357-4e96-b812-637660ce5cd1/mysql-bootstrap/0.log" Mar 18 13:40:25 crc kubenswrapper[4975]: I0318 13:40:25.621792 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_bcac25a5-0552-488f-b468-ffea7a442115/mysql-bootstrap/0.log" Mar 18 13:40:25 crc kubenswrapper[4975]: I0318 13:40:25.817438 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_bcac25a5-0552-488f-b468-ffea7a442115/mysql-bootstrap/0.log" Mar 18 13:40:25 crc kubenswrapper[4975]: I0318 13:40:25.817696 4975 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstackclient_eedc82f6-6487-41cf-b618-db58be6f1eed/openstackclient/0.log" Mar 18 13:40:25 crc kubenswrapper[4975]: I0318 13:40:25.822546 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_bcac25a5-0552-488f-b468-ffea7a442115/galera/0.log" Mar 18 13:40:26 crc kubenswrapper[4975]: I0318 13:40:26.119020 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-dtpb5_25ce2e6c-7802-4318-89e7-f9f40bd5369f/openstack-network-exporter/0.log" Mar 18 13:40:26 crc kubenswrapper[4975]: I0318 13:40:26.213240 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qpqb7_bc8e272b-99c9-460e-9b72-a031478baf07/ovsdb-server-init/0.log" Mar 18 13:40:26 crc kubenswrapper[4975]: I0318 13:40:26.432958 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qpqb7_bc8e272b-99c9-460e-9b72-a031478baf07/ovsdb-server/0.log" Mar 18 13:40:26 crc kubenswrapper[4975]: I0318 13:40:26.457621 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qpqb7_bc8e272b-99c9-460e-9b72-a031478baf07/ovsdb-server-init/0.log" Mar 18 13:40:26 crc kubenswrapper[4975]: I0318 13:40:26.519469 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qpqb7_bc8e272b-99c9-460e-9b72-a031478baf07/ovs-vswitchd/0.log" Mar 18 13:40:26 crc kubenswrapper[4975]: I0318 13:40:26.735803 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-zgkvs_f3553d46-cacf-43e1-886a-44c17ed9a6c5/ovn-controller/0.log" Mar 18 13:40:26 crc kubenswrapper[4975]: I0318 13:40:26.994758 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_30f134c7-8393-40ca-8c2b-1070ea5ec68c/ovn-northd/0.log" Mar 18 13:40:26 crc kubenswrapper[4975]: I0318 13:40:26.995312 4975 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_30f134c7-8393-40ca-8c2b-1070ea5ec68c/openstack-network-exporter/0.log" Mar 18 13:40:26 crc kubenswrapper[4975]: I0318 13:40:26.996722 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-p6rfq_2603d47f-8543-4cc2-927e-5f7d2ad82acc/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:40:27 crc kubenswrapper[4975]: I0318 13:40:27.180099 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9f6a63f7-645b-44df-aea0-787e5596aecc/openstack-network-exporter/0.log" Mar 18 13:40:27 crc kubenswrapper[4975]: I0318 13:40:27.193307 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9f6a63f7-645b-44df-aea0-787e5596aecc/ovsdbserver-nb/0.log" Mar 18 13:40:27 crc kubenswrapper[4975]: I0318 13:40:27.290221 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_dd63df8b-b3ee-49dc-b36b-157bb71ac6d5/openstack-network-exporter/0.log" Mar 18 13:40:27 crc kubenswrapper[4975]: I0318 13:40:27.447885 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_dd63df8b-b3ee-49dc-b36b-157bb71ac6d5/ovsdbserver-sb/0.log" Mar 18 13:40:27 crc kubenswrapper[4975]: I0318 13:40:27.542283 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-55d747f656-4fh7c_2615ee1a-a138-4ace-88dd-bda440399db9/placement-api/0.log" Mar 18 13:40:27 crc kubenswrapper[4975]: I0318 13:40:27.599420 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-55d747f656-4fh7c_2615ee1a-a138-4ace-88dd-bda440399db9/placement-log/0.log" Mar 18 13:40:27 crc kubenswrapper[4975]: I0318 13:40:27.658778 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3f543aea-ed0e-412b-8f30-bc585ce1793e/setup-container/0.log" Mar 18 13:40:28 crc kubenswrapper[4975]: I0318 13:40:28.080461 4975 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3f543aea-ed0e-412b-8f30-bc585ce1793e/setup-container/0.log" Mar 18 13:40:28 crc kubenswrapper[4975]: I0318 13:40:28.089684 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3f543aea-ed0e-412b-8f30-bc585ce1793e/rabbitmq/0.log" Mar 18 13:40:28 crc kubenswrapper[4975]: I0318 13:40:28.153140 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_df91a8b9-ed19-4f64-9d3a-2c93bae6916a/setup-container/0.log" Mar 18 13:40:28 crc kubenswrapper[4975]: I0318 13:40:28.297427 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_df91a8b9-ed19-4f64-9d3a-2c93bae6916a/rabbitmq/0.log" Mar 18 13:40:28 crc kubenswrapper[4975]: I0318 13:40:28.373878 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-6p6hb_bddaeca5-c768-4480-96cd-ef43fd303bc8/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:40:28 crc kubenswrapper[4975]: I0318 13:40:28.397605 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_df91a8b9-ed19-4f64-9d3a-2c93bae6916a/setup-container/0.log" Mar 18 13:40:28 crc kubenswrapper[4975]: I0318 13:40:28.636559 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-m5mvk_2c62f110-2c2e-4de8-a425-0b08794eb28f/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:40:28 crc kubenswrapper[4975]: I0318 13:40:28.686022 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-6rvmd_43e10e84-5a96-4e1f-add7-c6a1c177b5ce/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:40:29 crc kubenswrapper[4975]: I0318 13:40:29.046386 4975 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-fl5xt_bef7f8e0-ad4f-4e10-b925-dcbf363f9ed2/ssh-known-hosts-edpm-deployment/0.log" Mar 18 13:40:29 crc kubenswrapper[4975]: I0318 13:40:29.058887 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-r6qcs_7508606e-78ed-475c-9386-b1cc11127fb1/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:40:29 crc kubenswrapper[4975]: I0318 13:40:29.284229 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-749bf8fbcf-mc9c6_0f1ae896-bd35-40e2-bd0f-35cf15db5e2d/proxy-server/0.log" Mar 18 13:40:29 crc kubenswrapper[4975]: I0318 13:40:29.343002 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-g5bjt_f0f58451-e968-467f-8d95-7a4c5104ce12/swift-ring-rebalance/0.log" Mar 18 13:40:29 crc kubenswrapper[4975]: I0318 13:40:29.445485 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-749bf8fbcf-mc9c6_0f1ae896-bd35-40e2-bd0f-35cf15db5e2d/proxy-httpd/0.log" Mar 18 13:40:29 crc kubenswrapper[4975]: I0318 13:40:29.565518 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18526bd6-7184-4e92-8bb6-f85ec1aa3f30/account-auditor/0.log" Mar 18 13:40:29 crc kubenswrapper[4975]: I0318 13:40:29.636383 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18526bd6-7184-4e92-8bb6-f85ec1aa3f30/account-reaper/0.log" Mar 18 13:40:29 crc kubenswrapper[4975]: I0318 13:40:29.756524 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18526bd6-7184-4e92-8bb6-f85ec1aa3f30/account-replicator/0.log" Mar 18 13:40:29 crc kubenswrapper[4975]: I0318 13:40:29.806121 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18526bd6-7184-4e92-8bb6-f85ec1aa3f30/container-auditor/0.log" Mar 18 13:40:29 crc kubenswrapper[4975]: I0318 
13:40:29.817847 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18526bd6-7184-4e92-8bb6-f85ec1aa3f30/account-server/0.log" Mar 18 13:40:29 crc kubenswrapper[4975]: I0318 13:40:29.918658 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18526bd6-7184-4e92-8bb6-f85ec1aa3f30/container-replicator/0.log" Mar 18 13:40:29 crc kubenswrapper[4975]: I0318 13:40:29.969281 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18526bd6-7184-4e92-8bb6-f85ec1aa3f30/container-server/0.log" Mar 18 13:40:30 crc kubenswrapper[4975]: I0318 13:40:30.017957 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18526bd6-7184-4e92-8bb6-f85ec1aa3f30/container-updater/0.log" Mar 18 13:40:30 crc kubenswrapper[4975]: I0318 13:40:30.062615 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18526bd6-7184-4e92-8bb6-f85ec1aa3f30/object-auditor/0.log" Mar 18 13:40:30 crc kubenswrapper[4975]: I0318 13:40:30.153934 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18526bd6-7184-4e92-8bb6-f85ec1aa3f30/object-expirer/0.log" Mar 18 13:40:30 crc kubenswrapper[4975]: I0318 13:40:30.250321 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18526bd6-7184-4e92-8bb6-f85ec1aa3f30/object-server/0.log" Mar 18 13:40:30 crc kubenswrapper[4975]: I0318 13:40:30.288924 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18526bd6-7184-4e92-8bb6-f85ec1aa3f30/object-replicator/0.log" Mar 18 13:40:30 crc kubenswrapper[4975]: I0318 13:40:30.341722 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18526bd6-7184-4e92-8bb6-f85ec1aa3f30/rsync/0.log" Mar 18 13:40:30 crc kubenswrapper[4975]: I0318 13:40:30.369487 4975 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_18526bd6-7184-4e92-8bb6-f85ec1aa3f30/object-updater/0.log" Mar 18 13:40:30 crc kubenswrapper[4975]: I0318 13:40:30.486416 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_18526bd6-7184-4e92-8bb6-f85ec1aa3f30/swift-recon-cron/0.log" Mar 18 13:40:30 crc kubenswrapper[4975]: I0318 13:40:30.618266 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-lgsnw_48982599-8240-4f82-8932-5ded98c54dfa/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 18 13:40:36 crc kubenswrapper[4975]: I0318 13:40:36.268451 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_56b76c27-2e96-46ad-a648-705b85b28bd6/memcached/0.log" Mar 18 13:40:54 crc kubenswrapper[4975]: I0318 13:40:54.337219 4975 scope.go:117] "RemoveContainer" containerID="eeea74c5fc61293f5ef084cceb2be71eaafac98c81ddaa9ccd5d6db70744671c" Mar 18 13:40:57 crc kubenswrapper[4975]: I0318 13:40:57.556845 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg_38e210dc-b7c2-447c-9c0c-324bc2c5176a/util/0.log" Mar 18 13:40:57 crc kubenswrapper[4975]: I0318 13:40:57.696956 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg_38e210dc-b7c2-447c-9c0c-324bc2c5176a/util/0.log" Mar 18 13:40:57 crc kubenswrapper[4975]: I0318 13:40:57.783101 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg_38e210dc-b7c2-447c-9c0c-324bc2c5176a/pull/0.log" Mar 18 13:40:57 crc kubenswrapper[4975]: I0318 13:40:57.803458 4975 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg_38e210dc-b7c2-447c-9c0c-324bc2c5176a/pull/0.log" Mar 18 13:40:57 crc kubenswrapper[4975]: I0318 13:40:57.980265 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg_38e210dc-b7c2-447c-9c0c-324bc2c5176a/util/0.log" Mar 18 13:40:57 crc kubenswrapper[4975]: I0318 13:40:57.983036 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg_38e210dc-b7c2-447c-9c0c-324bc2c5176a/pull/0.log" Mar 18 13:40:58 crc kubenswrapper[4975]: I0318 13:40:58.071318 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f1136f03c001d4ee37f05591eb7258fdc593d3c6da582b1d34c16540e7p2vg_38e210dc-b7c2-447c-9c0c-324bc2c5176a/extract/0.log" Mar 18 13:40:58 crc kubenswrapper[4975]: I0318 13:40:58.235280 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-9hd99_4b5bf68a-b7e7-4df7-ab69-2b8eac9c1091/manager/0.log" Mar 18 13:40:58 crc kubenswrapper[4975]: I0318 13:40:58.519284 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-dp5dh_675b1757-67bf-4d6d-9947-31a4da13a1be/manager/0.log" Mar 18 13:40:58 crc kubenswrapper[4975]: I0318 13:40:58.705132 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-wjt2j_1b122375-f65a-4f05-a738-41eab6a8fcd3/manager/0.log" Mar 18 13:40:58 crc kubenswrapper[4975]: I0318 13:40:58.817834 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-v4ngr_14e0c597-a515-4b44-908e-3737f385d7c3/manager/0.log" Mar 18 13:40:58 crc kubenswrapper[4975]: I0318 
13:40:58.982825 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-jthrm_2c3688cf-2e2c-434c-88a7-10ac1a4949b2/manager/0.log" Mar 18 13:40:59 crc kubenswrapper[4975]: I0318 13:40:59.370349 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-hqgd2_4509daad-a22e-4801-891d-b0b8ea78ccb0/manager/0.log" Mar 18 13:40:59 crc kubenswrapper[4975]: I0318 13:40:59.672435 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-tfkfx_d854c129-c4eb-4c08-a398-3549f4ff9047/manager/0.log" Mar 18 13:40:59 crc kubenswrapper[4975]: I0318 13:40:59.830597 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-lcsc9_13b9a062-4728-4ce6-8d1b-0206bb73684e/manager/0.log" Mar 18 13:40:59 crc kubenswrapper[4975]: I0318 13:40:59.949531 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-qhnk6_dc261851-abff-4a1e-b20e-07d9c3bea942/manager/0.log" Mar 18 13:41:00 crc kubenswrapper[4975]: I0318 13:41:00.052340 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-7lzk9_84bd8990-a50a-4fb2-88d3-e3141ef24b7d/manager/0.log" Mar 18 13:41:00 crc kubenswrapper[4975]: I0318 13:41:00.187316 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-9bhc7_68cf7624-cb7b-45df-a18f-7dbfb9c20f6f/manager/0.log" Mar 18 13:41:00 crc kubenswrapper[4975]: I0318 13:41:00.287733 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-d7vwb_aa02f9ee-8f7a-4880-b4c3-f2fcacd24967/manager/0.log" Mar 18 13:41:00 crc 
kubenswrapper[4975]: I0318 13:41:00.469190 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-tsmhh_3b4ea941-e8b1-47df-b33a-97dbf829cc24/manager/0.log" Mar 18 13:41:00 crc kubenswrapper[4975]: I0318 13:41:00.594465 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-kmlsc_7af89304-a07d-449c-9c16-97b829fa8290/manager/0.log" Mar 18 13:41:00 crc kubenswrapper[4975]: I0318 13:41:00.675794 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-lxkwk_0542b387-d20c-41a1-81f3-1a11228e0a5c/manager/0.log" Mar 18 13:41:00 crc kubenswrapper[4975]: I0318 13:41:00.992638 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-68ccf9867-9mxng_a4d01e56-0d21-4c32-9f31-a1adc02598db/operator/0.log" Mar 18 13:41:01 crc kubenswrapper[4975]: I0318 13:41:01.060100 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-s7jn8_3b9db44a-8c71-40d7-b3b5-b70cd7b32bdf/registry-server/0.log" Mar 18 13:41:01 crc kubenswrapper[4975]: I0318 13:41:01.274427 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-mwbb7_37a66a0e-cc0b-4e2b-8b41-62c23ae539d9/manager/0.log" Mar 18 13:41:01 crc kubenswrapper[4975]: I0318 13:41:01.570714 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-m6rkz_e48eb7b2-9ce8-465c-9f05-91b4c55b4867/manager/0.log" Mar 18 13:41:01 crc kubenswrapper[4975]: I0318 13:41:01.624026 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-c285h_fb3873ad-2d10-4df3-8198-2acd5c04b8c2/operator/0.log" Mar 18 
13:41:01 crc kubenswrapper[4975]: I0318 13:41:01.799067 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-jlnvb_b224d92b-1aed-47b5-8825-8f0b11da3092/manager/0.log" Mar 18 13:41:01 crc kubenswrapper[4975]: I0318 13:41:01.995218 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-xfr6h_475c84a8-c3d3-4bf8-91e0-244d5ffe1c9e/manager/0.log" Mar 18 13:41:01 crc kubenswrapper[4975]: I0318 13:41:01.995659 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-d2wsm_fc5647e0-697f-490e-9413-9fb2e63b22d8/manager/0.log" Mar 18 13:41:02 crc kubenswrapper[4975]: I0318 13:41:02.324749 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-76c5949666-qk6mw_07e8604b-3e82-4a30-8f59-e240bd72d1a3/manager/0.log" Mar 18 13:41:02 crc kubenswrapper[4975]: I0318 13:41:02.352559 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-btfw4_fff94ef3-68c9-412e-adc7-c385a89a445f/manager/0.log" Mar 18 13:41:21 crc kubenswrapper[4975]: I0318 13:41:21.592690 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-bkb4h_a54deeb5-bea0-4f51-aa4e-07df30bbf228/control-plane-machine-set-operator/0.log" Mar 18 13:41:21 crc kubenswrapper[4975]: I0318 13:41:21.762766 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-6hg9j_22d232b9-7867-4587-9c0b-d6adba1cd8bd/kube-rbac-proxy/0.log" Mar 18 13:41:21 crc kubenswrapper[4975]: I0318 13:41:21.794701 4975 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-6hg9j_22d232b9-7867-4587-9c0b-d6adba1cd8bd/machine-api-operator/0.log" Mar 18 13:41:25 crc kubenswrapper[4975]: I0318 13:41:25.539950 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:41:25 crc kubenswrapper[4975]: I0318 13:41:25.540344 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:41:33 crc kubenswrapper[4975]: I0318 13:41:33.661799 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-lcppz_6b468fbd-2305-46c9-a021-5255a31d57ee/cert-manager-controller/0.log" Mar 18 13:41:33 crc kubenswrapper[4975]: I0318 13:41:33.776595 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-vc9kt_2b0c55e5-89aa-40ed-8232-e50bf89e63f5/cert-manager-cainjector/0.log" Mar 18 13:41:33 crc kubenswrapper[4975]: I0318 13:41:33.846941 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-d8f4x_3110dc21-85fd-41d1-b8ed-82ae5d760397/cert-manager-webhook/0.log" Mar 18 13:41:45 crc kubenswrapper[4975]: I0318 13:41:45.554097 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-nxq2q_7e7595f3-c2ff-4b57-90ad-3310743b2291/nmstate-console-plugin/0.log" Mar 18 13:41:45 crc kubenswrapper[4975]: I0318 13:41:45.723011 4975 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-handler-gv79c_eb225299-e366-4be1-8d6f-7419220bc147/nmstate-handler/0.log" Mar 18 13:41:45 crc kubenswrapper[4975]: I0318 13:41:45.765229 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-k77j9_c7cf9899-3af1-426c-9f6d-8a157c4cdd02/kube-rbac-proxy/0.log" Mar 18 13:41:45 crc kubenswrapper[4975]: I0318 13:41:45.835339 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-k77j9_c7cf9899-3af1-426c-9f6d-8a157c4cdd02/nmstate-metrics/0.log" Mar 18 13:41:45 crc kubenswrapper[4975]: I0318 13:41:45.965005 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-cd2hp_c4901424-0b59-4410-8896-4868b5d83f75/nmstate-operator/0.log" Mar 18 13:41:46 crc kubenswrapper[4975]: I0318 13:41:46.024226 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-qv8h5_8f3d4654-fe47-469f-bda9-d8c111d8f22d/nmstate-webhook/0.log" Mar 18 13:41:55 crc kubenswrapper[4975]: I0318 13:41:55.538689 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:41:55 crc kubenswrapper[4975]: I0318 13:41:55.539202 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:42:00 crc kubenswrapper[4975]: I0318 13:42:00.147091 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564022-5cd8v"] Mar 18 
13:42:00 crc kubenswrapper[4975]: E0318 13:42:00.148021 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4b80ae8-52f0-493d-a760-3b1304bb849a" containerName="oc" Mar 18 13:42:00 crc kubenswrapper[4975]: I0318 13:42:00.148037 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b80ae8-52f0-493d-a760-3b1304bb849a" containerName="oc" Mar 18 13:42:00 crc kubenswrapper[4975]: I0318 13:42:00.148256 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4b80ae8-52f0-493d-a760-3b1304bb849a" containerName="oc" Mar 18 13:42:00 crc kubenswrapper[4975]: I0318 13:42:00.148964 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564022-5cd8v" Mar 18 13:42:00 crc kubenswrapper[4975]: I0318 13:42:00.151629 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:42:00 crc kubenswrapper[4975]: I0318 13:42:00.152065 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 13:42:00 crc kubenswrapper[4975]: I0318 13:42:00.152078 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:42:00 crc kubenswrapper[4975]: I0318 13:42:00.165793 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564022-5cd8v"] Mar 18 13:42:00 crc kubenswrapper[4975]: I0318 13:42:00.292957 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcqwc\" (UniqueName: \"kubernetes.io/projected/d391f902-2a5a-4a37-8a5c-f3cc0279d5ae-kube-api-access-wcqwc\") pod \"auto-csr-approver-29564022-5cd8v\" (UID: \"d391f902-2a5a-4a37-8a5c-f3cc0279d5ae\") " pod="openshift-infra/auto-csr-approver-29564022-5cd8v" Mar 18 13:42:00 crc kubenswrapper[4975]: I0318 13:42:00.394511 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wcqwc\" (UniqueName: \"kubernetes.io/projected/d391f902-2a5a-4a37-8a5c-f3cc0279d5ae-kube-api-access-wcqwc\") pod \"auto-csr-approver-29564022-5cd8v\" (UID: \"d391f902-2a5a-4a37-8a5c-f3cc0279d5ae\") " pod="openshift-infra/auto-csr-approver-29564022-5cd8v" Mar 18 13:42:00 crc kubenswrapper[4975]: I0318 13:42:00.422556 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcqwc\" (UniqueName: \"kubernetes.io/projected/d391f902-2a5a-4a37-8a5c-f3cc0279d5ae-kube-api-access-wcqwc\") pod \"auto-csr-approver-29564022-5cd8v\" (UID: \"d391f902-2a5a-4a37-8a5c-f3cc0279d5ae\") " pod="openshift-infra/auto-csr-approver-29564022-5cd8v" Mar 18 13:42:00 crc kubenswrapper[4975]: I0318 13:42:00.474435 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564022-5cd8v" Mar 18 13:42:01 crc kubenswrapper[4975]: I0318 13:42:00.972501 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564022-5cd8v"] Mar 18 13:42:01 crc kubenswrapper[4975]: I0318 13:42:01.543011 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564022-5cd8v" event={"ID":"d391f902-2a5a-4a37-8a5c-f3cc0279d5ae","Type":"ContainerStarted","Data":"4f6525e34f9f4de68d82a90c87ade115d1c1b1238ab33eb0867278ce7398f1c0"} Mar 18 13:42:03 crc kubenswrapper[4975]: I0318 13:42:03.572000 4975 generic.go:334] "Generic (PLEG): container finished" podID="d391f902-2a5a-4a37-8a5c-f3cc0279d5ae" containerID="a3a04f5551a724b33d35bbf8ade1aad0a796de2fcbe84388f3e238ae72b743dd" exitCode=0 Mar 18 13:42:03 crc kubenswrapper[4975]: I0318 13:42:03.572122 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564022-5cd8v" 
event={"ID":"d391f902-2a5a-4a37-8a5c-f3cc0279d5ae","Type":"ContainerDied","Data":"a3a04f5551a724b33d35bbf8ade1aad0a796de2fcbe84388f3e238ae72b743dd"} Mar 18 13:42:04 crc kubenswrapper[4975]: I0318 13:42:04.929534 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564022-5cd8v" Mar 18 13:42:04 crc kubenswrapper[4975]: I0318 13:42:04.986390 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcqwc\" (UniqueName: \"kubernetes.io/projected/d391f902-2a5a-4a37-8a5c-f3cc0279d5ae-kube-api-access-wcqwc\") pod \"d391f902-2a5a-4a37-8a5c-f3cc0279d5ae\" (UID: \"d391f902-2a5a-4a37-8a5c-f3cc0279d5ae\") " Mar 18 13:42:04 crc kubenswrapper[4975]: I0318 13:42:04.992447 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d391f902-2a5a-4a37-8a5c-f3cc0279d5ae-kube-api-access-wcqwc" (OuterVolumeSpecName: "kube-api-access-wcqwc") pod "d391f902-2a5a-4a37-8a5c-f3cc0279d5ae" (UID: "d391f902-2a5a-4a37-8a5c-f3cc0279d5ae"). InnerVolumeSpecName "kube-api-access-wcqwc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:42:05 crc kubenswrapper[4975]: I0318 13:42:05.090103 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcqwc\" (UniqueName: \"kubernetes.io/projected/d391f902-2a5a-4a37-8a5c-f3cc0279d5ae-kube-api-access-wcqwc\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:05 crc kubenswrapper[4975]: I0318 13:42:05.589356 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564022-5cd8v" event={"ID":"d391f902-2a5a-4a37-8a5c-f3cc0279d5ae","Type":"ContainerDied","Data":"4f6525e34f9f4de68d82a90c87ade115d1c1b1238ab33eb0867278ce7398f1c0"} Mar 18 13:42:05 crc kubenswrapper[4975]: I0318 13:42:05.589394 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f6525e34f9f4de68d82a90c87ade115d1c1b1238ab33eb0867278ce7398f1c0" Mar 18 13:42:05 crc kubenswrapper[4975]: I0318 13:42:05.589475 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564022-5cd8v" Mar 18 13:42:06 crc kubenswrapper[4975]: I0318 13:42:06.013588 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564016-sbwv4"] Mar 18 13:42:06 crc kubenswrapper[4975]: I0318 13:42:06.024440 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564016-sbwv4"] Mar 18 13:42:07 crc kubenswrapper[4975]: I0318 13:42:07.026381 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="759d4116-47b2-4603-87a7-26a7cb973067" path="/var/lib/kubelet/pods/759d4116-47b2-4603-87a7-26a7cb973067/volumes" Mar 18 13:42:11 crc kubenswrapper[4975]: I0318 13:42:11.579286 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-kz6mt_2a6d86bf-7828-418f-91f4-41df21916eb4/kube-rbac-proxy/0.log" Mar 18 13:42:11 crc kubenswrapper[4975]: I0318 13:42:11.801305 4975 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pslqs_eda60759-e685-41fd-9d34-6b1afdc1a8b9/cp-frr-files/0.log" Mar 18 13:42:11 crc kubenswrapper[4975]: I0318 13:42:11.802949 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-kz6mt_2a6d86bf-7828-418f-91f4-41df21916eb4/controller/0.log" Mar 18 13:42:12 crc kubenswrapper[4975]: I0318 13:42:12.007548 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pslqs_eda60759-e685-41fd-9d34-6b1afdc1a8b9/cp-reloader/0.log" Mar 18 13:42:12 crc kubenswrapper[4975]: I0318 13:42:12.039601 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pslqs_eda60759-e685-41fd-9d34-6b1afdc1a8b9/cp-reloader/0.log" Mar 18 13:42:12 crc kubenswrapper[4975]: I0318 13:42:12.044104 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pslqs_eda60759-e685-41fd-9d34-6b1afdc1a8b9/cp-frr-files/0.log" Mar 18 13:42:12 crc kubenswrapper[4975]: I0318 13:42:12.058466 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pslqs_eda60759-e685-41fd-9d34-6b1afdc1a8b9/cp-metrics/0.log" Mar 18 13:42:12 crc kubenswrapper[4975]: I0318 13:42:12.218335 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pslqs_eda60759-e685-41fd-9d34-6b1afdc1a8b9/cp-frr-files/0.log" Mar 18 13:42:12 crc kubenswrapper[4975]: I0318 13:42:12.267335 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pslqs_eda60759-e685-41fd-9d34-6b1afdc1a8b9/cp-reloader/0.log" Mar 18 13:42:12 crc kubenswrapper[4975]: I0318 13:42:12.272535 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pslqs_eda60759-e685-41fd-9d34-6b1afdc1a8b9/cp-metrics/0.log" Mar 18 13:42:12 crc kubenswrapper[4975]: I0318 13:42:12.303513 4975 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-pslqs_eda60759-e685-41fd-9d34-6b1afdc1a8b9/cp-metrics/0.log" Mar 18 13:42:12 crc kubenswrapper[4975]: I0318 13:42:12.479645 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pslqs_eda60759-e685-41fd-9d34-6b1afdc1a8b9/cp-metrics/0.log" Mar 18 13:42:12 crc kubenswrapper[4975]: I0318 13:42:12.481513 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pslqs_eda60759-e685-41fd-9d34-6b1afdc1a8b9/cp-reloader/0.log" Mar 18 13:42:12 crc kubenswrapper[4975]: I0318 13:42:12.495680 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pslqs_eda60759-e685-41fd-9d34-6b1afdc1a8b9/cp-frr-files/0.log" Mar 18 13:42:12 crc kubenswrapper[4975]: I0318 13:42:12.512528 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pslqs_eda60759-e685-41fd-9d34-6b1afdc1a8b9/controller/0.log" Mar 18 13:42:12 crc kubenswrapper[4975]: I0318 13:42:12.678322 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pslqs_eda60759-e685-41fd-9d34-6b1afdc1a8b9/frr-metrics/0.log" Mar 18 13:42:12 crc kubenswrapper[4975]: I0318 13:42:12.710136 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pslqs_eda60759-e685-41fd-9d34-6b1afdc1a8b9/kube-rbac-proxy/0.log" Mar 18 13:42:12 crc kubenswrapper[4975]: I0318 13:42:12.731417 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pslqs_eda60759-e685-41fd-9d34-6b1afdc1a8b9/kube-rbac-proxy-frr/0.log" Mar 18 13:42:12 crc kubenswrapper[4975]: I0318 13:42:12.957643 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pslqs_eda60759-e685-41fd-9d34-6b1afdc1a8b9/reloader/0.log" Mar 18 13:42:12 crc kubenswrapper[4975]: I0318 13:42:12.992643 4975 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-vqh9l_d9770b1b-8549-4f2a-967f-e2e3e36f9c6c/frr-k8s-webhook-server/0.log" Mar 18 13:42:13 crc kubenswrapper[4975]: I0318 13:42:13.191820 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-86bbb6cbf8-c5s8m_ac116d37-f5ed-40e6-b688-cdb1079a6727/manager/0.log" Mar 18 13:42:13 crc kubenswrapper[4975]: I0318 13:42:13.495489 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-nngfh_38873f7c-5cbd-48c0-ba83-0d479218b7ac/kube-rbac-proxy/0.log" Mar 18 13:42:13 crc kubenswrapper[4975]: I0318 13:42:13.522399 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-598b644cc-hmnjm_c3919877-0c39-4a6c-862f-5e448870427f/webhook-server/0.log" Mar 18 13:42:14 crc kubenswrapper[4975]: I0318 13:42:14.175684 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-nngfh_38873f7c-5cbd-48c0-ba83-0d479218b7ac/speaker/0.log" Mar 18 13:42:14 crc kubenswrapper[4975]: I0318 13:42:14.796842 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pslqs_eda60759-e685-41fd-9d34-6b1afdc1a8b9/frr/0.log" Mar 18 13:42:25 crc kubenswrapper[4975]: I0318 13:42:25.538925 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:42:25 crc kubenswrapper[4975]: I0318 13:42:25.539504 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Mar 18 13:42:25 crc kubenswrapper[4975]: I0318 13:42:25.539543 4975 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 13:42:25 crc kubenswrapper[4975]: I0318 13:42:25.790348 4975 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e4090f30bdc2c45c39660d9b590c2d973aeca4e258919be7c997a1fe3dc3e461"} pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:42:25 crc kubenswrapper[4975]: I0318 13:42:25.790419 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" containerID="cri-o://e4090f30bdc2c45c39660d9b590c2d973aeca4e258919be7c997a1fe3dc3e461" gracePeriod=600 Mar 18 13:42:26 crc kubenswrapper[4975]: I0318 13:42:26.181621 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2_128d4df9-9451-466a-a545-b916760c3c45/util/0.log" Mar 18 13:42:26 crc kubenswrapper[4975]: I0318 13:42:26.382941 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2_128d4df9-9451-466a-a545-b916760c3c45/pull/0.log" Mar 18 13:42:26 crc kubenswrapper[4975]: I0318 13:42:26.399934 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2_128d4df9-9451-466a-a545-b916760c3c45/util/0.log" Mar 18 13:42:26 crc kubenswrapper[4975]: I0318 13:42:26.400774 4975 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2_128d4df9-9451-466a-a545-b916760c3c45/pull/0.log" Mar 18 13:42:26 crc kubenswrapper[4975]: I0318 13:42:26.610218 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2_128d4df9-9451-466a-a545-b916760c3c45/util/0.log" Mar 18 13:42:26 crc kubenswrapper[4975]: I0318 13:42:26.653230 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2_128d4df9-9451-466a-a545-b916760c3c45/pull/0.log" Mar 18 13:42:26 crc kubenswrapper[4975]: I0318 13:42:26.660923 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8745qsj2_128d4df9-9451-466a-a545-b916760c3c45/extract/0.log" Mar 18 13:42:26 crc kubenswrapper[4975]: I0318 13:42:26.799401 4975 generic.go:334] "Generic (PLEG): container finished" podID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerID="e4090f30bdc2c45c39660d9b590c2d973aeca4e258919be7c997a1fe3dc3e461" exitCode=0 Mar 18 13:42:26 crc kubenswrapper[4975]: I0318 13:42:26.799450 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerDied","Data":"e4090f30bdc2c45c39660d9b590c2d973aeca4e258919be7c997a1fe3dc3e461"} Mar 18 13:42:26 crc kubenswrapper[4975]: I0318 13:42:26.799484 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerStarted","Data":"68fc7653133e311bce7ebf812edb4ed2f2ea161ea3b49916bf5f5dd47e2257b6"} Mar 18 13:42:26 crc kubenswrapper[4975]: I0318 13:42:26.799531 4975 scope.go:117] "RemoveContainer" 
containerID="e29025a785997d266fcdc32543fe4234a7b5edf85513cbccfdec8d6a22f0b519" Mar 18 13:42:26 crc kubenswrapper[4975]: I0318 13:42:26.809494 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw_b0848ced-9548-4c63-826b-12e73deed42d/util/0.log" Mar 18 13:42:26 crc kubenswrapper[4975]: I0318 13:42:26.987884 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw_b0848ced-9548-4c63-826b-12e73deed42d/util/0.log" Mar 18 13:42:26 crc kubenswrapper[4975]: I0318 13:42:26.993604 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw_b0848ced-9548-4c63-826b-12e73deed42d/pull/0.log" Mar 18 13:42:27 crc kubenswrapper[4975]: I0318 13:42:27.013482 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw_b0848ced-9548-4c63-826b-12e73deed42d/pull/0.log" Mar 18 13:42:27 crc kubenswrapper[4975]: I0318 13:42:27.163042 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw_b0848ced-9548-4c63-826b-12e73deed42d/util/0.log" Mar 18 13:42:27 crc kubenswrapper[4975]: I0318 13:42:27.175282 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw_b0848ced-9548-4c63-826b-12e73deed42d/extract/0.log" Mar 18 13:42:27 crc kubenswrapper[4975]: I0318 13:42:27.212112 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lfngw_b0848ced-9548-4c63-826b-12e73deed42d/pull/0.log" Mar 18 13:42:27 crc kubenswrapper[4975]: I0318 13:42:27.331836 4975 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bcsvd_c58752ca-f22d-49b1-ac3c-f3cafb7c26e0/extract-utilities/0.log" Mar 18 13:42:27 crc kubenswrapper[4975]: I0318 13:42:27.505385 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bcsvd_c58752ca-f22d-49b1-ac3c-f3cafb7c26e0/extract-content/0.log" Mar 18 13:42:27 crc kubenswrapper[4975]: I0318 13:42:27.505927 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bcsvd_c58752ca-f22d-49b1-ac3c-f3cafb7c26e0/extract-utilities/0.log" Mar 18 13:42:27 crc kubenswrapper[4975]: I0318 13:42:27.525583 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bcsvd_c58752ca-f22d-49b1-ac3c-f3cafb7c26e0/extract-content/0.log" Mar 18 13:42:27 crc kubenswrapper[4975]: I0318 13:42:27.695362 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bcsvd_c58752ca-f22d-49b1-ac3c-f3cafb7c26e0/extract-content/0.log" Mar 18 13:42:27 crc kubenswrapper[4975]: I0318 13:42:27.791885 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bcsvd_c58752ca-f22d-49b1-ac3c-f3cafb7c26e0/extract-utilities/0.log" Mar 18 13:42:27 crc kubenswrapper[4975]: I0318 13:42:27.906083 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ttvcz_b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65/extract-utilities/0.log" Mar 18 13:42:28 crc kubenswrapper[4975]: I0318 13:42:28.125150 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ttvcz_b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65/extract-content/0.log" Mar 18 13:42:28 crc kubenswrapper[4975]: I0318 13:42:28.134732 4975 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-ttvcz_b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65/extract-utilities/0.log" Mar 18 13:42:28 crc kubenswrapper[4975]: I0318 13:42:28.187329 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ttvcz_b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65/extract-content/0.log" Mar 18 13:42:28 crc kubenswrapper[4975]: I0318 13:42:28.371138 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ttvcz_b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65/extract-utilities/0.log" Mar 18 13:42:28 crc kubenswrapper[4975]: I0318 13:42:28.372613 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ttvcz_b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65/extract-content/0.log" Mar 18 13:42:28 crc kubenswrapper[4975]: I0318 13:42:28.608186 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-5s8vm_1a1dafc0-c705-4110-b5f5-d622a2097f64/marketplace-operator/0.log" Mar 18 13:42:28 crc kubenswrapper[4975]: I0318 13:42:28.633248 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bcsvd_c58752ca-f22d-49b1-ac3c-f3cafb7c26e0/registry-server/0.log" Mar 18 13:42:28 crc kubenswrapper[4975]: I0318 13:42:28.846801 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l8xlg_e448c251-6293-491b-8d15-cd0ebd53d468/extract-utilities/0.log" Mar 18 13:42:29 crc kubenswrapper[4975]: I0318 13:42:29.223149 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l8xlg_e448c251-6293-491b-8d15-cd0ebd53d468/extract-utilities/0.log" Mar 18 13:42:29 crc kubenswrapper[4975]: I0318 13:42:29.292515 4975 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-l8xlg_e448c251-6293-491b-8d15-cd0ebd53d468/extract-content/0.log" Mar 18 13:42:29 crc kubenswrapper[4975]: I0318 13:42:29.300252 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ttvcz_b0d9a8a6-3ff4-42c5-8cdb-4f9f7e80ba65/registry-server/0.log" Mar 18 13:42:29 crc kubenswrapper[4975]: I0318 13:42:29.397528 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l8xlg_e448c251-6293-491b-8d15-cd0ebd53d468/extract-content/0.log" Mar 18 13:42:29 crc kubenswrapper[4975]: I0318 13:42:29.620377 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l8xlg_e448c251-6293-491b-8d15-cd0ebd53d468/extract-content/0.log" Mar 18 13:42:29 crc kubenswrapper[4975]: I0318 13:42:29.629455 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l8xlg_e448c251-6293-491b-8d15-cd0ebd53d468/extract-utilities/0.log" Mar 18 13:42:29 crc kubenswrapper[4975]: I0318 13:42:29.779532 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l8xlg_e448c251-6293-491b-8d15-cd0ebd53d468/registry-server/0.log" Mar 18 13:42:29 crc kubenswrapper[4975]: I0318 13:42:29.845818 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rnjqq_1f56e42a-99c9-4b38-bcad-bd1139570888/extract-utilities/0.log" Mar 18 13:42:29 crc kubenswrapper[4975]: I0318 13:42:29.984187 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rnjqq_1f56e42a-99c9-4b38-bcad-bd1139570888/extract-content/0.log" Mar 18 13:42:29 crc kubenswrapper[4975]: I0318 13:42:29.988038 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rnjqq_1f56e42a-99c9-4b38-bcad-bd1139570888/extract-utilities/0.log" 
Mar 18 13:42:29 crc kubenswrapper[4975]: I0318 13:42:29.992562 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rnjqq_1f56e42a-99c9-4b38-bcad-bd1139570888/extract-content/0.log" Mar 18 13:42:30 crc kubenswrapper[4975]: I0318 13:42:30.215571 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rnjqq_1f56e42a-99c9-4b38-bcad-bd1139570888/extract-content/0.log" Mar 18 13:42:30 crc kubenswrapper[4975]: I0318 13:42:30.243320 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rnjqq_1f56e42a-99c9-4b38-bcad-bd1139570888/extract-utilities/0.log" Mar 18 13:42:30 crc kubenswrapper[4975]: I0318 13:42:30.948192 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rnjqq_1f56e42a-99c9-4b38-bcad-bd1139570888/registry-server/0.log" Mar 18 13:42:42 crc kubenswrapper[4975]: I0318 13:42:42.868565 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t8rsv"] Mar 18 13:42:42 crc kubenswrapper[4975]: E0318 13:42:42.869468 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d391f902-2a5a-4a37-8a5c-f3cc0279d5ae" containerName="oc" Mar 18 13:42:42 crc kubenswrapper[4975]: I0318 13:42:42.869486 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="d391f902-2a5a-4a37-8a5c-f3cc0279d5ae" containerName="oc" Mar 18 13:42:42 crc kubenswrapper[4975]: I0318 13:42:42.869661 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="d391f902-2a5a-4a37-8a5c-f3cc0279d5ae" containerName="oc" Mar 18 13:42:42 crc kubenswrapper[4975]: I0318 13:42:42.875027 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t8rsv" Mar 18 13:42:42 crc kubenswrapper[4975]: I0318 13:42:42.884027 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t8rsv"] Mar 18 13:42:43 crc kubenswrapper[4975]: I0318 13:42:43.020091 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrxqx\" (UniqueName: \"kubernetes.io/projected/71d9e4d1-9132-4146-b7ba-0f53bfb7c51f-kube-api-access-vrxqx\") pod \"community-operators-t8rsv\" (UID: \"71d9e4d1-9132-4146-b7ba-0f53bfb7c51f\") " pod="openshift-marketplace/community-operators-t8rsv" Mar 18 13:42:43 crc kubenswrapper[4975]: I0318 13:42:43.020300 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71d9e4d1-9132-4146-b7ba-0f53bfb7c51f-catalog-content\") pod \"community-operators-t8rsv\" (UID: \"71d9e4d1-9132-4146-b7ba-0f53bfb7c51f\") " pod="openshift-marketplace/community-operators-t8rsv" Mar 18 13:42:43 crc kubenswrapper[4975]: I0318 13:42:43.020413 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71d9e4d1-9132-4146-b7ba-0f53bfb7c51f-utilities\") pod \"community-operators-t8rsv\" (UID: \"71d9e4d1-9132-4146-b7ba-0f53bfb7c51f\") " pod="openshift-marketplace/community-operators-t8rsv" Mar 18 13:42:43 crc kubenswrapper[4975]: I0318 13:42:43.122436 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71d9e4d1-9132-4146-b7ba-0f53bfb7c51f-catalog-content\") pod \"community-operators-t8rsv\" (UID: \"71d9e4d1-9132-4146-b7ba-0f53bfb7c51f\") " pod="openshift-marketplace/community-operators-t8rsv" Mar 18 13:42:43 crc kubenswrapper[4975]: I0318 13:42:43.122810 4975 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71d9e4d1-9132-4146-b7ba-0f53bfb7c51f-utilities\") pod \"community-operators-t8rsv\" (UID: \"71d9e4d1-9132-4146-b7ba-0f53bfb7c51f\") " pod="openshift-marketplace/community-operators-t8rsv" Mar 18 13:42:43 crc kubenswrapper[4975]: I0318 13:42:43.122924 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrxqx\" (UniqueName: \"kubernetes.io/projected/71d9e4d1-9132-4146-b7ba-0f53bfb7c51f-kube-api-access-vrxqx\") pod \"community-operators-t8rsv\" (UID: \"71d9e4d1-9132-4146-b7ba-0f53bfb7c51f\") " pod="openshift-marketplace/community-operators-t8rsv" Mar 18 13:42:43 crc kubenswrapper[4975]: I0318 13:42:43.123087 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71d9e4d1-9132-4146-b7ba-0f53bfb7c51f-catalog-content\") pod \"community-operators-t8rsv\" (UID: \"71d9e4d1-9132-4146-b7ba-0f53bfb7c51f\") " pod="openshift-marketplace/community-operators-t8rsv" Mar 18 13:42:43 crc kubenswrapper[4975]: I0318 13:42:43.123348 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71d9e4d1-9132-4146-b7ba-0f53bfb7c51f-utilities\") pod \"community-operators-t8rsv\" (UID: \"71d9e4d1-9132-4146-b7ba-0f53bfb7c51f\") " pod="openshift-marketplace/community-operators-t8rsv" Mar 18 13:42:43 crc kubenswrapper[4975]: I0318 13:42:43.141673 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrxqx\" (UniqueName: \"kubernetes.io/projected/71d9e4d1-9132-4146-b7ba-0f53bfb7c51f-kube-api-access-vrxqx\") pod \"community-operators-t8rsv\" (UID: \"71d9e4d1-9132-4146-b7ba-0f53bfb7c51f\") " pod="openshift-marketplace/community-operators-t8rsv" Mar 18 13:42:43 crc kubenswrapper[4975]: I0318 13:42:43.194263 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t8rsv" Mar 18 13:42:43 crc kubenswrapper[4975]: I0318 13:42:43.683906 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t8rsv"] Mar 18 13:42:43 crc kubenswrapper[4975]: I0318 13:42:43.964555 4975 generic.go:334] "Generic (PLEG): container finished" podID="71d9e4d1-9132-4146-b7ba-0f53bfb7c51f" containerID="9dba5842acab42a383b445bd9a506223634a9ac8543111db052942bf22ecdabd" exitCode=0 Mar 18 13:42:43 crc kubenswrapper[4975]: I0318 13:42:43.964616 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8rsv" event={"ID":"71d9e4d1-9132-4146-b7ba-0f53bfb7c51f","Type":"ContainerDied","Data":"9dba5842acab42a383b445bd9a506223634a9ac8543111db052942bf22ecdabd"} Mar 18 13:42:43 crc kubenswrapper[4975]: I0318 13:42:43.964648 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8rsv" event={"ID":"71d9e4d1-9132-4146-b7ba-0f53bfb7c51f","Type":"ContainerStarted","Data":"1b9fadac75aba5bc684c5632f75bc5697a9619c71206e91ee76d3a0a4815029e"} Mar 18 13:42:44 crc kubenswrapper[4975]: I0318 13:42:44.978055 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8rsv" event={"ID":"71d9e4d1-9132-4146-b7ba-0f53bfb7c51f","Type":"ContainerStarted","Data":"1de0888d59e1c11de52e0b6ed53112d238483228a7e65fabbcb9ff9ac4cd5c11"} Mar 18 13:42:45 crc kubenswrapper[4975]: I0318 13:42:45.989624 4975 generic.go:334] "Generic (PLEG): container finished" podID="71d9e4d1-9132-4146-b7ba-0f53bfb7c51f" containerID="1de0888d59e1c11de52e0b6ed53112d238483228a7e65fabbcb9ff9ac4cd5c11" exitCode=0 Mar 18 13:42:45 crc kubenswrapper[4975]: I0318 13:42:45.989661 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8rsv" 
event={"ID":"71d9e4d1-9132-4146-b7ba-0f53bfb7c51f","Type":"ContainerDied","Data":"1de0888d59e1c11de52e0b6ed53112d238483228a7e65fabbcb9ff9ac4cd5c11"} Mar 18 13:42:48 crc kubenswrapper[4975]: I0318 13:42:48.017376 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8rsv" event={"ID":"71d9e4d1-9132-4146-b7ba-0f53bfb7c51f","Type":"ContainerStarted","Data":"fc73532514624c30c759a30f3354f48c3215f4eceb4e23dd15482d4d03335106"} Mar 18 13:42:48 crc kubenswrapper[4975]: I0318 13:42:48.059760 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t8rsv" podStartSLOduration=3.56338808 podStartE2EDuration="6.059685696s" podCreationTimestamp="2026-03-18 13:42:42 +0000 UTC" firstStartedPulling="2026-03-18 13:42:43.968011302 +0000 UTC m=+5549.682411881" lastFinishedPulling="2026-03-18 13:42:46.464308918 +0000 UTC m=+5552.178709497" observedRunningTime="2026-03-18 13:42:48.047690822 +0000 UTC m=+5553.762091411" watchObservedRunningTime="2026-03-18 13:42:48.059685696 +0000 UTC m=+5553.774086275" Mar 18 13:42:53 crc kubenswrapper[4975]: I0318 13:42:53.195838 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t8rsv" Mar 18 13:42:53 crc kubenswrapper[4975]: I0318 13:42:53.196368 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t8rsv" Mar 18 13:42:53 crc kubenswrapper[4975]: I0318 13:42:53.279091 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t8rsv" Mar 18 13:42:54 crc kubenswrapper[4975]: I0318 13:42:54.159580 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t8rsv" Mar 18 13:42:54 crc kubenswrapper[4975]: I0318 13:42:54.441519 4975 scope.go:117] "RemoveContainer" 
containerID="005a1e7218798b53d7ca57892234586364e80ea09dfcacd41f5b81a465fe65dd" Mar 18 13:42:55 crc kubenswrapper[4975]: I0318 13:42:55.004043 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t8rsv"] Mar 18 13:42:56 crc kubenswrapper[4975]: I0318 13:42:56.112906 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t8rsv" podUID="71d9e4d1-9132-4146-b7ba-0f53bfb7c51f" containerName="registry-server" containerID="cri-o://fc73532514624c30c759a30f3354f48c3215f4eceb4e23dd15482d4d03335106" gracePeriod=2 Mar 18 13:42:56 crc kubenswrapper[4975]: I0318 13:42:56.660553 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t8rsv" Mar 18 13:42:56 crc kubenswrapper[4975]: I0318 13:42:56.781579 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71d9e4d1-9132-4146-b7ba-0f53bfb7c51f-utilities\") pod \"71d9e4d1-9132-4146-b7ba-0f53bfb7c51f\" (UID: \"71d9e4d1-9132-4146-b7ba-0f53bfb7c51f\") " Mar 18 13:42:56 crc kubenswrapper[4975]: I0318 13:42:56.781950 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrxqx\" (UniqueName: \"kubernetes.io/projected/71d9e4d1-9132-4146-b7ba-0f53bfb7c51f-kube-api-access-vrxqx\") pod \"71d9e4d1-9132-4146-b7ba-0f53bfb7c51f\" (UID: \"71d9e4d1-9132-4146-b7ba-0f53bfb7c51f\") " Mar 18 13:42:56 crc kubenswrapper[4975]: I0318 13:42:56.782065 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71d9e4d1-9132-4146-b7ba-0f53bfb7c51f-catalog-content\") pod \"71d9e4d1-9132-4146-b7ba-0f53bfb7c51f\" (UID: \"71d9e4d1-9132-4146-b7ba-0f53bfb7c51f\") " Mar 18 13:42:56 crc kubenswrapper[4975]: I0318 13:42:56.782081 4975 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71d9e4d1-9132-4146-b7ba-0f53bfb7c51f-utilities" (OuterVolumeSpecName: "utilities") pod "71d9e4d1-9132-4146-b7ba-0f53bfb7c51f" (UID: "71d9e4d1-9132-4146-b7ba-0f53bfb7c51f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:42:56 crc kubenswrapper[4975]: I0318 13:42:56.782532 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71d9e4d1-9132-4146-b7ba-0f53bfb7c51f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:56 crc kubenswrapper[4975]: I0318 13:42:56.790847 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71d9e4d1-9132-4146-b7ba-0f53bfb7c51f-kube-api-access-vrxqx" (OuterVolumeSpecName: "kube-api-access-vrxqx") pod "71d9e4d1-9132-4146-b7ba-0f53bfb7c51f" (UID: "71d9e4d1-9132-4146-b7ba-0f53bfb7c51f"). InnerVolumeSpecName "kube-api-access-vrxqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:42:56 crc kubenswrapper[4975]: I0318 13:42:56.843543 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71d9e4d1-9132-4146-b7ba-0f53bfb7c51f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71d9e4d1-9132-4146-b7ba-0f53bfb7c51f" (UID: "71d9e4d1-9132-4146-b7ba-0f53bfb7c51f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:42:56 crc kubenswrapper[4975]: I0318 13:42:56.884288 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrxqx\" (UniqueName: \"kubernetes.io/projected/71d9e4d1-9132-4146-b7ba-0f53bfb7c51f-kube-api-access-vrxqx\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:56 crc kubenswrapper[4975]: I0318 13:42:56.884329 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71d9e4d1-9132-4146-b7ba-0f53bfb7c51f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:42:57 crc kubenswrapper[4975]: I0318 13:42:57.135937 4975 generic.go:334] "Generic (PLEG): container finished" podID="71d9e4d1-9132-4146-b7ba-0f53bfb7c51f" containerID="fc73532514624c30c759a30f3354f48c3215f4eceb4e23dd15482d4d03335106" exitCode=0 Mar 18 13:42:57 crc kubenswrapper[4975]: I0318 13:42:57.136004 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8rsv" event={"ID":"71d9e4d1-9132-4146-b7ba-0f53bfb7c51f","Type":"ContainerDied","Data":"fc73532514624c30c759a30f3354f48c3215f4eceb4e23dd15482d4d03335106"} Mar 18 13:42:57 crc kubenswrapper[4975]: I0318 13:42:57.136046 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8rsv" event={"ID":"71d9e4d1-9132-4146-b7ba-0f53bfb7c51f","Type":"ContainerDied","Data":"1b9fadac75aba5bc684c5632f75bc5697a9619c71206e91ee76d3a0a4815029e"} Mar 18 13:42:57 crc kubenswrapper[4975]: I0318 13:42:57.136073 4975 scope.go:117] "RemoveContainer" containerID="fc73532514624c30c759a30f3354f48c3215f4eceb4e23dd15482d4d03335106" Mar 18 13:42:57 crc kubenswrapper[4975]: I0318 13:42:57.136273 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t8rsv" Mar 18 13:42:57 crc kubenswrapper[4975]: I0318 13:42:57.167979 4975 scope.go:117] "RemoveContainer" containerID="1de0888d59e1c11de52e0b6ed53112d238483228a7e65fabbcb9ff9ac4cd5c11" Mar 18 13:42:57 crc kubenswrapper[4975]: I0318 13:42:57.192846 4975 scope.go:117] "RemoveContainer" containerID="9dba5842acab42a383b445bd9a506223634a9ac8543111db052942bf22ecdabd" Mar 18 13:42:57 crc kubenswrapper[4975]: I0318 13:42:57.194669 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t8rsv"] Mar 18 13:42:57 crc kubenswrapper[4975]: I0318 13:42:57.212627 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t8rsv"] Mar 18 13:42:57 crc kubenswrapper[4975]: I0318 13:42:57.243130 4975 scope.go:117] "RemoveContainer" containerID="fc73532514624c30c759a30f3354f48c3215f4eceb4e23dd15482d4d03335106" Mar 18 13:42:57 crc kubenswrapper[4975]: E0318 13:42:57.244480 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc73532514624c30c759a30f3354f48c3215f4eceb4e23dd15482d4d03335106\": container with ID starting with fc73532514624c30c759a30f3354f48c3215f4eceb4e23dd15482d4d03335106 not found: ID does not exist" containerID="fc73532514624c30c759a30f3354f48c3215f4eceb4e23dd15482d4d03335106" Mar 18 13:42:57 crc kubenswrapper[4975]: I0318 13:42:57.244521 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc73532514624c30c759a30f3354f48c3215f4eceb4e23dd15482d4d03335106"} err="failed to get container status \"fc73532514624c30c759a30f3354f48c3215f4eceb4e23dd15482d4d03335106\": rpc error: code = NotFound desc = could not find container \"fc73532514624c30c759a30f3354f48c3215f4eceb4e23dd15482d4d03335106\": container with ID starting with fc73532514624c30c759a30f3354f48c3215f4eceb4e23dd15482d4d03335106 not 
found: ID does not exist" Mar 18 13:42:57 crc kubenswrapper[4975]: I0318 13:42:57.244545 4975 scope.go:117] "RemoveContainer" containerID="1de0888d59e1c11de52e0b6ed53112d238483228a7e65fabbcb9ff9ac4cd5c11" Mar 18 13:42:57 crc kubenswrapper[4975]: E0318 13:42:57.244813 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1de0888d59e1c11de52e0b6ed53112d238483228a7e65fabbcb9ff9ac4cd5c11\": container with ID starting with 1de0888d59e1c11de52e0b6ed53112d238483228a7e65fabbcb9ff9ac4cd5c11 not found: ID does not exist" containerID="1de0888d59e1c11de52e0b6ed53112d238483228a7e65fabbcb9ff9ac4cd5c11" Mar 18 13:42:57 crc kubenswrapper[4975]: I0318 13:42:57.244836 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1de0888d59e1c11de52e0b6ed53112d238483228a7e65fabbcb9ff9ac4cd5c11"} err="failed to get container status \"1de0888d59e1c11de52e0b6ed53112d238483228a7e65fabbcb9ff9ac4cd5c11\": rpc error: code = NotFound desc = could not find container \"1de0888d59e1c11de52e0b6ed53112d238483228a7e65fabbcb9ff9ac4cd5c11\": container with ID starting with 1de0888d59e1c11de52e0b6ed53112d238483228a7e65fabbcb9ff9ac4cd5c11 not found: ID does not exist" Mar 18 13:42:57 crc kubenswrapper[4975]: I0318 13:42:57.244849 4975 scope.go:117] "RemoveContainer" containerID="9dba5842acab42a383b445bd9a506223634a9ac8543111db052942bf22ecdabd" Mar 18 13:42:57 crc kubenswrapper[4975]: E0318 13:42:57.245078 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dba5842acab42a383b445bd9a506223634a9ac8543111db052942bf22ecdabd\": container with ID starting with 9dba5842acab42a383b445bd9a506223634a9ac8543111db052942bf22ecdabd not found: ID does not exist" containerID="9dba5842acab42a383b445bd9a506223634a9ac8543111db052942bf22ecdabd" Mar 18 13:42:57 crc kubenswrapper[4975]: I0318 13:42:57.245105 4975 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dba5842acab42a383b445bd9a506223634a9ac8543111db052942bf22ecdabd"} err="failed to get container status \"9dba5842acab42a383b445bd9a506223634a9ac8543111db052942bf22ecdabd\": rpc error: code = NotFound desc = could not find container \"9dba5842acab42a383b445bd9a506223634a9ac8543111db052942bf22ecdabd\": container with ID starting with 9dba5842acab42a383b445bd9a506223634a9ac8543111db052942bf22ecdabd not found: ID does not exist" Mar 18 13:42:59 crc kubenswrapper[4975]: I0318 13:42:59.028423 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71d9e4d1-9132-4146-b7ba-0f53bfb7c51f" path="/var/lib/kubelet/pods/71d9e4d1-9132-4146-b7ba-0f53bfb7c51f/volumes" Mar 18 13:42:59 crc kubenswrapper[4975]: E0318 13:42:59.294584 4975 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.9:44870->38.129.56.9:46815: write tcp 38.129.56.9:44870->38.129.56.9:46815: write: broken pipe Mar 18 13:43:50 crc kubenswrapper[4975]: I0318 13:43:50.056721 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wl9mf"] Mar 18 13:43:50 crc kubenswrapper[4975]: E0318 13:43:50.057844 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71d9e4d1-9132-4146-b7ba-0f53bfb7c51f" containerName="registry-server" Mar 18 13:43:50 crc kubenswrapper[4975]: I0318 13:43:50.057877 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="71d9e4d1-9132-4146-b7ba-0f53bfb7c51f" containerName="registry-server" Mar 18 13:43:50 crc kubenswrapper[4975]: E0318 13:43:50.057899 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71d9e4d1-9132-4146-b7ba-0f53bfb7c51f" containerName="extract-utilities" Mar 18 13:43:50 crc kubenswrapper[4975]: I0318 13:43:50.057907 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="71d9e4d1-9132-4146-b7ba-0f53bfb7c51f" containerName="extract-utilities" Mar 18 
13:43:50 crc kubenswrapper[4975]: E0318 13:43:50.057927 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71d9e4d1-9132-4146-b7ba-0f53bfb7c51f" containerName="extract-content" Mar 18 13:43:50 crc kubenswrapper[4975]: I0318 13:43:50.057935 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="71d9e4d1-9132-4146-b7ba-0f53bfb7c51f" containerName="extract-content" Mar 18 13:43:50 crc kubenswrapper[4975]: I0318 13:43:50.058151 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="71d9e4d1-9132-4146-b7ba-0f53bfb7c51f" containerName="registry-server" Mar 18 13:43:50 crc kubenswrapper[4975]: I0318 13:43:50.059817 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wl9mf" Mar 18 13:43:50 crc kubenswrapper[4975]: I0318 13:43:50.070307 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjjwj\" (UniqueName: \"kubernetes.io/projected/0b56e320-a50d-4cf2-a6a8-400ba8e45d87-kube-api-access-xjjwj\") pod \"redhat-marketplace-wl9mf\" (UID: \"0b56e320-a50d-4cf2-a6a8-400ba8e45d87\") " pod="openshift-marketplace/redhat-marketplace-wl9mf" Mar 18 13:43:50 crc kubenswrapper[4975]: I0318 13:43:50.070437 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b56e320-a50d-4cf2-a6a8-400ba8e45d87-catalog-content\") pod \"redhat-marketplace-wl9mf\" (UID: \"0b56e320-a50d-4cf2-a6a8-400ba8e45d87\") " pod="openshift-marketplace/redhat-marketplace-wl9mf" Mar 18 13:43:50 crc kubenswrapper[4975]: I0318 13:43:50.070765 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b56e320-a50d-4cf2-a6a8-400ba8e45d87-utilities\") pod \"redhat-marketplace-wl9mf\" (UID: \"0b56e320-a50d-4cf2-a6a8-400ba8e45d87\") " 
pod="openshift-marketplace/redhat-marketplace-wl9mf" Mar 18 13:43:50 crc kubenswrapper[4975]: I0318 13:43:50.084944 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wl9mf"] Mar 18 13:43:50 crc kubenswrapper[4975]: I0318 13:43:50.176737 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjjwj\" (UniqueName: \"kubernetes.io/projected/0b56e320-a50d-4cf2-a6a8-400ba8e45d87-kube-api-access-xjjwj\") pod \"redhat-marketplace-wl9mf\" (UID: \"0b56e320-a50d-4cf2-a6a8-400ba8e45d87\") " pod="openshift-marketplace/redhat-marketplace-wl9mf" Mar 18 13:43:50 crc kubenswrapper[4975]: I0318 13:43:50.176841 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b56e320-a50d-4cf2-a6a8-400ba8e45d87-catalog-content\") pod \"redhat-marketplace-wl9mf\" (UID: \"0b56e320-a50d-4cf2-a6a8-400ba8e45d87\") " pod="openshift-marketplace/redhat-marketplace-wl9mf" Mar 18 13:43:50 crc kubenswrapper[4975]: I0318 13:43:50.177011 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b56e320-a50d-4cf2-a6a8-400ba8e45d87-utilities\") pod \"redhat-marketplace-wl9mf\" (UID: \"0b56e320-a50d-4cf2-a6a8-400ba8e45d87\") " pod="openshift-marketplace/redhat-marketplace-wl9mf" Mar 18 13:43:50 crc kubenswrapper[4975]: I0318 13:43:50.177957 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b56e320-a50d-4cf2-a6a8-400ba8e45d87-utilities\") pod \"redhat-marketplace-wl9mf\" (UID: \"0b56e320-a50d-4cf2-a6a8-400ba8e45d87\") " pod="openshift-marketplace/redhat-marketplace-wl9mf" Mar 18 13:43:50 crc kubenswrapper[4975]: I0318 13:43:50.191686 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0b56e320-a50d-4cf2-a6a8-400ba8e45d87-catalog-content\") pod \"redhat-marketplace-wl9mf\" (UID: \"0b56e320-a50d-4cf2-a6a8-400ba8e45d87\") " pod="openshift-marketplace/redhat-marketplace-wl9mf" Mar 18 13:43:50 crc kubenswrapper[4975]: I0318 13:43:50.210512 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjjwj\" (UniqueName: \"kubernetes.io/projected/0b56e320-a50d-4cf2-a6a8-400ba8e45d87-kube-api-access-xjjwj\") pod \"redhat-marketplace-wl9mf\" (UID: \"0b56e320-a50d-4cf2-a6a8-400ba8e45d87\") " pod="openshift-marketplace/redhat-marketplace-wl9mf" Mar 18 13:43:50 crc kubenswrapper[4975]: I0318 13:43:50.384737 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wl9mf" Mar 18 13:43:50 crc kubenswrapper[4975]: I0318 13:43:50.881360 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wl9mf"] Mar 18 13:43:51 crc kubenswrapper[4975]: I0318 13:43:51.655110 4975 generic.go:334] "Generic (PLEG): container finished" podID="0b56e320-a50d-4cf2-a6a8-400ba8e45d87" containerID="49a59e0374ae9a8e714d3840e253ab9d97b3436d35d01e89a7dfd74f7f831e43" exitCode=0 Mar 18 13:43:51 crc kubenswrapper[4975]: I0318 13:43:51.655198 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl9mf" event={"ID":"0b56e320-a50d-4cf2-a6a8-400ba8e45d87","Type":"ContainerDied","Data":"49a59e0374ae9a8e714d3840e253ab9d97b3436d35d01e89a7dfd74f7f831e43"} Mar 18 13:43:51 crc kubenswrapper[4975]: I0318 13:43:51.655542 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl9mf" event={"ID":"0b56e320-a50d-4cf2-a6a8-400ba8e45d87","Type":"ContainerStarted","Data":"8268d8b2fcd5d38c86803dd47c269b34da0f48a685300bb633a093485fa7f5ee"} Mar 18 13:43:51 crc kubenswrapper[4975]: I0318 13:43:51.658650 4975 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Mar 18 13:43:53 crc kubenswrapper[4975]: I0318 13:43:53.673378 4975 generic.go:334] "Generic (PLEG): container finished" podID="0b56e320-a50d-4cf2-a6a8-400ba8e45d87" containerID="f8c79b93a20b1940e87ecba5ebc5af1010f716dd5520421976394eb7dd4f4bed" exitCode=0 Mar 18 13:43:53 crc kubenswrapper[4975]: I0318 13:43:53.674509 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl9mf" event={"ID":"0b56e320-a50d-4cf2-a6a8-400ba8e45d87","Type":"ContainerDied","Data":"f8c79b93a20b1940e87ecba5ebc5af1010f716dd5520421976394eb7dd4f4bed"} Mar 18 13:43:54 crc kubenswrapper[4975]: I0318 13:43:54.701667 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl9mf" event={"ID":"0b56e320-a50d-4cf2-a6a8-400ba8e45d87","Type":"ContainerStarted","Data":"0f7657d9d31868671ed0d686a754c0f343b03025e10e254a816b76a063963b01"} Mar 18 13:43:54 crc kubenswrapper[4975]: I0318 13:43:54.722911 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wl9mf" podStartSLOduration=1.9678010540000002 podStartE2EDuration="4.722885394s" podCreationTimestamp="2026-03-18 13:43:50 +0000 UTC" firstStartedPulling="2026-03-18 13:43:51.658283708 +0000 UTC m=+5617.372684297" lastFinishedPulling="2026-03-18 13:43:54.413368058 +0000 UTC m=+5620.127768637" observedRunningTime="2026-03-18 13:43:54.716125931 +0000 UTC m=+5620.430526530" watchObservedRunningTime="2026-03-18 13:43:54.722885394 +0000 UTC m=+5620.437285973" Mar 18 13:44:00 crc kubenswrapper[4975]: I0318 13:44:00.153932 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564024-r7xsg"] Mar 18 13:44:00 crc kubenswrapper[4975]: I0318 13:44:00.163230 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564024-r7xsg" Mar 18 13:44:00 crc kubenswrapper[4975]: I0318 13:44:00.169509 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 13:44:00 crc kubenswrapper[4975]: I0318 13:44:00.169674 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 13:44:00 crc kubenswrapper[4975]: I0318 13:44:00.169819 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8" Mar 18 13:44:00 crc kubenswrapper[4975]: I0318 13:44:00.172371 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564024-r7xsg"] Mar 18 13:44:00 crc kubenswrapper[4975]: I0318 13:44:00.183123 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9wbd\" (UniqueName: \"kubernetes.io/projected/4eaab4a7-c1e6-4829-8a8f-ea44b17c7179-kube-api-access-g9wbd\") pod \"auto-csr-approver-29564024-r7xsg\" (UID: \"4eaab4a7-c1e6-4829-8a8f-ea44b17c7179\") " pod="openshift-infra/auto-csr-approver-29564024-r7xsg" Mar 18 13:44:00 crc kubenswrapper[4975]: I0318 13:44:00.285090 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9wbd\" (UniqueName: \"kubernetes.io/projected/4eaab4a7-c1e6-4829-8a8f-ea44b17c7179-kube-api-access-g9wbd\") pod \"auto-csr-approver-29564024-r7xsg\" (UID: \"4eaab4a7-c1e6-4829-8a8f-ea44b17c7179\") " pod="openshift-infra/auto-csr-approver-29564024-r7xsg" Mar 18 13:44:00 crc kubenswrapper[4975]: I0318 13:44:00.316805 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9wbd\" (UniqueName: \"kubernetes.io/projected/4eaab4a7-c1e6-4829-8a8f-ea44b17c7179-kube-api-access-g9wbd\") pod \"auto-csr-approver-29564024-r7xsg\" (UID: \"4eaab4a7-c1e6-4829-8a8f-ea44b17c7179\") " 
pod="openshift-infra/auto-csr-approver-29564024-r7xsg" Mar 18 13:44:00 crc kubenswrapper[4975]: I0318 13:44:00.385075 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wl9mf" Mar 18 13:44:00 crc kubenswrapper[4975]: I0318 13:44:00.385349 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wl9mf" Mar 18 13:44:00 crc kubenswrapper[4975]: I0318 13:44:00.433122 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wl9mf" Mar 18 13:44:00 crc kubenswrapper[4975]: I0318 13:44:00.492229 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564024-r7xsg" Mar 18 13:44:00 crc kubenswrapper[4975]: I0318 13:44:00.805326 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wl9mf" Mar 18 13:44:00 crc kubenswrapper[4975]: I0318 13:44:00.857238 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wl9mf"] Mar 18 13:44:00 crc kubenswrapper[4975]: I0318 13:44:00.922197 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564024-r7xsg"] Mar 18 13:44:01 crc kubenswrapper[4975]: I0318 13:44:01.760485 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564024-r7xsg" event={"ID":"4eaab4a7-c1e6-4829-8a8f-ea44b17c7179","Type":"ContainerStarted","Data":"a230901b1ef94da32451c68a864c3d2ab7ec70f780b827e993e80aaa6886eaf9"} Mar 18 13:44:02 crc kubenswrapper[4975]: I0318 13:44:02.771565 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wl9mf" podUID="0b56e320-a50d-4cf2-a6a8-400ba8e45d87" containerName="registry-server" 
containerID="cri-o://0f7657d9d31868671ed0d686a754c0f343b03025e10e254a816b76a063963b01" gracePeriod=2 Mar 18 13:44:02 crc kubenswrapper[4975]: I0318 13:44:02.772084 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564024-r7xsg" event={"ID":"4eaab4a7-c1e6-4829-8a8f-ea44b17c7179","Type":"ContainerStarted","Data":"774057f862f5fae61ed347dfa0b056c6a5770877ca67cf9209ab4a36ab9351e7"} Mar 18 13:44:02 crc kubenswrapper[4975]: I0318 13:44:02.792791 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564024-r7xsg" podStartSLOduration=1.233994187 podStartE2EDuration="2.792774446s" podCreationTimestamp="2026-03-18 13:44:00 +0000 UTC" firstStartedPulling="2026-03-18 13:44:00.93217522 +0000 UTC m=+5626.646575819" lastFinishedPulling="2026-03-18 13:44:02.490955499 +0000 UTC m=+5628.205356078" observedRunningTime="2026-03-18 13:44:02.784521723 +0000 UTC m=+5628.498922302" watchObservedRunningTime="2026-03-18 13:44:02.792774446 +0000 UTC m=+5628.507175025" Mar 18 13:44:03 crc kubenswrapper[4975]: I0318 13:44:03.759905 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wl9mf" Mar 18 13:44:03 crc kubenswrapper[4975]: I0318 13:44:03.772014 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjjwj\" (UniqueName: \"kubernetes.io/projected/0b56e320-a50d-4cf2-a6a8-400ba8e45d87-kube-api-access-xjjwj\") pod \"0b56e320-a50d-4cf2-a6a8-400ba8e45d87\" (UID: \"0b56e320-a50d-4cf2-a6a8-400ba8e45d87\") " Mar 18 13:44:03 crc kubenswrapper[4975]: I0318 13:44:03.779827 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b56e320-a50d-4cf2-a6a8-400ba8e45d87-kube-api-access-xjjwj" (OuterVolumeSpecName: "kube-api-access-xjjwj") pod "0b56e320-a50d-4cf2-a6a8-400ba8e45d87" (UID: "0b56e320-a50d-4cf2-a6a8-400ba8e45d87"). 
InnerVolumeSpecName "kube-api-access-xjjwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:44:03 crc kubenswrapper[4975]: I0318 13:44:03.782881 4975 generic.go:334] "Generic (PLEG): container finished" podID="0b56e320-a50d-4cf2-a6a8-400ba8e45d87" containerID="0f7657d9d31868671ed0d686a754c0f343b03025e10e254a816b76a063963b01" exitCode=0 Mar 18 13:44:03 crc kubenswrapper[4975]: I0318 13:44:03.782978 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl9mf" event={"ID":"0b56e320-a50d-4cf2-a6a8-400ba8e45d87","Type":"ContainerDied","Data":"0f7657d9d31868671ed0d686a754c0f343b03025e10e254a816b76a063963b01"} Mar 18 13:44:03 crc kubenswrapper[4975]: I0318 13:44:03.783010 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wl9mf" event={"ID":"0b56e320-a50d-4cf2-a6a8-400ba8e45d87","Type":"ContainerDied","Data":"8268d8b2fcd5d38c86803dd47c269b34da0f48a685300bb633a093485fa7f5ee"} Mar 18 13:44:03 crc kubenswrapper[4975]: I0318 13:44:03.783064 4975 scope.go:117] "RemoveContainer" containerID="0f7657d9d31868671ed0d686a754c0f343b03025e10e254a816b76a063963b01" Mar 18 13:44:03 crc kubenswrapper[4975]: I0318 13:44:03.783298 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wl9mf" Mar 18 13:44:03 crc kubenswrapper[4975]: I0318 13:44:03.791320 4975 generic.go:334] "Generic (PLEG): container finished" podID="4eaab4a7-c1e6-4829-8a8f-ea44b17c7179" containerID="774057f862f5fae61ed347dfa0b056c6a5770877ca67cf9209ab4a36ab9351e7" exitCode=0 Mar 18 13:44:03 crc kubenswrapper[4975]: I0318 13:44:03.791375 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564024-r7xsg" event={"ID":"4eaab4a7-c1e6-4829-8a8f-ea44b17c7179","Type":"ContainerDied","Data":"774057f862f5fae61ed347dfa0b056c6a5770877ca67cf9209ab4a36ab9351e7"} Mar 18 13:44:03 crc kubenswrapper[4975]: I0318 13:44:03.853553 4975 scope.go:117] "RemoveContainer" containerID="f8c79b93a20b1940e87ecba5ebc5af1010f716dd5520421976394eb7dd4f4bed" Mar 18 13:44:03 crc kubenswrapper[4975]: I0318 13:44:03.870763 4975 scope.go:117] "RemoveContainer" containerID="49a59e0374ae9a8e714d3840e253ab9d97b3436d35d01e89a7dfd74f7f831e43" Mar 18 13:44:03 crc kubenswrapper[4975]: I0318 13:44:03.873192 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b56e320-a50d-4cf2-a6a8-400ba8e45d87-catalog-content\") pod \"0b56e320-a50d-4cf2-a6a8-400ba8e45d87\" (UID: \"0b56e320-a50d-4cf2-a6a8-400ba8e45d87\") " Mar 18 13:44:03 crc kubenswrapper[4975]: I0318 13:44:03.873247 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b56e320-a50d-4cf2-a6a8-400ba8e45d87-utilities\") pod \"0b56e320-a50d-4cf2-a6a8-400ba8e45d87\" (UID: \"0b56e320-a50d-4cf2-a6a8-400ba8e45d87\") " Mar 18 13:44:03 crc kubenswrapper[4975]: I0318 13:44:03.873674 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjjwj\" (UniqueName: \"kubernetes.io/projected/0b56e320-a50d-4cf2-a6a8-400ba8e45d87-kube-api-access-xjjwj\") on node \"crc\" DevicePath \"\"" Mar 
18 13:44:03 crc kubenswrapper[4975]: I0318 13:44:03.874025 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b56e320-a50d-4cf2-a6a8-400ba8e45d87-utilities" (OuterVolumeSpecName: "utilities") pod "0b56e320-a50d-4cf2-a6a8-400ba8e45d87" (UID: "0b56e320-a50d-4cf2-a6a8-400ba8e45d87"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:44:03 crc kubenswrapper[4975]: I0318 13:44:03.920561 4975 scope.go:117] "RemoveContainer" containerID="0f7657d9d31868671ed0d686a754c0f343b03025e10e254a816b76a063963b01" Mar 18 13:44:03 crc kubenswrapper[4975]: E0318 13:44:03.921157 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f7657d9d31868671ed0d686a754c0f343b03025e10e254a816b76a063963b01\": container with ID starting with 0f7657d9d31868671ed0d686a754c0f343b03025e10e254a816b76a063963b01 not found: ID does not exist" containerID="0f7657d9d31868671ed0d686a754c0f343b03025e10e254a816b76a063963b01" Mar 18 13:44:03 crc kubenswrapper[4975]: I0318 13:44:03.921226 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f7657d9d31868671ed0d686a754c0f343b03025e10e254a816b76a063963b01"} err="failed to get container status \"0f7657d9d31868671ed0d686a754c0f343b03025e10e254a816b76a063963b01\": rpc error: code = NotFound desc = could not find container \"0f7657d9d31868671ed0d686a754c0f343b03025e10e254a816b76a063963b01\": container with ID starting with 0f7657d9d31868671ed0d686a754c0f343b03025e10e254a816b76a063963b01 not found: ID does not exist" Mar 18 13:44:03 crc kubenswrapper[4975]: I0318 13:44:03.921262 4975 scope.go:117] "RemoveContainer" containerID="f8c79b93a20b1940e87ecba5ebc5af1010f716dd5520421976394eb7dd4f4bed" Mar 18 13:44:03 crc kubenswrapper[4975]: E0318 13:44:03.921584 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"f8c79b93a20b1940e87ecba5ebc5af1010f716dd5520421976394eb7dd4f4bed\": container with ID starting with f8c79b93a20b1940e87ecba5ebc5af1010f716dd5520421976394eb7dd4f4bed not found: ID does not exist" containerID="f8c79b93a20b1940e87ecba5ebc5af1010f716dd5520421976394eb7dd4f4bed" Mar 18 13:44:03 crc kubenswrapper[4975]: I0318 13:44:03.921614 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8c79b93a20b1940e87ecba5ebc5af1010f716dd5520421976394eb7dd4f4bed"} err="failed to get container status \"f8c79b93a20b1940e87ecba5ebc5af1010f716dd5520421976394eb7dd4f4bed\": rpc error: code = NotFound desc = could not find container \"f8c79b93a20b1940e87ecba5ebc5af1010f716dd5520421976394eb7dd4f4bed\": container with ID starting with f8c79b93a20b1940e87ecba5ebc5af1010f716dd5520421976394eb7dd4f4bed not found: ID does not exist" Mar 18 13:44:03 crc kubenswrapper[4975]: I0318 13:44:03.921631 4975 scope.go:117] "RemoveContainer" containerID="49a59e0374ae9a8e714d3840e253ab9d97b3436d35d01e89a7dfd74f7f831e43" Mar 18 13:44:03 crc kubenswrapper[4975]: E0318 13:44:03.921853 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49a59e0374ae9a8e714d3840e253ab9d97b3436d35d01e89a7dfd74f7f831e43\": container with ID starting with 49a59e0374ae9a8e714d3840e253ab9d97b3436d35d01e89a7dfd74f7f831e43 not found: ID does not exist" containerID="49a59e0374ae9a8e714d3840e253ab9d97b3436d35d01e89a7dfd74f7f831e43" Mar 18 13:44:03 crc kubenswrapper[4975]: I0318 13:44:03.921896 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49a59e0374ae9a8e714d3840e253ab9d97b3436d35d01e89a7dfd74f7f831e43"} err="failed to get container status \"49a59e0374ae9a8e714d3840e253ab9d97b3436d35d01e89a7dfd74f7f831e43\": rpc error: code = NotFound desc = could not find container 
\"49a59e0374ae9a8e714d3840e253ab9d97b3436d35d01e89a7dfd74f7f831e43\": container with ID starting with 49a59e0374ae9a8e714d3840e253ab9d97b3436d35d01e89a7dfd74f7f831e43 not found: ID does not exist" Mar 18 13:44:03 crc kubenswrapper[4975]: I0318 13:44:03.975672 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b56e320-a50d-4cf2-a6a8-400ba8e45d87-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:44:04 crc kubenswrapper[4975]: I0318 13:44:04.021848 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b56e320-a50d-4cf2-a6a8-400ba8e45d87-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b56e320-a50d-4cf2-a6a8-400ba8e45d87" (UID: "0b56e320-a50d-4cf2-a6a8-400ba8e45d87"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:44:04 crc kubenswrapper[4975]: I0318 13:44:04.077017 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b56e320-a50d-4cf2-a6a8-400ba8e45d87-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:44:04 crc kubenswrapper[4975]: I0318 13:44:04.125603 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wl9mf"] Mar 18 13:44:04 crc kubenswrapper[4975]: I0318 13:44:04.137124 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wl9mf"] Mar 18 13:44:05 crc kubenswrapper[4975]: I0318 13:44:05.028637 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b56e320-a50d-4cf2-a6a8-400ba8e45d87" path="/var/lib/kubelet/pods/0b56e320-a50d-4cf2-a6a8-400ba8e45d87/volumes" Mar 18 13:44:05 crc kubenswrapper[4975]: I0318 13:44:05.093114 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564024-r7xsg" Mar 18 13:44:05 crc kubenswrapper[4975]: I0318 13:44:05.200074 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9wbd\" (UniqueName: \"kubernetes.io/projected/4eaab4a7-c1e6-4829-8a8f-ea44b17c7179-kube-api-access-g9wbd\") pod \"4eaab4a7-c1e6-4829-8a8f-ea44b17c7179\" (UID: \"4eaab4a7-c1e6-4829-8a8f-ea44b17c7179\") " Mar 18 13:44:05 crc kubenswrapper[4975]: I0318 13:44:05.206943 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eaab4a7-c1e6-4829-8a8f-ea44b17c7179-kube-api-access-g9wbd" (OuterVolumeSpecName: "kube-api-access-g9wbd") pod "4eaab4a7-c1e6-4829-8a8f-ea44b17c7179" (UID: "4eaab4a7-c1e6-4829-8a8f-ea44b17c7179"). InnerVolumeSpecName "kube-api-access-g9wbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:44:05 crc kubenswrapper[4975]: I0318 13:44:05.302799 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9wbd\" (UniqueName: \"kubernetes.io/projected/4eaab4a7-c1e6-4829-8a8f-ea44b17c7179-kube-api-access-g9wbd\") on node \"crc\" DevicePath \"\"" Mar 18 13:44:05 crc kubenswrapper[4975]: I0318 13:44:05.820582 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564024-r7xsg" event={"ID":"4eaab4a7-c1e6-4829-8a8f-ea44b17c7179","Type":"ContainerDied","Data":"a230901b1ef94da32451c68a864c3d2ab7ec70f780b827e993e80aaa6886eaf9"} Mar 18 13:44:05 crc kubenswrapper[4975]: I0318 13:44:05.820631 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a230901b1ef94da32451c68a864c3d2ab7ec70f780b827e993e80aaa6886eaf9" Mar 18 13:44:05 crc kubenswrapper[4975]: I0318 13:44:05.820718 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564024-r7xsg" Mar 18 13:44:05 crc kubenswrapper[4975]: I0318 13:44:05.876521 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564018-pgqzn"] Mar 18 13:44:05 crc kubenswrapper[4975]: I0318 13:44:05.886708 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564018-pgqzn"] Mar 18 13:44:07 crc kubenswrapper[4975]: I0318 13:44:07.029331 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8a6a6fc-a2e8-44dc-ba80-6c63e9e74ad8" path="/var/lib/kubelet/pods/c8a6a6fc-a2e8-44dc-ba80-6c63e9e74ad8/volumes" Mar 18 13:44:20 crc kubenswrapper[4975]: I0318 13:44:20.964359 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9wn9r"] Mar 18 13:44:20 crc kubenswrapper[4975]: E0318 13:44:20.965409 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b56e320-a50d-4cf2-a6a8-400ba8e45d87" containerName="registry-server" Mar 18 13:44:20 crc kubenswrapper[4975]: I0318 13:44:20.965426 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b56e320-a50d-4cf2-a6a8-400ba8e45d87" containerName="registry-server" Mar 18 13:44:20 crc kubenswrapper[4975]: E0318 13:44:20.965441 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b56e320-a50d-4cf2-a6a8-400ba8e45d87" containerName="extract-utilities" Mar 18 13:44:20 crc kubenswrapper[4975]: I0318 13:44:20.965447 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b56e320-a50d-4cf2-a6a8-400ba8e45d87" containerName="extract-utilities" Mar 18 13:44:20 crc kubenswrapper[4975]: E0318 13:44:20.965461 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b56e320-a50d-4cf2-a6a8-400ba8e45d87" containerName="extract-content" Mar 18 13:44:20 crc kubenswrapper[4975]: I0318 13:44:20.965467 4975 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0b56e320-a50d-4cf2-a6a8-400ba8e45d87" containerName="extract-content" Mar 18 13:44:20 crc kubenswrapper[4975]: E0318 13:44:20.965494 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eaab4a7-c1e6-4829-8a8f-ea44b17c7179" containerName="oc" Mar 18 13:44:20 crc kubenswrapper[4975]: I0318 13:44:20.965502 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eaab4a7-c1e6-4829-8a8f-ea44b17c7179" containerName="oc" Mar 18 13:44:20 crc kubenswrapper[4975]: I0318 13:44:20.965703 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eaab4a7-c1e6-4829-8a8f-ea44b17c7179" containerName="oc" Mar 18 13:44:20 crc kubenswrapper[4975]: I0318 13:44:20.965736 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b56e320-a50d-4cf2-a6a8-400ba8e45d87" containerName="registry-server" Mar 18 13:44:20 crc kubenswrapper[4975]: I0318 13:44:20.969247 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9wn9r" Mar 18 13:44:21 crc kubenswrapper[4975]: I0318 13:44:21.004516 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9wn9r"] Mar 18 13:44:21 crc kubenswrapper[4975]: I0318 13:44:21.117994 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1-utilities\") pod \"certified-operators-9wn9r\" (UID: \"1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1\") " pod="openshift-marketplace/certified-operators-9wn9r" Mar 18 13:44:21 crc kubenswrapper[4975]: I0318 13:44:21.118532 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1-catalog-content\") pod \"certified-operators-9wn9r\" (UID: \"1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1\") " 
pod="openshift-marketplace/certified-operators-9wn9r" Mar 18 13:44:21 crc kubenswrapper[4975]: I0318 13:44:21.119071 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrc5f\" (UniqueName: \"kubernetes.io/projected/1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1-kube-api-access-vrc5f\") pod \"certified-operators-9wn9r\" (UID: \"1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1\") " pod="openshift-marketplace/certified-operators-9wn9r" Mar 18 13:44:21 crc kubenswrapper[4975]: I0318 13:44:21.221398 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1-utilities\") pod \"certified-operators-9wn9r\" (UID: \"1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1\") " pod="openshift-marketplace/certified-operators-9wn9r" Mar 18 13:44:21 crc kubenswrapper[4975]: I0318 13:44:21.221547 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1-catalog-content\") pod \"certified-operators-9wn9r\" (UID: \"1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1\") " pod="openshift-marketplace/certified-operators-9wn9r" Mar 18 13:44:21 crc kubenswrapper[4975]: I0318 13:44:21.221619 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrc5f\" (UniqueName: \"kubernetes.io/projected/1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1-kube-api-access-vrc5f\") pod \"certified-operators-9wn9r\" (UID: \"1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1\") " pod="openshift-marketplace/certified-operators-9wn9r" Mar 18 13:44:21 crc kubenswrapper[4975]: I0318 13:44:21.222182 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1-utilities\") pod \"certified-operators-9wn9r\" (UID: \"1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1\") " 
pod="openshift-marketplace/certified-operators-9wn9r" Mar 18 13:44:21 crc kubenswrapper[4975]: I0318 13:44:21.222550 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1-catalog-content\") pod \"certified-operators-9wn9r\" (UID: \"1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1\") " pod="openshift-marketplace/certified-operators-9wn9r" Mar 18 13:44:21 crc kubenswrapper[4975]: I0318 13:44:21.246970 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrc5f\" (UniqueName: \"kubernetes.io/projected/1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1-kube-api-access-vrc5f\") pod \"certified-operators-9wn9r\" (UID: \"1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1\") " pod="openshift-marketplace/certified-operators-9wn9r" Mar 18 13:44:21 crc kubenswrapper[4975]: I0318 13:44:21.315491 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9wn9r" Mar 18 13:44:21 crc kubenswrapper[4975]: W0318 13:44:21.852682 4975 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dcf780a_a93d_4dac_bd09_eddc1f5d4bd1.slice/crio-86af98bd3ba70018442e2d452c083ea6354ab9a91a93d1ac232f9ebfeda2ec53 WatchSource:0}: Error finding container 86af98bd3ba70018442e2d452c083ea6354ab9a91a93d1ac232f9ebfeda2ec53: Status 404 returned error can't find the container with id 86af98bd3ba70018442e2d452c083ea6354ab9a91a93d1ac232f9ebfeda2ec53 Mar 18 13:44:21 crc kubenswrapper[4975]: I0318 13:44:21.859186 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9wn9r"] Mar 18 13:44:21 crc kubenswrapper[4975]: I0318 13:44:21.978990 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wn9r" 
event={"ID":"1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1","Type":"ContainerStarted","Data":"86af98bd3ba70018442e2d452c083ea6354ab9a91a93d1ac232f9ebfeda2ec53"} Mar 18 13:44:22 crc kubenswrapper[4975]: I0318 13:44:22.988942 4975 generic.go:334] "Generic (PLEG): container finished" podID="1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1" containerID="c3e3176c7b4ff9be8e1d80af424ca30898ad938029f639edbd361a8043058c92" exitCode=0 Mar 18 13:44:22 crc kubenswrapper[4975]: I0318 13:44:22.988985 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wn9r" event={"ID":"1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1","Type":"ContainerDied","Data":"c3e3176c7b4ff9be8e1d80af424ca30898ad938029f639edbd361a8043058c92"} Mar 18 13:44:25 crc kubenswrapper[4975]: I0318 13:44:25.011405 4975 generic.go:334] "Generic (PLEG): container finished" podID="1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1" containerID="b5d7eb723e04c77eae211970c9c9d83933e8a8a38809eafd8367f9675a862054" exitCode=0 Mar 18 13:44:25 crc kubenswrapper[4975]: I0318 13:44:25.011586 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wn9r" event={"ID":"1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1","Type":"ContainerDied","Data":"b5d7eb723e04c77eae211970c9c9d83933e8a8a38809eafd8367f9675a862054"} Mar 18 13:44:26 crc kubenswrapper[4975]: I0318 13:44:26.021674 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wn9r" event={"ID":"1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1","Type":"ContainerStarted","Data":"bd0de2307b21b5186847d6b61da7a713be5bf83c9f4e0ace1dc09077353e4692"} Mar 18 13:44:26 crc kubenswrapper[4975]: I0318 13:44:26.043713 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9wn9r" podStartSLOduration=3.468676928 podStartE2EDuration="6.043690672s" podCreationTimestamp="2026-03-18 13:44:20 +0000 UTC" firstStartedPulling="2026-03-18 13:44:22.991253925 +0000 
UTC m=+5648.705654504" lastFinishedPulling="2026-03-18 13:44:25.566267669 +0000 UTC m=+5651.280668248" observedRunningTime="2026-03-18 13:44:26.039632932 +0000 UTC m=+5651.754033511" watchObservedRunningTime="2026-03-18 13:44:26.043690672 +0000 UTC m=+5651.758091251" Mar 18 13:44:31 crc kubenswrapper[4975]: I0318 13:44:31.316455 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9wn9r" Mar 18 13:44:31 crc kubenswrapper[4975]: I0318 13:44:31.318646 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9wn9r" Mar 18 13:44:31 crc kubenswrapper[4975]: I0318 13:44:31.369929 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9wn9r" Mar 18 13:44:32 crc kubenswrapper[4975]: I0318 13:44:32.116967 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9wn9r" Mar 18 13:44:32 crc kubenswrapper[4975]: I0318 13:44:32.169662 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9wn9r"] Mar 18 13:44:34 crc kubenswrapper[4975]: I0318 13:44:34.094261 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9wn9r" podUID="1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1" containerName="registry-server" containerID="cri-o://bd0de2307b21b5186847d6b61da7a713be5bf83c9f4e0ace1dc09077353e4692" gracePeriod=2 Mar 18 13:44:34 crc kubenswrapper[4975]: I0318 13:44:34.540544 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9wn9r" Mar 18 13:44:34 crc kubenswrapper[4975]: I0318 13:44:34.687536 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1-catalog-content\") pod \"1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1\" (UID: \"1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1\") " Mar 18 13:44:34 crc kubenswrapper[4975]: I0318 13:44:34.687672 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrc5f\" (UniqueName: \"kubernetes.io/projected/1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1-kube-api-access-vrc5f\") pod \"1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1\" (UID: \"1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1\") " Mar 18 13:44:34 crc kubenswrapper[4975]: I0318 13:44:34.687717 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1-utilities\") pod \"1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1\" (UID: \"1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1\") " Mar 18 13:44:34 crc kubenswrapper[4975]: I0318 13:44:34.688697 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1-utilities" (OuterVolumeSpecName: "utilities") pod "1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1" (UID: "1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:44:34 crc kubenswrapper[4975]: I0318 13:44:34.693699 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1-kube-api-access-vrc5f" (OuterVolumeSpecName: "kube-api-access-vrc5f") pod "1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1" (UID: "1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1"). InnerVolumeSpecName "kube-api-access-vrc5f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:44:34 crc kubenswrapper[4975]: I0318 13:44:34.745540 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1" (UID: "1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:44:34 crc kubenswrapper[4975]: I0318 13:44:34.790443 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:44:34 crc kubenswrapper[4975]: I0318 13:44:34.790498 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrc5f\" (UniqueName: \"kubernetes.io/projected/1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1-kube-api-access-vrc5f\") on node \"crc\" DevicePath \"\"" Mar 18 13:44:34 crc kubenswrapper[4975]: I0318 13:44:34.790512 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:44:35 crc kubenswrapper[4975]: I0318 13:44:35.107947 4975 generic.go:334] "Generic (PLEG): container finished" podID="1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1" containerID="bd0de2307b21b5186847d6b61da7a713be5bf83c9f4e0ace1dc09077353e4692" exitCode=0 Mar 18 13:44:35 crc kubenswrapper[4975]: I0318 13:44:35.108020 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wn9r" event={"ID":"1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1","Type":"ContainerDied","Data":"bd0de2307b21b5186847d6b61da7a713be5bf83c9f4e0ace1dc09077353e4692"} Mar 18 13:44:35 crc kubenswrapper[4975]: I0318 13:44:35.108079 4975 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9wn9r" Mar 18 13:44:35 crc kubenswrapper[4975]: I0318 13:44:35.108149 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wn9r" event={"ID":"1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1","Type":"ContainerDied","Data":"86af98bd3ba70018442e2d452c083ea6354ab9a91a93d1ac232f9ebfeda2ec53"} Mar 18 13:44:35 crc kubenswrapper[4975]: I0318 13:44:35.108181 4975 scope.go:117] "RemoveContainer" containerID="bd0de2307b21b5186847d6b61da7a713be5bf83c9f4e0ace1dc09077353e4692" Mar 18 13:44:35 crc kubenswrapper[4975]: I0318 13:44:35.134107 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9wn9r"] Mar 18 13:44:35 crc kubenswrapper[4975]: I0318 13:44:35.136474 4975 scope.go:117] "RemoveContainer" containerID="b5d7eb723e04c77eae211970c9c9d83933e8a8a38809eafd8367f9675a862054" Mar 18 13:44:35 crc kubenswrapper[4975]: I0318 13:44:35.144490 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9wn9r"] Mar 18 13:44:35 crc kubenswrapper[4975]: I0318 13:44:35.157127 4975 scope.go:117] "RemoveContainer" containerID="c3e3176c7b4ff9be8e1d80af424ca30898ad938029f639edbd361a8043058c92" Mar 18 13:44:35 crc kubenswrapper[4975]: I0318 13:44:35.200838 4975 scope.go:117] "RemoveContainer" containerID="bd0de2307b21b5186847d6b61da7a713be5bf83c9f4e0ace1dc09077353e4692" Mar 18 13:44:35 crc kubenswrapper[4975]: E0318 13:44:35.201584 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd0de2307b21b5186847d6b61da7a713be5bf83c9f4e0ace1dc09077353e4692\": container with ID starting with bd0de2307b21b5186847d6b61da7a713be5bf83c9f4e0ace1dc09077353e4692 not found: ID does not exist" containerID="bd0de2307b21b5186847d6b61da7a713be5bf83c9f4e0ace1dc09077353e4692" Mar 18 13:44:35 crc kubenswrapper[4975]: I0318 13:44:35.201628 
4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd0de2307b21b5186847d6b61da7a713be5bf83c9f4e0ace1dc09077353e4692"} err="failed to get container status \"bd0de2307b21b5186847d6b61da7a713be5bf83c9f4e0ace1dc09077353e4692\": rpc error: code = NotFound desc = could not find container \"bd0de2307b21b5186847d6b61da7a713be5bf83c9f4e0ace1dc09077353e4692\": container with ID starting with bd0de2307b21b5186847d6b61da7a713be5bf83c9f4e0ace1dc09077353e4692 not found: ID does not exist" Mar 18 13:44:35 crc kubenswrapper[4975]: I0318 13:44:35.201656 4975 scope.go:117] "RemoveContainer" containerID="b5d7eb723e04c77eae211970c9c9d83933e8a8a38809eafd8367f9675a862054" Mar 18 13:44:35 crc kubenswrapper[4975]: E0318 13:44:35.202661 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5d7eb723e04c77eae211970c9c9d83933e8a8a38809eafd8367f9675a862054\": container with ID starting with b5d7eb723e04c77eae211970c9c9d83933e8a8a38809eafd8367f9675a862054 not found: ID does not exist" containerID="b5d7eb723e04c77eae211970c9c9d83933e8a8a38809eafd8367f9675a862054" Mar 18 13:44:35 crc kubenswrapper[4975]: I0318 13:44:35.202703 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5d7eb723e04c77eae211970c9c9d83933e8a8a38809eafd8367f9675a862054"} err="failed to get container status \"b5d7eb723e04c77eae211970c9c9d83933e8a8a38809eafd8367f9675a862054\": rpc error: code = NotFound desc = could not find container \"b5d7eb723e04c77eae211970c9c9d83933e8a8a38809eafd8367f9675a862054\": container with ID starting with b5d7eb723e04c77eae211970c9c9d83933e8a8a38809eafd8367f9675a862054 not found: ID does not exist" Mar 18 13:44:35 crc kubenswrapper[4975]: I0318 13:44:35.202733 4975 scope.go:117] "RemoveContainer" containerID="c3e3176c7b4ff9be8e1d80af424ca30898ad938029f639edbd361a8043058c92" Mar 18 13:44:35 crc kubenswrapper[4975]: E0318 
13:44:35.203125 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3e3176c7b4ff9be8e1d80af424ca30898ad938029f639edbd361a8043058c92\": container with ID starting with c3e3176c7b4ff9be8e1d80af424ca30898ad938029f639edbd361a8043058c92 not found: ID does not exist" containerID="c3e3176c7b4ff9be8e1d80af424ca30898ad938029f639edbd361a8043058c92" Mar 18 13:44:35 crc kubenswrapper[4975]: I0318 13:44:35.203156 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3e3176c7b4ff9be8e1d80af424ca30898ad938029f639edbd361a8043058c92"} err="failed to get container status \"c3e3176c7b4ff9be8e1d80af424ca30898ad938029f639edbd361a8043058c92\": rpc error: code = NotFound desc = could not find container \"c3e3176c7b4ff9be8e1d80af424ca30898ad938029f639edbd361a8043058c92\": container with ID starting with c3e3176c7b4ff9be8e1d80af424ca30898ad938029f639edbd361a8043058c92 not found: ID does not exist" Mar 18 13:44:37 crc kubenswrapper[4975]: I0318 13:44:37.027745 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1" path="/var/lib/kubelet/pods/1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1/volumes" Mar 18 13:44:38 crc kubenswrapper[4975]: I0318 13:44:38.138620 4975 generic.go:334] "Generic (PLEG): container finished" podID="c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4" containerID="9e11e0919b24d2816d47f9184abb539d73c7e076ad371ef0023458a28c5769ef" exitCode=0 Mar 18 13:44:38 crc kubenswrapper[4975]: I0318 13:44:38.138695 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zv9b4/must-gather-njdn8" event={"ID":"c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4","Type":"ContainerDied","Data":"9e11e0919b24d2816d47f9184abb539d73c7e076ad371ef0023458a28c5769ef"} Mar 18 13:44:38 crc kubenswrapper[4975]: I0318 13:44:38.139719 4975 scope.go:117] "RemoveContainer" 
containerID="9e11e0919b24d2816d47f9184abb539d73c7e076ad371ef0023458a28c5769ef" Mar 18 13:44:38 crc kubenswrapper[4975]: I0318 13:44:38.590572 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zv9b4_must-gather-njdn8_c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4/gather/0.log" Mar 18 13:44:47 crc kubenswrapper[4975]: I0318 13:44:47.170463 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zv9b4/must-gather-njdn8"] Mar 18 13:44:47 crc kubenswrapper[4975]: I0318 13:44:47.171481 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-zv9b4/must-gather-njdn8" podUID="c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4" containerName="copy" containerID="cri-o://100fcbe7cf8ba8b57e0f9cd5862ddae41cd84b074891bb155a1775335fbc8db8" gracePeriod=2 Mar 18 13:44:47 crc kubenswrapper[4975]: I0318 13:44:47.181160 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zv9b4/must-gather-njdn8"] Mar 18 13:44:47 crc kubenswrapper[4975]: I0318 13:44:47.610463 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zv9b4_must-gather-njdn8_c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4/copy/0.log" Mar 18 13:44:47 crc kubenswrapper[4975]: I0318 13:44:47.611265 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zv9b4/must-gather-njdn8" Mar 18 13:44:47 crc kubenswrapper[4975]: I0318 13:44:47.756249 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4-must-gather-output\") pod \"c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4\" (UID: \"c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4\") " Mar 18 13:44:47 crc kubenswrapper[4975]: I0318 13:44:47.756995 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkwgq\" (UniqueName: \"kubernetes.io/projected/c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4-kube-api-access-fkwgq\") pod \"c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4\" (UID: \"c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4\") " Mar 18 13:44:47 crc kubenswrapper[4975]: I0318 13:44:47.764191 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4-kube-api-access-fkwgq" (OuterVolumeSpecName: "kube-api-access-fkwgq") pod "c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4" (UID: "c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4"). InnerVolumeSpecName "kube-api-access-fkwgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:44:47 crc kubenswrapper[4975]: I0318 13:44:47.860343 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkwgq\" (UniqueName: \"kubernetes.io/projected/c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4-kube-api-access-fkwgq\") on node \"crc\" DevicePath \"\"" Mar 18 13:44:47 crc kubenswrapper[4975]: I0318 13:44:47.948664 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4" (UID: "c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:44:47 crc kubenswrapper[4975]: I0318 13:44:47.961370 4975 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 18 13:44:48 crc kubenswrapper[4975]: I0318 13:44:48.236444 4975 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zv9b4_must-gather-njdn8_c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4/copy/0.log" Mar 18 13:44:48 crc kubenswrapper[4975]: I0318 13:44:48.237368 4975 generic.go:334] "Generic (PLEG): container finished" podID="c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4" containerID="100fcbe7cf8ba8b57e0f9cd5862ddae41cd84b074891bb155a1775335fbc8db8" exitCode=143 Mar 18 13:44:48 crc kubenswrapper[4975]: I0318 13:44:48.237427 4975 scope.go:117] "RemoveContainer" containerID="100fcbe7cf8ba8b57e0f9cd5862ddae41cd84b074891bb155a1775335fbc8db8" Mar 18 13:44:48 crc kubenswrapper[4975]: I0318 13:44:48.237456 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zv9b4/must-gather-njdn8" Mar 18 13:44:48 crc kubenswrapper[4975]: I0318 13:44:48.269828 4975 scope.go:117] "RemoveContainer" containerID="9e11e0919b24d2816d47f9184abb539d73c7e076ad371ef0023458a28c5769ef" Mar 18 13:44:48 crc kubenswrapper[4975]: I0318 13:44:48.373858 4975 scope.go:117] "RemoveContainer" containerID="100fcbe7cf8ba8b57e0f9cd5862ddae41cd84b074891bb155a1775335fbc8db8" Mar 18 13:44:48 crc kubenswrapper[4975]: E0318 13:44:48.374620 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"100fcbe7cf8ba8b57e0f9cd5862ddae41cd84b074891bb155a1775335fbc8db8\": container with ID starting with 100fcbe7cf8ba8b57e0f9cd5862ddae41cd84b074891bb155a1775335fbc8db8 not found: ID does not exist" containerID="100fcbe7cf8ba8b57e0f9cd5862ddae41cd84b074891bb155a1775335fbc8db8" Mar 18 13:44:48 crc kubenswrapper[4975]: I0318 13:44:48.374736 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"100fcbe7cf8ba8b57e0f9cd5862ddae41cd84b074891bb155a1775335fbc8db8"} err="failed to get container status \"100fcbe7cf8ba8b57e0f9cd5862ddae41cd84b074891bb155a1775335fbc8db8\": rpc error: code = NotFound desc = could not find container \"100fcbe7cf8ba8b57e0f9cd5862ddae41cd84b074891bb155a1775335fbc8db8\": container with ID starting with 100fcbe7cf8ba8b57e0f9cd5862ddae41cd84b074891bb155a1775335fbc8db8 not found: ID does not exist" Mar 18 13:44:48 crc kubenswrapper[4975]: I0318 13:44:48.374891 4975 scope.go:117] "RemoveContainer" containerID="9e11e0919b24d2816d47f9184abb539d73c7e076ad371ef0023458a28c5769ef" Mar 18 13:44:48 crc kubenswrapper[4975]: E0318 13:44:48.376009 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e11e0919b24d2816d47f9184abb539d73c7e076ad371ef0023458a28c5769ef\": container with ID starting with 
9e11e0919b24d2816d47f9184abb539d73c7e076ad371ef0023458a28c5769ef not found: ID does not exist" containerID="9e11e0919b24d2816d47f9184abb539d73c7e076ad371ef0023458a28c5769ef" Mar 18 13:44:48 crc kubenswrapper[4975]: I0318 13:44:48.376105 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e11e0919b24d2816d47f9184abb539d73c7e076ad371ef0023458a28c5769ef"} err="failed to get container status \"9e11e0919b24d2816d47f9184abb539d73c7e076ad371ef0023458a28c5769ef\": rpc error: code = NotFound desc = could not find container \"9e11e0919b24d2816d47f9184abb539d73c7e076ad371ef0023458a28c5769ef\": container with ID starting with 9e11e0919b24d2816d47f9184abb539d73c7e076ad371ef0023458a28c5769ef not found: ID does not exist" Mar 18 13:44:49 crc kubenswrapper[4975]: I0318 13:44:49.034783 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4" path="/var/lib/kubelet/pods/c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4/volumes" Mar 18 13:44:54 crc kubenswrapper[4975]: I0318 13:44:54.570933 4975 scope.go:117] "RemoveContainer" containerID="688923cd39a233f335ad5d00f585bd1c4e58d71015c82e2a7b683ac73969ffbd" Mar 18 13:44:54 crc kubenswrapper[4975]: I0318 13:44:54.595677 4975 scope.go:117] "RemoveContainer" containerID="f0d80b42771e33fac4c6daeb9c7eea4caf5c8d466617b289ae41f4b50eb9a5c6" Mar 18 13:44:55 crc kubenswrapper[4975]: I0318 13:44:55.538490 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:44:55 crc kubenswrapper[4975]: I0318 13:44:55.538878 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:45:00 crc kubenswrapper[4975]: I0318 13:45:00.163464 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564025-vmcgw"] Mar 18 13:45:00 crc kubenswrapper[4975]: E0318 13:45:00.170468 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1" containerName="extract-utilities" Mar 18 13:45:00 crc kubenswrapper[4975]: I0318 13:45:00.170669 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1" containerName="extract-utilities" Mar 18 13:45:00 crc kubenswrapper[4975]: E0318 13:45:00.170749 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4" containerName="copy" Mar 18 13:45:00 crc kubenswrapper[4975]: I0318 13:45:00.170801 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4" containerName="copy" Mar 18 13:45:00 crc kubenswrapper[4975]: E0318 13:45:00.170924 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4" containerName="gather" Mar 18 13:45:00 crc kubenswrapper[4975]: I0318 13:45:00.170999 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4" containerName="gather" Mar 18 13:45:00 crc kubenswrapper[4975]: E0318 13:45:00.171070 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1" containerName="registry-server" Mar 18 13:45:00 crc kubenswrapper[4975]: I0318 13:45:00.171124 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1" containerName="registry-server" Mar 18 13:45:00 crc kubenswrapper[4975]: E0318 13:45:00.171194 4975 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1" containerName="extract-content" Mar 18 13:45:00 crc kubenswrapper[4975]: I0318 13:45:00.171249 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1" containerName="extract-content" Mar 18 13:45:00 crc kubenswrapper[4975]: I0318 13:45:00.171571 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dcf780a-a93d-4dac-bd09-eddc1f5d4bd1" containerName="registry-server" Mar 18 13:45:00 crc kubenswrapper[4975]: I0318 13:45:00.171666 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4" containerName="gather" Mar 18 13:45:00 crc kubenswrapper[4975]: I0318 13:45:00.171730 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="c18d2ec0-a5e6-4bd0-ad6b-bfce71f101b4" containerName="copy" Mar 18 13:45:00 crc kubenswrapper[4975]: I0318 13:45:00.172658 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-vmcgw" Mar 18 13:45:00 crc kubenswrapper[4975]: I0318 13:45:00.175485 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 13:45:00 crc kubenswrapper[4975]: I0318 13:45:00.175577 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 13:45:00 crc kubenswrapper[4975]: I0318 13:45:00.182745 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564025-vmcgw"] Mar 18 13:45:00 crc kubenswrapper[4975]: I0318 13:45:00.296263 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s48x7\" (UniqueName: \"kubernetes.io/projected/2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0-kube-api-access-s48x7\") pod \"collect-profiles-29564025-vmcgw\" 
(UID: \"2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-vmcgw" Mar 18 13:45:00 crc kubenswrapper[4975]: I0318 13:45:00.296947 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0-secret-volume\") pod \"collect-profiles-29564025-vmcgw\" (UID: \"2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-vmcgw" Mar 18 13:45:00 crc kubenswrapper[4975]: I0318 13:45:00.297311 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0-config-volume\") pod \"collect-profiles-29564025-vmcgw\" (UID: \"2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-vmcgw" Mar 18 13:45:00 crc kubenswrapper[4975]: I0318 13:45:00.399507 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s48x7\" (UniqueName: \"kubernetes.io/projected/2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0-kube-api-access-s48x7\") pod \"collect-profiles-29564025-vmcgw\" (UID: \"2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-vmcgw" Mar 18 13:45:00 crc kubenswrapper[4975]: I0318 13:45:00.399600 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0-secret-volume\") pod \"collect-profiles-29564025-vmcgw\" (UID: \"2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-vmcgw" Mar 18 13:45:00 crc kubenswrapper[4975]: I0318 13:45:00.399677 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0-config-volume\") pod \"collect-profiles-29564025-vmcgw\" (UID: \"2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-vmcgw" Mar 18 13:45:00 crc kubenswrapper[4975]: I0318 13:45:00.400664 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0-config-volume\") pod \"collect-profiles-29564025-vmcgw\" (UID: \"2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-vmcgw" Mar 18 13:45:00 crc kubenswrapper[4975]: I0318 13:45:00.407148 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0-secret-volume\") pod \"collect-profiles-29564025-vmcgw\" (UID: \"2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-vmcgw" Mar 18 13:45:00 crc kubenswrapper[4975]: I0318 13:45:00.417910 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s48x7\" (UniqueName: \"kubernetes.io/projected/2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0-kube-api-access-s48x7\") pod \"collect-profiles-29564025-vmcgw\" (UID: \"2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-vmcgw" Mar 18 13:45:00 crc kubenswrapper[4975]: I0318 13:45:00.494687 4975 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-vmcgw" Mar 18 13:45:00 crc kubenswrapper[4975]: I0318 13:45:00.961559 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564025-vmcgw"] Mar 18 13:45:01 crc kubenswrapper[4975]: I0318 13:45:01.384770 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-vmcgw" event={"ID":"2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0","Type":"ContainerStarted","Data":"c4baf3db98d3abe5c0ccfd26bedcdaf22719a81b1be1506ac3ede2ae07ff3d11"} Mar 18 13:45:01 crc kubenswrapper[4975]: I0318 13:45:01.385123 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-vmcgw" event={"ID":"2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0","Type":"ContainerStarted","Data":"324ed01038f5fc2e26ff94199155817edbfb33dbd875a3fada515c535cb72844"} Mar 18 13:45:01 crc kubenswrapper[4975]: I0318 13:45:01.409069 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-vmcgw" podStartSLOduration=1.409040277 podStartE2EDuration="1.409040277s" podCreationTimestamp="2026-03-18 13:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 13:45:01.404605607 +0000 UTC m=+5687.119006196" watchObservedRunningTime="2026-03-18 13:45:01.409040277 +0000 UTC m=+5687.123440886" Mar 18 13:45:02 crc kubenswrapper[4975]: I0318 13:45:02.396277 4975 generic.go:334] "Generic (PLEG): container finished" podID="2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0" containerID="c4baf3db98d3abe5c0ccfd26bedcdaf22719a81b1be1506ac3ede2ae07ff3d11" exitCode=0 Mar 18 13:45:02 crc kubenswrapper[4975]: I0318 13:45:02.396363 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-vmcgw" event={"ID":"2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0","Type":"ContainerDied","Data":"c4baf3db98d3abe5c0ccfd26bedcdaf22719a81b1be1506ac3ede2ae07ff3d11"} Mar 18 13:45:03 crc kubenswrapper[4975]: I0318 13:45:03.714784 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-vmcgw" Mar 18 13:45:03 crc kubenswrapper[4975]: I0318 13:45:03.870217 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0-secret-volume\") pod \"2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0\" (UID: \"2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0\") " Mar 18 13:45:03 crc kubenswrapper[4975]: I0318 13:45:03.870389 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s48x7\" (UniqueName: \"kubernetes.io/projected/2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0-kube-api-access-s48x7\") pod \"2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0\" (UID: \"2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0\") " Mar 18 13:45:03 crc kubenswrapper[4975]: I0318 13:45:03.870422 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0-config-volume\") pod \"2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0\" (UID: \"2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0\") " Mar 18 13:45:03 crc kubenswrapper[4975]: I0318 13:45:03.871317 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0-config-volume" (OuterVolumeSpecName: "config-volume") pod "2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0" (UID: "2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 13:45:03 crc kubenswrapper[4975]: I0318 13:45:03.877473 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0-kube-api-access-s48x7" (OuterVolumeSpecName: "kube-api-access-s48x7") pod "2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0" (UID: "2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0"). InnerVolumeSpecName "kube-api-access-s48x7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:45:03 crc kubenswrapper[4975]: I0318 13:45:03.877722 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0" (UID: "2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 13:45:03 crc kubenswrapper[4975]: I0318 13:45:03.972676 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s48x7\" (UniqueName: \"kubernetes.io/projected/2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0-kube-api-access-s48x7\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:03 crc kubenswrapper[4975]: I0318 13:45:03.972721 4975 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:03 crc kubenswrapper[4975]: I0318 13:45:03.972733 4975 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:04 crc kubenswrapper[4975]: I0318 13:45:04.415022 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-vmcgw" 
event={"ID":"2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0","Type":"ContainerDied","Data":"324ed01038f5fc2e26ff94199155817edbfb33dbd875a3fada515c535cb72844"} Mar 18 13:45:04 crc kubenswrapper[4975]: I0318 13:45:04.415082 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="324ed01038f5fc2e26ff94199155817edbfb33dbd875a3fada515c535cb72844" Mar 18 13:45:04 crc kubenswrapper[4975]: I0318 13:45:04.415056 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564025-vmcgw" Mar 18 13:45:04 crc kubenswrapper[4975]: I0318 13:45:04.477348 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563980-485pp"] Mar 18 13:45:04 crc kubenswrapper[4975]: I0318 13:45:04.485583 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563980-485pp"] Mar 18 13:45:05 crc kubenswrapper[4975]: I0318 13:45:05.039850 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35843f4f-db77-4a4b-9a62-173a4884d774" path="/var/lib/kubelet/pods/35843f4f-db77-4a4b-9a62-173a4884d774/volumes" Mar 18 13:45:25 crc kubenswrapper[4975]: I0318 13:45:25.539012 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:45:25 crc kubenswrapper[4975]: I0318 13:45:25.539612 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:45:26 crc 
kubenswrapper[4975]: I0318 13:45:26.090316 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9hlzf"] Mar 18 13:45:26 crc kubenswrapper[4975]: E0318 13:45:26.090914 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0" containerName="collect-profiles" Mar 18 13:45:26 crc kubenswrapper[4975]: I0318 13:45:26.090938 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0" containerName="collect-profiles" Mar 18 13:45:26 crc kubenswrapper[4975]: I0318 13:45:26.091184 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c6ca486-0cfc-4cc0-b7ec-a64c1c997df0" containerName="collect-profiles" Mar 18 13:45:26 crc kubenswrapper[4975]: I0318 13:45:26.092967 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9hlzf" Mar 18 13:45:26 crc kubenswrapper[4975]: I0318 13:45:26.102253 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9hlzf"] Mar 18 13:45:26 crc kubenswrapper[4975]: I0318 13:45:26.184132 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e7edac4-8c9c-425c-8e2b-d7a70a70bf25-utilities\") pod \"redhat-operators-9hlzf\" (UID: \"8e7edac4-8c9c-425c-8e2b-d7a70a70bf25\") " pod="openshift-marketplace/redhat-operators-9hlzf" Mar 18 13:45:26 crc kubenswrapper[4975]: I0318 13:45:26.184203 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e7edac4-8c9c-425c-8e2b-d7a70a70bf25-catalog-content\") pod \"redhat-operators-9hlzf\" (UID: \"8e7edac4-8c9c-425c-8e2b-d7a70a70bf25\") " pod="openshift-marketplace/redhat-operators-9hlzf" Mar 18 13:45:26 crc kubenswrapper[4975]: I0318 13:45:26.184288 4975 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcnsj\" (UniqueName: \"kubernetes.io/projected/8e7edac4-8c9c-425c-8e2b-d7a70a70bf25-kube-api-access-pcnsj\") pod \"redhat-operators-9hlzf\" (UID: \"8e7edac4-8c9c-425c-8e2b-d7a70a70bf25\") " pod="openshift-marketplace/redhat-operators-9hlzf" Mar 18 13:45:26 crc kubenswrapper[4975]: I0318 13:45:26.286098 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e7edac4-8c9c-425c-8e2b-d7a70a70bf25-utilities\") pod \"redhat-operators-9hlzf\" (UID: \"8e7edac4-8c9c-425c-8e2b-d7a70a70bf25\") " pod="openshift-marketplace/redhat-operators-9hlzf" Mar 18 13:45:26 crc kubenswrapper[4975]: I0318 13:45:26.286152 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e7edac4-8c9c-425c-8e2b-d7a70a70bf25-catalog-content\") pod \"redhat-operators-9hlzf\" (UID: \"8e7edac4-8c9c-425c-8e2b-d7a70a70bf25\") " pod="openshift-marketplace/redhat-operators-9hlzf" Mar 18 13:45:26 crc kubenswrapper[4975]: I0318 13:45:26.286217 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcnsj\" (UniqueName: \"kubernetes.io/projected/8e7edac4-8c9c-425c-8e2b-d7a70a70bf25-kube-api-access-pcnsj\") pod \"redhat-operators-9hlzf\" (UID: \"8e7edac4-8c9c-425c-8e2b-d7a70a70bf25\") " pod="openshift-marketplace/redhat-operators-9hlzf" Mar 18 13:45:26 crc kubenswrapper[4975]: I0318 13:45:26.286727 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e7edac4-8c9c-425c-8e2b-d7a70a70bf25-utilities\") pod \"redhat-operators-9hlzf\" (UID: \"8e7edac4-8c9c-425c-8e2b-d7a70a70bf25\") " pod="openshift-marketplace/redhat-operators-9hlzf" Mar 18 13:45:26 crc kubenswrapper[4975]: I0318 13:45:26.286977 4975 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e7edac4-8c9c-425c-8e2b-d7a70a70bf25-catalog-content\") pod \"redhat-operators-9hlzf\" (UID: \"8e7edac4-8c9c-425c-8e2b-d7a70a70bf25\") " pod="openshift-marketplace/redhat-operators-9hlzf" Mar 18 13:45:26 crc kubenswrapper[4975]: I0318 13:45:26.308741 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcnsj\" (UniqueName: \"kubernetes.io/projected/8e7edac4-8c9c-425c-8e2b-d7a70a70bf25-kube-api-access-pcnsj\") pod \"redhat-operators-9hlzf\" (UID: \"8e7edac4-8c9c-425c-8e2b-d7a70a70bf25\") " pod="openshift-marketplace/redhat-operators-9hlzf" Mar 18 13:45:26 crc kubenswrapper[4975]: I0318 13:45:26.413536 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9hlzf" Mar 18 13:45:26 crc kubenswrapper[4975]: I0318 13:45:26.917232 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9hlzf"] Mar 18 13:45:27 crc kubenswrapper[4975]: I0318 13:45:27.629617 4975 generic.go:334] "Generic (PLEG): container finished" podID="8e7edac4-8c9c-425c-8e2b-d7a70a70bf25" containerID="7a600e23d217845659151687d939c8a1569ed19be829f13db76b56fdcf4e6507" exitCode=0 Mar 18 13:45:27 crc kubenswrapper[4975]: I0318 13:45:27.629712 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9hlzf" event={"ID":"8e7edac4-8c9c-425c-8e2b-d7a70a70bf25","Type":"ContainerDied","Data":"7a600e23d217845659151687d939c8a1569ed19be829f13db76b56fdcf4e6507"} Mar 18 13:45:27 crc kubenswrapper[4975]: I0318 13:45:27.630331 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9hlzf" event={"ID":"8e7edac4-8c9c-425c-8e2b-d7a70a70bf25","Type":"ContainerStarted","Data":"f6c423e25fc50a933bbc94234bec4718d870d38f9d6ee5ad69ad9720dc9e39d3"} Mar 18 13:45:29 crc kubenswrapper[4975]: I0318 13:45:29.647256 
4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9hlzf" event={"ID":"8e7edac4-8c9c-425c-8e2b-d7a70a70bf25","Type":"ContainerStarted","Data":"8062a3688f214ff55e76b7b01e7e3683429b595d28eddc2213e9673d0ee313b2"} Mar 18 13:45:30 crc kubenswrapper[4975]: I0318 13:45:30.659905 4975 generic.go:334] "Generic (PLEG): container finished" podID="8e7edac4-8c9c-425c-8e2b-d7a70a70bf25" containerID="8062a3688f214ff55e76b7b01e7e3683429b595d28eddc2213e9673d0ee313b2" exitCode=0 Mar 18 13:45:30 crc kubenswrapper[4975]: I0318 13:45:30.659948 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9hlzf" event={"ID":"8e7edac4-8c9c-425c-8e2b-d7a70a70bf25","Type":"ContainerDied","Data":"8062a3688f214ff55e76b7b01e7e3683429b595d28eddc2213e9673d0ee313b2"} Mar 18 13:45:31 crc kubenswrapper[4975]: I0318 13:45:31.674297 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9hlzf" event={"ID":"8e7edac4-8c9c-425c-8e2b-d7a70a70bf25","Type":"ContainerStarted","Data":"75a1343a67337d071f5b7a95a9f6ecfcb582cc88180be1a7390624deeba6f784"} Mar 18 13:45:31 crc kubenswrapper[4975]: I0318 13:45:31.698671 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9hlzf" podStartSLOduration=2.200076041 podStartE2EDuration="5.698635806s" podCreationTimestamp="2026-03-18 13:45:26 +0000 UTC" firstStartedPulling="2026-03-18 13:45:27.632566353 +0000 UTC m=+5713.346966932" lastFinishedPulling="2026-03-18 13:45:31.131126118 +0000 UTC m=+5716.845526697" observedRunningTime="2026-03-18 13:45:31.690008743 +0000 UTC m=+5717.404409322" watchObservedRunningTime="2026-03-18 13:45:31.698635806 +0000 UTC m=+5717.413036385" Mar 18 13:45:36 crc kubenswrapper[4975]: I0318 13:45:36.414445 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9hlzf" Mar 18 13:45:36 crc 
kubenswrapper[4975]: I0318 13:45:36.415049 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9hlzf" Mar 18 13:45:37 crc kubenswrapper[4975]: I0318 13:45:37.460644 4975 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9hlzf" podUID="8e7edac4-8c9c-425c-8e2b-d7a70a70bf25" containerName="registry-server" probeResult="failure" output=< Mar 18 13:45:37 crc kubenswrapper[4975]: timeout: failed to connect service ":50051" within 1s Mar 18 13:45:37 crc kubenswrapper[4975]: > Mar 18 13:45:46 crc kubenswrapper[4975]: I0318 13:45:46.459611 4975 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9hlzf" Mar 18 13:45:46 crc kubenswrapper[4975]: I0318 13:45:46.508521 4975 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9hlzf" Mar 18 13:45:46 crc kubenswrapper[4975]: I0318 13:45:46.697243 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9hlzf"] Mar 18 13:45:47 crc kubenswrapper[4975]: I0318 13:45:47.820469 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9hlzf" podUID="8e7edac4-8c9c-425c-8e2b-d7a70a70bf25" containerName="registry-server" containerID="cri-o://75a1343a67337d071f5b7a95a9f6ecfcb582cc88180be1a7390624deeba6f784" gracePeriod=2 Mar 18 13:45:48 crc kubenswrapper[4975]: I0318 13:45:48.271660 4975 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9hlzf" Mar 18 13:45:48 crc kubenswrapper[4975]: I0318 13:45:48.323844 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e7edac4-8c9c-425c-8e2b-d7a70a70bf25-utilities\") pod \"8e7edac4-8c9c-425c-8e2b-d7a70a70bf25\" (UID: \"8e7edac4-8c9c-425c-8e2b-d7a70a70bf25\") " Mar 18 13:45:48 crc kubenswrapper[4975]: I0318 13:45:48.324014 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e7edac4-8c9c-425c-8e2b-d7a70a70bf25-catalog-content\") pod \"8e7edac4-8c9c-425c-8e2b-d7a70a70bf25\" (UID: \"8e7edac4-8c9c-425c-8e2b-d7a70a70bf25\") " Mar 18 13:45:48 crc kubenswrapper[4975]: I0318 13:45:48.324194 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcnsj\" (UniqueName: \"kubernetes.io/projected/8e7edac4-8c9c-425c-8e2b-d7a70a70bf25-kube-api-access-pcnsj\") pod \"8e7edac4-8c9c-425c-8e2b-d7a70a70bf25\" (UID: \"8e7edac4-8c9c-425c-8e2b-d7a70a70bf25\") " Mar 18 13:45:48 crc kubenswrapper[4975]: I0318 13:45:48.325063 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e7edac4-8c9c-425c-8e2b-d7a70a70bf25-utilities" (OuterVolumeSpecName: "utilities") pod "8e7edac4-8c9c-425c-8e2b-d7a70a70bf25" (UID: "8e7edac4-8c9c-425c-8e2b-d7a70a70bf25"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:45:48 crc kubenswrapper[4975]: I0318 13:45:48.329992 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e7edac4-8c9c-425c-8e2b-d7a70a70bf25-kube-api-access-pcnsj" (OuterVolumeSpecName: "kube-api-access-pcnsj") pod "8e7edac4-8c9c-425c-8e2b-d7a70a70bf25" (UID: "8e7edac4-8c9c-425c-8e2b-d7a70a70bf25"). InnerVolumeSpecName "kube-api-access-pcnsj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 13:45:48 crc kubenswrapper[4975]: I0318 13:45:48.426301 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcnsj\" (UniqueName: \"kubernetes.io/projected/8e7edac4-8c9c-425c-8e2b-d7a70a70bf25-kube-api-access-pcnsj\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:48 crc kubenswrapper[4975]: I0318 13:45:48.426330 4975 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e7edac4-8c9c-425c-8e2b-d7a70a70bf25-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:48 crc kubenswrapper[4975]: I0318 13:45:48.462014 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e7edac4-8c9c-425c-8e2b-d7a70a70bf25-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e7edac4-8c9c-425c-8e2b-d7a70a70bf25" (UID: "8e7edac4-8c9c-425c-8e2b-d7a70a70bf25"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 13:45:48 crc kubenswrapper[4975]: I0318 13:45:48.528105 4975 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e7edac4-8c9c-425c-8e2b-d7a70a70bf25-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 13:45:48 crc kubenswrapper[4975]: I0318 13:45:48.830173 4975 generic.go:334] "Generic (PLEG): container finished" podID="8e7edac4-8c9c-425c-8e2b-d7a70a70bf25" containerID="75a1343a67337d071f5b7a95a9f6ecfcb582cc88180be1a7390624deeba6f784" exitCode=0 Mar 18 13:45:48 crc kubenswrapper[4975]: I0318 13:45:48.830239 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9hlzf" event={"ID":"8e7edac4-8c9c-425c-8e2b-d7a70a70bf25","Type":"ContainerDied","Data":"75a1343a67337d071f5b7a95a9f6ecfcb582cc88180be1a7390624deeba6f784"} Mar 18 13:45:48 crc kubenswrapper[4975]: I0318 13:45:48.830300 4975 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-9hlzf" event={"ID":"8e7edac4-8c9c-425c-8e2b-d7a70a70bf25","Type":"ContainerDied","Data":"f6c423e25fc50a933bbc94234bec4718d870d38f9d6ee5ad69ad9720dc9e39d3"} Mar 18 13:45:48 crc kubenswrapper[4975]: I0318 13:45:48.830253 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9hlzf" Mar 18 13:45:48 crc kubenswrapper[4975]: I0318 13:45:48.830359 4975 scope.go:117] "RemoveContainer" containerID="75a1343a67337d071f5b7a95a9f6ecfcb582cc88180be1a7390624deeba6f784" Mar 18 13:45:48 crc kubenswrapper[4975]: I0318 13:45:48.855081 4975 scope.go:117] "RemoveContainer" containerID="8062a3688f214ff55e76b7b01e7e3683429b595d28eddc2213e9673d0ee313b2" Mar 18 13:45:48 crc kubenswrapper[4975]: I0318 13:45:48.875829 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9hlzf"] Mar 18 13:45:48 crc kubenswrapper[4975]: I0318 13:45:48.885541 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9hlzf"] Mar 18 13:45:48 crc kubenswrapper[4975]: I0318 13:45:48.899063 4975 scope.go:117] "RemoveContainer" containerID="7a600e23d217845659151687d939c8a1569ed19be829f13db76b56fdcf4e6507" Mar 18 13:45:48 crc kubenswrapper[4975]: I0318 13:45:48.933931 4975 scope.go:117] "RemoveContainer" containerID="75a1343a67337d071f5b7a95a9f6ecfcb582cc88180be1a7390624deeba6f784" Mar 18 13:45:48 crc kubenswrapper[4975]: E0318 13:45:48.934291 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75a1343a67337d071f5b7a95a9f6ecfcb582cc88180be1a7390624deeba6f784\": container with ID starting with 75a1343a67337d071f5b7a95a9f6ecfcb582cc88180be1a7390624deeba6f784 not found: ID does not exist" containerID="75a1343a67337d071f5b7a95a9f6ecfcb582cc88180be1a7390624deeba6f784" Mar 18 13:45:48 crc kubenswrapper[4975]: I0318 13:45:48.934324 4975 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75a1343a67337d071f5b7a95a9f6ecfcb582cc88180be1a7390624deeba6f784"} err="failed to get container status \"75a1343a67337d071f5b7a95a9f6ecfcb582cc88180be1a7390624deeba6f784\": rpc error: code = NotFound desc = could not find container \"75a1343a67337d071f5b7a95a9f6ecfcb582cc88180be1a7390624deeba6f784\": container with ID starting with 75a1343a67337d071f5b7a95a9f6ecfcb582cc88180be1a7390624deeba6f784 not found: ID does not exist" Mar 18 13:45:48 crc kubenswrapper[4975]: I0318 13:45:48.934344 4975 scope.go:117] "RemoveContainer" containerID="8062a3688f214ff55e76b7b01e7e3683429b595d28eddc2213e9673d0ee313b2" Mar 18 13:45:48 crc kubenswrapper[4975]: E0318 13:45:48.934555 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8062a3688f214ff55e76b7b01e7e3683429b595d28eddc2213e9673d0ee313b2\": container with ID starting with 8062a3688f214ff55e76b7b01e7e3683429b595d28eddc2213e9673d0ee313b2 not found: ID does not exist" containerID="8062a3688f214ff55e76b7b01e7e3683429b595d28eddc2213e9673d0ee313b2" Mar 18 13:45:48 crc kubenswrapper[4975]: I0318 13:45:48.934594 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8062a3688f214ff55e76b7b01e7e3683429b595d28eddc2213e9673d0ee313b2"} err="failed to get container status \"8062a3688f214ff55e76b7b01e7e3683429b595d28eddc2213e9673d0ee313b2\": rpc error: code = NotFound desc = could not find container \"8062a3688f214ff55e76b7b01e7e3683429b595d28eddc2213e9673d0ee313b2\": container with ID starting with 8062a3688f214ff55e76b7b01e7e3683429b595d28eddc2213e9673d0ee313b2 not found: ID does not exist" Mar 18 13:45:48 crc kubenswrapper[4975]: I0318 13:45:48.934620 4975 scope.go:117] "RemoveContainer" containerID="7a600e23d217845659151687d939c8a1569ed19be829f13db76b56fdcf4e6507" Mar 18 13:45:48 crc kubenswrapper[4975]: E0318 
13:45:48.934823 4975 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a600e23d217845659151687d939c8a1569ed19be829f13db76b56fdcf4e6507\": container with ID starting with 7a600e23d217845659151687d939c8a1569ed19be829f13db76b56fdcf4e6507 not found: ID does not exist" containerID="7a600e23d217845659151687d939c8a1569ed19be829f13db76b56fdcf4e6507" Mar 18 13:45:48 crc kubenswrapper[4975]: I0318 13:45:48.934845 4975 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a600e23d217845659151687d939c8a1569ed19be829f13db76b56fdcf4e6507"} err="failed to get container status \"7a600e23d217845659151687d939c8a1569ed19be829f13db76b56fdcf4e6507\": rpc error: code = NotFound desc = could not find container \"7a600e23d217845659151687d939c8a1569ed19be829f13db76b56fdcf4e6507\": container with ID starting with 7a600e23d217845659151687d939c8a1569ed19be829f13db76b56fdcf4e6507 not found: ID does not exist" Mar 18 13:45:48 crc kubenswrapper[4975]: E0318 13:45:48.936948 4975 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e7edac4_8c9c_425c_8e2b_d7a70a70bf25.slice\": RecentStats: unable to find data in memory cache]" Mar 18 13:45:49 crc kubenswrapper[4975]: I0318 13:45:49.027263 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e7edac4-8c9c-425c-8e2b-d7a70a70bf25" path="/var/lib/kubelet/pods/8e7edac4-8c9c-425c-8e2b-d7a70a70bf25/volumes" Mar 18 13:45:54 crc kubenswrapper[4975]: I0318 13:45:54.746096 4975 scope.go:117] "RemoveContainer" containerID="00375d6e0b9af6a4bf1ca4fafc2527b475fa0ed8cf7c2c036829780dc71f5136" Mar 18 13:45:55 crc kubenswrapper[4975]: I0318 13:45:55.539030 4975 patch_prober.go:28] interesting pod/machine-config-daemon-kvdzt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 13:45:55 crc kubenswrapper[4975]: I0318 13:45:55.539437 4975 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 13:45:55 crc kubenswrapper[4975]: I0318 13:45:55.539485 4975 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" Mar 18 13:45:55 crc kubenswrapper[4975]: I0318 13:45:55.540235 4975 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"68fc7653133e311bce7ebf812edb4ed2f2ea161ea3b49916bf5f5dd47e2257b6"} pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 13:45:55 crc kubenswrapper[4975]: I0318 13:45:55.540305 4975 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerName="machine-config-daemon" containerID="cri-o://68fc7653133e311bce7ebf812edb4ed2f2ea161ea3b49916bf5f5dd47e2257b6" gracePeriod=600 Mar 18 13:45:55 crc kubenswrapper[4975]: E0318 13:45:55.728901 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:45:55 crc kubenswrapper[4975]: I0318 13:45:55.912493 4975 generic.go:334] "Generic (PLEG): container finished" podID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" containerID="68fc7653133e311bce7ebf812edb4ed2f2ea161ea3b49916bf5f5dd47e2257b6" exitCode=0 Mar 18 13:45:55 crc kubenswrapper[4975]: I0318 13:45:55.912581 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" event={"ID":"59dd8f35-75c5-42d7-b11a-06586d1d5a1b","Type":"ContainerDied","Data":"68fc7653133e311bce7ebf812edb4ed2f2ea161ea3b49916bf5f5dd47e2257b6"} Mar 18 13:45:55 crc kubenswrapper[4975]: I0318 13:45:55.914015 4975 scope.go:117] "RemoveContainer" containerID="e4090f30bdc2c45c39660d9b590c2d973aeca4e258919be7c997a1fe3dc3e461" Mar 18 13:45:55 crc kubenswrapper[4975]: I0318 13:45:55.914797 4975 scope.go:117] "RemoveContainer" containerID="68fc7653133e311bce7ebf812edb4ed2f2ea161ea3b49916bf5f5dd47e2257b6" Mar 18 13:45:55 crc kubenswrapper[4975]: E0318 13:45:55.915239 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b" Mar 18 13:46:00 crc kubenswrapper[4975]: I0318 13:46:00.151579 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564026-srzj5"] Mar 18 13:46:00 crc kubenswrapper[4975]: E0318 13:46:00.154143 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e7edac4-8c9c-425c-8e2b-d7a70a70bf25" containerName="extract-utilities" Mar 18 13:46:00 crc kubenswrapper[4975]: I0318 13:46:00.154239 
4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e7edac4-8c9c-425c-8e2b-d7a70a70bf25" containerName="extract-utilities"
Mar 18 13:46:00 crc kubenswrapper[4975]: E0318 13:46:00.154305 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e7edac4-8c9c-425c-8e2b-d7a70a70bf25" containerName="registry-server"
Mar 18 13:46:00 crc kubenswrapper[4975]: I0318 13:46:00.154359 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e7edac4-8c9c-425c-8e2b-d7a70a70bf25" containerName="registry-server"
Mar 18 13:46:00 crc kubenswrapper[4975]: E0318 13:46:00.154438 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e7edac4-8c9c-425c-8e2b-d7a70a70bf25" containerName="extract-content"
Mar 18 13:46:00 crc kubenswrapper[4975]: I0318 13:46:00.154502 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e7edac4-8c9c-425c-8e2b-d7a70a70bf25" containerName="extract-content"
Mar 18 13:46:00 crc kubenswrapper[4975]: I0318 13:46:00.154760 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e7edac4-8c9c-425c-8e2b-d7a70a70bf25" containerName="registry-server"
Mar 18 13:46:00 crc kubenswrapper[4975]: I0318 13:46:00.155515 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564026-srzj5"
Mar 18 13:46:00 crc kubenswrapper[4975]: I0318 13:46:00.159233 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 13:46:00 crc kubenswrapper[4975]: I0318 13:46:00.159639 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 13:46:00 crc kubenswrapper[4975]: I0318 13:46:00.159648 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8"
Mar 18 13:46:00 crc kubenswrapper[4975]: I0318 13:46:00.163028 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564026-srzj5"]
Mar 18 13:46:00 crc kubenswrapper[4975]: I0318 13:46:00.263625 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrfs9\" (UniqueName: \"kubernetes.io/projected/329a8d0d-94f5-41d9-8459-69aca636869d-kube-api-access-wrfs9\") pod \"auto-csr-approver-29564026-srzj5\" (UID: \"329a8d0d-94f5-41d9-8459-69aca636869d\") " pod="openshift-infra/auto-csr-approver-29564026-srzj5"
Mar 18 13:46:00 crc kubenswrapper[4975]: I0318 13:46:00.368093 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrfs9\" (UniqueName: \"kubernetes.io/projected/329a8d0d-94f5-41d9-8459-69aca636869d-kube-api-access-wrfs9\") pod \"auto-csr-approver-29564026-srzj5\" (UID: \"329a8d0d-94f5-41d9-8459-69aca636869d\") " pod="openshift-infra/auto-csr-approver-29564026-srzj5"
Mar 18 13:46:00 crc kubenswrapper[4975]: I0318 13:46:00.386427 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrfs9\" (UniqueName: \"kubernetes.io/projected/329a8d0d-94f5-41d9-8459-69aca636869d-kube-api-access-wrfs9\") pod \"auto-csr-approver-29564026-srzj5\" (UID: \"329a8d0d-94f5-41d9-8459-69aca636869d\") " pod="openshift-infra/auto-csr-approver-29564026-srzj5"
Mar 18 13:46:00 crc kubenswrapper[4975]: I0318 13:46:00.475533 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564026-srzj5"
Mar 18 13:46:00 crc kubenswrapper[4975]: I0318 13:46:00.949664 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564026-srzj5"]
Mar 18 13:46:01 crc kubenswrapper[4975]: I0318 13:46:01.967372 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564026-srzj5" event={"ID":"329a8d0d-94f5-41d9-8459-69aca636869d","Type":"ContainerStarted","Data":"30e571fc3de7545c2c0d3e428d03559682e702a0f56942892b810d013fd41c8b"}
Mar 18 13:46:02 crc kubenswrapper[4975]: I0318 13:46:02.976496 4975 generic.go:334] "Generic (PLEG): container finished" podID="329a8d0d-94f5-41d9-8459-69aca636869d" containerID="08198ccc2691cdcb6b16cceb6214dc9a2a3e31b67e14e71b178e105a6ea86b30" exitCode=0
Mar 18 13:46:02 crc kubenswrapper[4975]: I0318 13:46:02.976590 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564026-srzj5" event={"ID":"329a8d0d-94f5-41d9-8459-69aca636869d","Type":"ContainerDied","Data":"08198ccc2691cdcb6b16cceb6214dc9a2a3e31b67e14e71b178e105a6ea86b30"}
Mar 18 13:46:04 crc kubenswrapper[4975]: I0318 13:46:04.323447 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564026-srzj5"
Mar 18 13:46:04 crc kubenswrapper[4975]: I0318 13:46:04.442600 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrfs9\" (UniqueName: \"kubernetes.io/projected/329a8d0d-94f5-41d9-8459-69aca636869d-kube-api-access-wrfs9\") pod \"329a8d0d-94f5-41d9-8459-69aca636869d\" (UID: \"329a8d0d-94f5-41d9-8459-69aca636869d\") "
Mar 18 13:46:04 crc kubenswrapper[4975]: I0318 13:46:04.450304 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/329a8d0d-94f5-41d9-8459-69aca636869d-kube-api-access-wrfs9" (OuterVolumeSpecName: "kube-api-access-wrfs9") pod "329a8d0d-94f5-41d9-8459-69aca636869d" (UID: "329a8d0d-94f5-41d9-8459-69aca636869d"). InnerVolumeSpecName "kube-api-access-wrfs9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:46:04 crc kubenswrapper[4975]: I0318 13:46:04.545948 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrfs9\" (UniqueName: \"kubernetes.io/projected/329a8d0d-94f5-41d9-8459-69aca636869d-kube-api-access-wrfs9\") on node \"crc\" DevicePath \"\""
Mar 18 13:46:04 crc kubenswrapper[4975]: I0318 13:46:04.999410 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564026-srzj5" event={"ID":"329a8d0d-94f5-41d9-8459-69aca636869d","Type":"ContainerDied","Data":"30e571fc3de7545c2c0d3e428d03559682e702a0f56942892b810d013fd41c8b"}
Mar 18 13:46:04 crc kubenswrapper[4975]: I0318 13:46:04.999462 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30e571fc3de7545c2c0d3e428d03559682e702a0f56942892b810d013fd41c8b"
Mar 18 13:46:04 crc kubenswrapper[4975]: I0318 13:46:04.999562 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564026-srzj5"
Mar 18 13:46:05 crc kubenswrapper[4975]: I0318 13:46:05.397964 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564020-wlcfg"]
Mar 18 13:46:05 crc kubenswrapper[4975]: I0318 13:46:05.408569 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564020-wlcfg"]
Mar 18 13:46:07 crc kubenswrapper[4975]: I0318 13:46:07.027260 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4b80ae8-52f0-493d-a760-3b1304bb849a" path="/var/lib/kubelet/pods/a4b80ae8-52f0-493d-a760-3b1304bb849a/volumes"
Mar 18 13:46:10 crc kubenswrapper[4975]: I0318 13:46:10.016945 4975 scope.go:117] "RemoveContainer" containerID="68fc7653133e311bce7ebf812edb4ed2f2ea161ea3b49916bf5f5dd47e2257b6"
Mar 18 13:46:10 crc kubenswrapper[4975]: E0318 13:46:10.017762 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 13:46:22 crc kubenswrapper[4975]: I0318 13:46:22.016249 4975 scope.go:117] "RemoveContainer" containerID="68fc7653133e311bce7ebf812edb4ed2f2ea161ea3b49916bf5f5dd47e2257b6"
Mar 18 13:46:22 crc kubenswrapper[4975]: E0318 13:46:22.017197 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 13:46:33 crc kubenswrapper[4975]: I0318 13:46:33.016376 4975 scope.go:117] "RemoveContainer" containerID="68fc7653133e311bce7ebf812edb4ed2f2ea161ea3b49916bf5f5dd47e2257b6"
Mar 18 13:46:33 crc kubenswrapper[4975]: E0318 13:46:33.017420 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 13:46:44 crc kubenswrapper[4975]: I0318 13:46:44.016651 4975 scope.go:117] "RemoveContainer" containerID="68fc7653133e311bce7ebf812edb4ed2f2ea161ea3b49916bf5f5dd47e2257b6"
Mar 18 13:46:44 crc kubenswrapper[4975]: E0318 13:46:44.017600 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 13:46:54 crc kubenswrapper[4975]: I0318 13:46:54.818631 4975 scope.go:117] "RemoveContainer" containerID="b955af01730215f1717705878df49daa195f8bae192cc70dc8a138723ac71e69"
Mar 18 13:46:55 crc kubenswrapper[4975]: I0318 13:46:55.023278 4975 scope.go:117] "RemoveContainer" containerID="68fc7653133e311bce7ebf812edb4ed2f2ea161ea3b49916bf5f5dd47e2257b6"
Mar 18 13:46:55 crc kubenswrapper[4975]: E0318 13:46:55.023503 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 13:47:10 crc kubenswrapper[4975]: I0318 13:47:10.016298 4975 scope.go:117] "RemoveContainer" containerID="68fc7653133e311bce7ebf812edb4ed2f2ea161ea3b49916bf5f5dd47e2257b6"
Mar 18 13:47:10 crc kubenswrapper[4975]: E0318 13:47:10.017011 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 13:47:21 crc kubenswrapper[4975]: I0318 13:47:21.016021 4975 scope.go:117] "RemoveContainer" containerID="68fc7653133e311bce7ebf812edb4ed2f2ea161ea3b49916bf5f5dd47e2257b6"
Mar 18 13:47:21 crc kubenswrapper[4975]: E0318 13:47:21.016818 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 13:47:36 crc kubenswrapper[4975]: I0318 13:47:36.016518 4975 scope.go:117] "RemoveContainer" containerID="68fc7653133e311bce7ebf812edb4ed2f2ea161ea3b49916bf5f5dd47e2257b6"
Mar 18 13:47:36 crc kubenswrapper[4975]: E0318 13:47:36.017308 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 13:47:49 crc kubenswrapper[4975]: I0318 13:47:49.016355 4975 scope.go:117] "RemoveContainer" containerID="68fc7653133e311bce7ebf812edb4ed2f2ea161ea3b49916bf5f5dd47e2257b6"
Mar 18 13:47:49 crc kubenswrapper[4975]: E0318 13:47:49.018741 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 13:48:00 crc kubenswrapper[4975]: I0318 13:48:00.151963 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564028-txlrh"]
Mar 18 13:48:00 crc kubenswrapper[4975]: E0318 13:48:00.153212 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="329a8d0d-94f5-41d9-8459-69aca636869d" containerName="oc"
Mar 18 13:48:00 crc kubenswrapper[4975]: I0318 13:48:00.153235 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="329a8d0d-94f5-41d9-8459-69aca636869d" containerName="oc"
Mar 18 13:48:00 crc kubenswrapper[4975]: I0318 13:48:00.153617 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="329a8d0d-94f5-41d9-8459-69aca636869d" containerName="oc"
Mar 18 13:48:00 crc kubenswrapper[4975]: I0318 13:48:00.154663 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564028-txlrh"
Mar 18 13:48:00 crc kubenswrapper[4975]: I0318 13:48:00.157536 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 13:48:00 crc kubenswrapper[4975]: I0318 13:48:00.157861 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 13:48:00 crc kubenswrapper[4975]: I0318 13:48:00.158246 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8"
Mar 18 13:48:00 crc kubenswrapper[4975]: I0318 13:48:00.162556 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564028-txlrh"]
Mar 18 13:48:00 crc kubenswrapper[4975]: I0318 13:48:00.200435 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcpcz\" (UniqueName: \"kubernetes.io/projected/a7138b4e-8a12-4d04-aab4-25fbd5080ea2-kube-api-access-gcpcz\") pod \"auto-csr-approver-29564028-txlrh\" (UID: \"a7138b4e-8a12-4d04-aab4-25fbd5080ea2\") " pod="openshift-infra/auto-csr-approver-29564028-txlrh"
Mar 18 13:48:00 crc kubenswrapper[4975]: I0318 13:48:00.302169 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcpcz\" (UniqueName: \"kubernetes.io/projected/a7138b4e-8a12-4d04-aab4-25fbd5080ea2-kube-api-access-gcpcz\") pod \"auto-csr-approver-29564028-txlrh\" (UID: \"a7138b4e-8a12-4d04-aab4-25fbd5080ea2\") " pod="openshift-infra/auto-csr-approver-29564028-txlrh"
Mar 18 13:48:00 crc kubenswrapper[4975]: I0318 13:48:00.323096 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcpcz\" (UniqueName: \"kubernetes.io/projected/a7138b4e-8a12-4d04-aab4-25fbd5080ea2-kube-api-access-gcpcz\") pod \"auto-csr-approver-29564028-txlrh\" (UID: \"a7138b4e-8a12-4d04-aab4-25fbd5080ea2\") " pod="openshift-infra/auto-csr-approver-29564028-txlrh"
Mar 18 13:48:00 crc kubenswrapper[4975]: I0318 13:48:00.476770 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564028-txlrh"
Mar 18 13:48:00 crc kubenswrapper[4975]: I0318 13:48:00.907993 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564028-txlrh"]
Mar 18 13:48:01 crc kubenswrapper[4975]: I0318 13:48:01.059409 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564028-txlrh" event={"ID":"a7138b4e-8a12-4d04-aab4-25fbd5080ea2","Type":"ContainerStarted","Data":"b418ced9026683e3fa265f22767016a6631e0503c3b8b0910d7fc080a9a3826e"}
Mar 18 13:48:03 crc kubenswrapper[4975]: I0318 13:48:03.015989 4975 scope.go:117] "RemoveContainer" containerID="68fc7653133e311bce7ebf812edb4ed2f2ea161ea3b49916bf5f5dd47e2257b6"
Mar 18 13:48:03 crc kubenswrapper[4975]: E0318 13:48:03.016753 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 13:48:03 crc kubenswrapper[4975]: I0318 13:48:03.080434 4975 generic.go:334] "Generic (PLEG): container finished" podID="a7138b4e-8a12-4d04-aab4-25fbd5080ea2" containerID="bf31218d162e3e1e48ff3027fc8a5e0173191fa8eca405d28a9f58df4375bdaa" exitCode=0
Mar 18 13:48:03 crc kubenswrapper[4975]: I0318 13:48:03.080481 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564028-txlrh" event={"ID":"a7138b4e-8a12-4d04-aab4-25fbd5080ea2","Type":"ContainerDied","Data":"bf31218d162e3e1e48ff3027fc8a5e0173191fa8eca405d28a9f58df4375bdaa"}
Mar 18 13:48:04 crc kubenswrapper[4975]: I0318 13:48:04.411504 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564028-txlrh"
Mar 18 13:48:04 crc kubenswrapper[4975]: I0318 13:48:04.587498 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcpcz\" (UniqueName: \"kubernetes.io/projected/a7138b4e-8a12-4d04-aab4-25fbd5080ea2-kube-api-access-gcpcz\") pod \"a7138b4e-8a12-4d04-aab4-25fbd5080ea2\" (UID: \"a7138b4e-8a12-4d04-aab4-25fbd5080ea2\") "
Mar 18 13:48:04 crc kubenswrapper[4975]: I0318 13:48:04.592913 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7138b4e-8a12-4d04-aab4-25fbd5080ea2-kube-api-access-gcpcz" (OuterVolumeSpecName: "kube-api-access-gcpcz") pod "a7138b4e-8a12-4d04-aab4-25fbd5080ea2" (UID: "a7138b4e-8a12-4d04-aab4-25fbd5080ea2"). InnerVolumeSpecName "kube-api-access-gcpcz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:48:04 crc kubenswrapper[4975]: I0318 13:48:04.690682 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcpcz\" (UniqueName: \"kubernetes.io/projected/a7138b4e-8a12-4d04-aab4-25fbd5080ea2-kube-api-access-gcpcz\") on node \"crc\" DevicePath \"\""
Mar 18 13:48:05 crc kubenswrapper[4975]: I0318 13:48:05.101632 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564028-txlrh" event={"ID":"a7138b4e-8a12-4d04-aab4-25fbd5080ea2","Type":"ContainerDied","Data":"b418ced9026683e3fa265f22767016a6631e0503c3b8b0910d7fc080a9a3826e"}
Mar 18 13:48:05 crc kubenswrapper[4975]: I0318 13:48:05.101687 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b418ced9026683e3fa265f22767016a6631e0503c3b8b0910d7fc080a9a3826e"
Mar 18 13:48:05 crc kubenswrapper[4975]: I0318 13:48:05.101712 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564028-txlrh"
Mar 18 13:48:05 crc kubenswrapper[4975]: I0318 13:48:05.481525 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564022-5cd8v"]
Mar 18 13:48:05 crc kubenswrapper[4975]: I0318 13:48:05.489787 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564022-5cd8v"]
Mar 18 13:48:07 crc kubenswrapper[4975]: I0318 13:48:07.042261 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d391f902-2a5a-4a37-8a5c-f3cc0279d5ae" path="/var/lib/kubelet/pods/d391f902-2a5a-4a37-8a5c-f3cc0279d5ae/volumes"
Mar 18 13:48:16 crc kubenswrapper[4975]: I0318 13:48:16.017000 4975 scope.go:117] "RemoveContainer" containerID="68fc7653133e311bce7ebf812edb4ed2f2ea161ea3b49916bf5f5dd47e2257b6"
Mar 18 13:48:16 crc kubenswrapper[4975]: E0318 13:48:16.017887 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 13:48:28 crc kubenswrapper[4975]: I0318 13:48:28.016836 4975 scope.go:117] "RemoveContainer" containerID="68fc7653133e311bce7ebf812edb4ed2f2ea161ea3b49916bf5f5dd47e2257b6"
Mar 18 13:48:28 crc kubenswrapper[4975]: E0318 13:48:28.018008 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 13:48:41 crc kubenswrapper[4975]: I0318 13:48:41.016210 4975 scope.go:117] "RemoveContainer" containerID="68fc7653133e311bce7ebf812edb4ed2f2ea161ea3b49916bf5f5dd47e2257b6"
Mar 18 13:48:41 crc kubenswrapper[4975]: E0318 13:48:41.016880 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 13:48:53 crc kubenswrapper[4975]: I0318 13:48:53.017240 4975 scope.go:117] "RemoveContainer" containerID="68fc7653133e311bce7ebf812edb4ed2f2ea161ea3b49916bf5f5dd47e2257b6"
Mar 18 13:48:53 crc kubenswrapper[4975]: E0318 13:48:53.018014 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 13:48:54 crc kubenswrapper[4975]: I0318 13:48:54.917407 4975 scope.go:117] "RemoveContainer" containerID="a3a04f5551a724b33d35bbf8ade1aad0a796de2fcbe84388f3e238ae72b743dd"
Mar 18 13:49:06 crc kubenswrapper[4975]: I0318 13:49:06.016604 4975 scope.go:117] "RemoveContainer" containerID="68fc7653133e311bce7ebf812edb4ed2f2ea161ea3b49916bf5f5dd47e2257b6"
Mar 18 13:49:06 crc kubenswrapper[4975]: E0318 13:49:06.017421 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 13:49:19 crc kubenswrapper[4975]: I0318 13:49:19.016704 4975 scope.go:117] "RemoveContainer" containerID="68fc7653133e311bce7ebf812edb4ed2f2ea161ea3b49916bf5f5dd47e2257b6"
Mar 18 13:49:19 crc kubenswrapper[4975]: E0318 13:49:19.017845 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 13:49:32 crc kubenswrapper[4975]: I0318 13:49:32.016807 4975 scope.go:117] "RemoveContainer" containerID="68fc7653133e311bce7ebf812edb4ed2f2ea161ea3b49916bf5f5dd47e2257b6"
Mar 18 13:49:32 crc kubenswrapper[4975]: E0318 13:49:32.017552 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 13:49:46 crc kubenswrapper[4975]: I0318 13:49:46.017193 4975 scope.go:117] "RemoveContainer" containerID="68fc7653133e311bce7ebf812edb4ed2f2ea161ea3b49916bf5f5dd47e2257b6"
Mar 18 13:49:46 crc kubenswrapper[4975]: E0318 13:49:46.017967 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 13:50:00 crc kubenswrapper[4975]: I0318 13:50:00.139704 4975 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564030-cs87j"]
Mar 18 13:50:00 crc kubenswrapper[4975]: E0318 13:50:00.140572 4975 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7138b4e-8a12-4d04-aab4-25fbd5080ea2" containerName="oc"
Mar 18 13:50:00 crc kubenswrapper[4975]: I0318 13:50:00.140589 4975 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7138b4e-8a12-4d04-aab4-25fbd5080ea2" containerName="oc"
Mar 18 13:50:00 crc kubenswrapper[4975]: I0318 13:50:00.140844 4975 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7138b4e-8a12-4d04-aab4-25fbd5080ea2" containerName="oc"
Mar 18 13:50:00 crc kubenswrapper[4975]: I0318 13:50:00.141671 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564030-cs87j"
Mar 18 13:50:00 crc kubenswrapper[4975]: I0318 13:50:00.143714 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 13:50:00 crc kubenswrapper[4975]: I0318 13:50:00.143751 4975 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-shkz8"
Mar 18 13:50:00 crc kubenswrapper[4975]: I0318 13:50:00.144188 4975 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 13:50:00 crc kubenswrapper[4975]: I0318 13:50:00.154972 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564030-cs87j"]
Mar 18 13:50:00 crc kubenswrapper[4975]: I0318 13:50:00.296612 4975 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmtzz\" (UniqueName: \"kubernetes.io/projected/06c32e5a-ba9d-4e8c-b1f7-f17181b0bda0-kube-api-access-mmtzz\") pod \"auto-csr-approver-29564030-cs87j\" (UID: \"06c32e5a-ba9d-4e8c-b1f7-f17181b0bda0\") " pod="openshift-infra/auto-csr-approver-29564030-cs87j"
Mar 18 13:50:00 crc kubenswrapper[4975]: I0318 13:50:00.398969 4975 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmtzz\" (UniqueName: \"kubernetes.io/projected/06c32e5a-ba9d-4e8c-b1f7-f17181b0bda0-kube-api-access-mmtzz\") pod \"auto-csr-approver-29564030-cs87j\" (UID: \"06c32e5a-ba9d-4e8c-b1f7-f17181b0bda0\") " pod="openshift-infra/auto-csr-approver-29564030-cs87j"
Mar 18 13:50:00 crc kubenswrapper[4975]: I0318 13:50:00.422457 4975 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmtzz\" (UniqueName: \"kubernetes.io/projected/06c32e5a-ba9d-4e8c-b1f7-f17181b0bda0-kube-api-access-mmtzz\") pod \"auto-csr-approver-29564030-cs87j\" (UID: \"06c32e5a-ba9d-4e8c-b1f7-f17181b0bda0\") " pod="openshift-infra/auto-csr-approver-29564030-cs87j"
Mar 18 13:50:00 crc kubenswrapper[4975]: I0318 13:50:00.466461 4975 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564030-cs87j"
Mar 18 13:50:00 crc kubenswrapper[4975]: I0318 13:50:00.918268 4975 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564030-cs87j"]
Mar 18 13:50:00 crc kubenswrapper[4975]: I0318 13:50:00.926823 4975 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 13:50:01 crc kubenswrapper[4975]: I0318 13:50:01.016375 4975 scope.go:117] "RemoveContainer" containerID="68fc7653133e311bce7ebf812edb4ed2f2ea161ea3b49916bf5f5dd47e2257b6"
Mar 18 13:50:01 crc kubenswrapper[4975]: E0318 13:50:01.016624 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"
Mar 18 13:50:01 crc kubenswrapper[4975]: I0318 13:50:01.279623 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564030-cs87j" event={"ID":"06c32e5a-ba9d-4e8c-b1f7-f17181b0bda0","Type":"ContainerStarted","Data":"03b12946f01942c504f0158f64d9aff020f422ab9f784efbd7aea51602a830c2"}
Mar 18 13:50:02 crc kubenswrapper[4975]: I0318 13:50:02.292532 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564030-cs87j" event={"ID":"06c32e5a-ba9d-4e8c-b1f7-f17181b0bda0","Type":"ContainerStarted","Data":"c42e83fac1de948ebcc61e0d1e4bf711d047ccfefb530450c91ee8d695124dbe"}
Mar 18 13:50:02 crc kubenswrapper[4975]: I0318 13:50:02.309739 4975 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564030-cs87j" podStartSLOduration=1.24236565 podStartE2EDuration="2.309700456s" podCreationTimestamp="2026-03-18 13:50:00 +0000 UTC" firstStartedPulling="2026-03-18 13:50:00.926081742 +0000 UTC m=+5986.640482361" lastFinishedPulling="2026-03-18 13:50:01.993416588 +0000 UTC m=+5987.707817167" observedRunningTime="2026-03-18 13:50:02.309373747 +0000 UTC m=+5988.023774326" watchObservedRunningTime="2026-03-18 13:50:02.309700456 +0000 UTC m=+5988.024101035"
Mar 18 13:50:03 crc kubenswrapper[4975]: I0318 13:50:03.323776 4975 generic.go:334] "Generic (PLEG): container finished" podID="06c32e5a-ba9d-4e8c-b1f7-f17181b0bda0" containerID="c42e83fac1de948ebcc61e0d1e4bf711d047ccfefb530450c91ee8d695124dbe" exitCode=0
Mar 18 13:50:03 crc kubenswrapper[4975]: I0318 13:50:03.323815 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564030-cs87j" event={"ID":"06c32e5a-ba9d-4e8c-b1f7-f17181b0bda0","Type":"ContainerDied","Data":"c42e83fac1de948ebcc61e0d1e4bf711d047ccfefb530450c91ee8d695124dbe"}
Mar 18 13:50:04 crc kubenswrapper[4975]: I0318 13:50:04.669917 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564030-cs87j"
Mar 18 13:50:04 crc kubenswrapper[4975]: I0318 13:50:04.676767 4975 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmtzz\" (UniqueName: \"kubernetes.io/projected/06c32e5a-ba9d-4e8c-b1f7-f17181b0bda0-kube-api-access-mmtzz\") pod \"06c32e5a-ba9d-4e8c-b1f7-f17181b0bda0\" (UID: \"06c32e5a-ba9d-4e8c-b1f7-f17181b0bda0\") "
Mar 18 13:50:04 crc kubenswrapper[4975]: I0318 13:50:04.685467 4975 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06c32e5a-ba9d-4e8c-b1f7-f17181b0bda0-kube-api-access-mmtzz" (OuterVolumeSpecName: "kube-api-access-mmtzz") pod "06c32e5a-ba9d-4e8c-b1f7-f17181b0bda0" (UID: "06c32e5a-ba9d-4e8c-b1f7-f17181b0bda0"). InnerVolumeSpecName "kube-api-access-mmtzz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 13:50:04 crc kubenswrapper[4975]: I0318 13:50:04.778647 4975 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmtzz\" (UniqueName: \"kubernetes.io/projected/06c32e5a-ba9d-4e8c-b1f7-f17181b0bda0-kube-api-access-mmtzz\") on node \"crc\" DevicePath \"\""
Mar 18 13:50:05 crc kubenswrapper[4975]: I0318 13:50:05.350133 4975 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564030-cs87j" event={"ID":"06c32e5a-ba9d-4e8c-b1f7-f17181b0bda0","Type":"ContainerDied","Data":"03b12946f01942c504f0158f64d9aff020f422ab9f784efbd7aea51602a830c2"}
Mar 18 13:50:05 crc kubenswrapper[4975]: I0318 13:50:05.350181 4975 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03b12946f01942c504f0158f64d9aff020f422ab9f784efbd7aea51602a830c2"
Mar 18 13:50:05 crc kubenswrapper[4975]: I0318 13:50:05.350261 4975 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564030-cs87j"
Mar 18 13:50:05 crc kubenswrapper[4975]: I0318 13:50:05.383612 4975 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564024-r7xsg"]
Mar 18 13:50:05 crc kubenswrapper[4975]: I0318 13:50:05.394223 4975 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564024-r7xsg"]
Mar 18 13:50:07 crc kubenswrapper[4975]: I0318 13:50:07.031529 4975 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eaab4a7-c1e6-4829-8a8f-ea44b17c7179" path="/var/lib/kubelet/pods/4eaab4a7-c1e6-4829-8a8f-ea44b17c7179/volumes"
Mar 18 13:50:15 crc kubenswrapper[4975]: I0318 13:50:15.024103 4975 scope.go:117] "RemoveContainer" containerID="68fc7653133e311bce7ebf812edb4ed2f2ea161ea3b49916bf5f5dd47e2257b6"
Mar 18 13:50:15 crc kubenswrapper[4975]: E0318 13:50:15.024809 4975 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvdzt_openshift-machine-config-operator(59dd8f35-75c5-42d7-b11a-06586d1d5a1b)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvdzt" podUID="59dd8f35-75c5-42d7-b11a-06586d1d5a1b"